
[GoogleGenerativeAI Error]: Content with role 'user' can't contain 'functionResponse' part #1530

Open
merbaz opened this issue Nov 28, 2024 · 0 comments
Labels: bug (Something isn't working)


merbaz commented Nov 28, 2024

Description
This issue seems to occur when outputs from tool functions are mapped by the LLMAgent's chat method, specifically while generating the chat template for the LLM and maintaining chat history between tool function calls. As far as I can tell, it only arises when Gemini is used with LLMAgent.

For further reference, I found a possibly relevant discussion in an already merged PR:
a comment by parhammmm on PR #822, which states that "role = assistant was being mapped to Gemini's user, causing the entire history to be merged into one".
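For additional context, the error comes from the Google SDK's own history validation (the validateChatHistory frame in the stack trace below), which runs before any request is sent. A rough sketch of the rule it appears to enforce — the exact part lists per role are an assumption about the bundled @google/generative-ai version, inferred from the error text rather than copied from its source:

// Sketch (assumed shape): each history entry's role restricts which part types it may contain.
const validPartsPerRole: Record<string, string[]> = {
  user: ["text", "inlineData"],   // functionResponse is NOT allowed here
  function: ["functionResponse"], // tool results are expected under this role
  model: ["text", "functionCall"],
};

// A history entry shaped like this reproduces the reported error:
const rejectedEntry = {
  role: "user",
  parts: [{ functionResponse: { name: "viewAllTables", response: { tables: "..." } } }],
};

// The same part under role "function" should pass the validation:
const acceptedEntry = {
  role: "function",
  parts: [{ functionResponse: { name: "viewAllTables", response: { tables: "..." } } }],
};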

To Reproduce
A basic example involves using the following tools:

// Assumes: import { FunctionTool, type JSONValue } from "llamaindex";
// pool: DB connection pool configured using mysql2; db_name: name of the target database

const viewAllTables = FunctionTool.from(
  async ({ dummy }: { dummy: string }): Promise<JSONValue> => {
    const query = "SHOW TABLES";
    return pool
      .promise()
      .query(query)
      .then(([rows]) => {
        if (Array.isArray(rows)) {
          const listString: string = rows
            .map((row: any): string => row[`Tables_in_${db_name}`] || "")
            .join(", ");
          return listString;
        } else {
          throw new Error("rows not in correct format");
        }
      })
      .catch((error: unknown) => {
        if (error instanceof Error) {
          return error.message;
        }
        return "Unknown Error Occurred";
      });
  },
  {
    name: "viewAllTables",
    description: "Use this function to get table names from database.",
    parameters: {
      type: "object",
      properties: {
        dummy: {
          type: "string",
          description: "This is a placeholder parameter and will be ignored",
        },
      },
      required: ["dummy"],
    },
  }
);

const applyQuery = FunctionTool.from(
  async ({ query }: { query: string }): Promise<JSONValue> => {
    console.log({ query });
    return pool
      .promise()
      .query(query)
      .then(([rows, fields]) => {
        console.log({ rows });
        console.log({ fields });
        console.log({ stringified: JSON.stringify({ rows, fields }) });

        return `1500`;
      })
      .catch((error: unknown) => {
        console.log(error);
        if (error instanceof Error) {
          return error.message;
        }
        return "Unknown Error Occurred";
      });
  },
  {
    name: "applyQuery",
    description:
      "Use this function to apply any SQL query on the pre-defined database.",
    parameters: {
      type: "object",
      properties: {
        query: {
          type: "string",
          description: "Enter Syntax-Correct SQL Query.",
        },
      },
      required: ["query"],
    },
  }
);

Next, I simply used the tools and Gemini API as per the docs:

// Assumes: import { Gemini, GEMINI_MODEL, GeminiSession, LLMAgent } from "llamaindex";
const gemini = new Gemini({
  model: GEMINI_MODEL.GEMINI_PRO_1_5_FLASH_LATEST,
  session: new GeminiSession({ apiKey: process.env.GOOGLE_API_KEY }),
});
const agent = new LLMAgent({
  llm: gemini,
  tools: [viewAllTables, applyQuery],
});

// chat with agent:
const response = await agent.chat({
        message:
          "Which tables are in the database? Then count the rows of online_charity table",
      });

// error handling below ...
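The actual error handling is omitted above; for illustration, a minimal try/catch along these lines (shape assumed, not the exact handler used) is enough to observe the intermittent failure:

// Sketch only: capture the intermittent SDK error thrown by the chat call above.
try {
  const response = await agent.chat({
    message:
      "Which tables are in the database? Then count the rows of online_charity table",
  });
  console.log(response);
} catch (error: unknown) {
  // Fails intermittently with:
  // GoogleGenerativeAIError: [GoogleGenerativeAI Error]:
  //   Content with role 'user' can't contain 'functionResponse' part
  console.error(error);
}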

Expected behaviour
The agent should reply with a message answering the query, or with feedback if it could not gather sufficient information from the tools. Every time chat is invoked, the message output should be correctly present in the response.

Detailed Error

GoogleGenerativeAIError: [GoogleGenerativeAI Error]: Content with role 'user' can't contain 'functionResponse' part
    at validateChatHistory (../node_modules/llamaindex/node_modules/@google/generative-ai/dist/index.js:1047:23)
    at new ChatSession (../node_modules/llamaindex/node_modules/@google/generative-ai/dist/index.js:1089:13)
    at GenerativeModel.startChat (/node_modules/llamaindex/node_modules/@google/generative-ai/dist/index.js:1337:16)
    at Gemini.nonStreamChat (/node_modules/llamaindex/dist/cjs/llm/gemini/base.js:570:29)
    at Gemini.chat (/node_modules/llamaindex/dist/cjs/llm/gemini/base.js:617:21)
    at Gemini.withLLMEvent (/node_modules/@llamaindex/core/decorator/dist/index.cjs:75:47)
    at defaultTaskHandler (/node_modules/@llamaindex/core/agent/dist/index.cjs:738:44)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Object.pull (//node_modules/@llamaindex/core/agent/dist/index.cjs:634:13)

Environment

  • OS: macOS
  • Stack: LlamaIndex TS / Express (Node.js, TypeScript)
  • Node.js version: 20.11.0
  • llamaindex version in package.json: ^0.8.23

Additional Context
This issue occurs often but not always, which leads me to believe that the chat history is not being correctly formatted into a chat template with the correct roles for Gemini.
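To illustrate that suspicion (the message shapes below are assumptions about the internals, not verified against the llamaindex source): if the agent records a tool result as a chat message with role "user" and the Gemini adapter carries that role over unchanged while emitting a functionResponse part, the SDK validation would reject it with exactly this error.

// Hypothetical illustration only; field names are assumed, not taken from the llamaindex source.
// Tool output as the agent presumably stores it in chat history:
const toolResultMessage = {
  role: "user",
  options: { toolResult: { id: "call_1", result: "1500", isError: false } },
};

// If the Gemini adapter translates that role one-to-one, the resulting history entry
// carries a functionResponse part under role "user", which the Google SDK rejects:
const translatedForGemini = {
  role: "user",
  parts: [{ functionResponse: { name: "applyQuery", response: { result: "1500" } } }],
};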
