
Add function calling support for models that support it. #1

Open
5 tasks done
vkameswaran opened this issue Apr 24, 2024 · 5 comments
Labels
enhancement New feature or request

Comments

@vkameswaran

⚠️ Please check that this feature request hasn't been suggested before.

  • I searched previous Ideas in Discussions and didn't find any similar feature requests.
  • I searched previous Issues and didn't find any similar feature requests.

🔖 Feature description

I want us to use the LiteLLM function calling spec instead of our hybrid GPT approach for models that support function calling: Cohere, Anthropic, Fireworks, etc.
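For reference, the LiteLLM function calling spec follows the OpenAI-compatible `tools` format, so one spec can be sent to any supporting provider. A minimal illustrative example (the tool name and fields here are made up for the sketch):

```python
# An OpenAI-style tool spec of the kind LiteLLM forwards to providers
# with native function calling. All names here are illustrative.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def build_tools_payload(*tools: dict) -> list[dict]:
    """Collect tool specs into the `tools` list expected by the API."""
    return list(tools)
```

The point of adopting this spec is that the same payload works unchanged across providers; LiteLLM handles the per-provider translation.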

Acknowledgements

  • My issue title is concise and descriptive.
  • I have searched the existing issues to make sure this feature has not been requested yet.
  • I have provided enough information for the maintainers to understand and evaluate this request.
@vkameswaran vkameswaran added the enhancement New feature or request label Apr 24, 2024
@vkameswaran vkameswaran reopened this Apr 24, 2024

greptile-apps bot commented Apr 24, 2024

To implement function calling support for models like Cohere, Anthropic, and Fireworks, you can follow these steps, inspired by the existing codebase structure and the handling of tools and datasources in agents:

  1. Extend the LangchainAgent Class:

    • Modify the LangchainAgent class in /libs/superagent/app/agents/langchain.py to include a method for function calling. This method should be capable of handling the specific API calls or SDK methods required by models that support function calling, such as Cohere, Anthropic, and Fireworks.
  2. Define Function Calling Specifications:

    • Create a new method within the LangchainAgent class that defines how function calls are made. This method should take into account the different specifications and capabilities of each model. For example, the method signature could look like this:
      async def call_function(self, function_name: str, **kwargs):
          """Call a named function on the underlying LLM provider."""
          ...
  3. Integrate with LLM Providers:

    • In the _get_llm method of the LangchainAgent class, add support for new LLM providers that support function calling. This might involve integrating with their SDKs or APIs and handling authentication and request formatting according to each provider's requirements.
  4. Utilize Existing Tool Integration Logic:

    • Leverage the logic used for integrating tools with agents, as demonstrated in the chat_with_tools.mdx file. This involves creating and attaching tools (in this case, function calling capabilities) to an agent and then invoking these tools based on user input or agent logic.
  5. Update Agent Configuration:

    • Ensure that the agent configuration (AgentConfig model) supports specifying whether a model supports function calling and any related configurations. This could include function names, parameters, and provider-specific options.
  6. Documentation and Examples:

    • Update the documentation in /fern/mdx/concepts.mdx and /fern/mdx/sdk/chat_with_tools.mdx to include examples and explanations of how to use function calling with supported models. This will help users understand how to leverage this new feature in their agents.
  7. Testing and Validation:

    • Implement comprehensive tests to validate the function calling feature across different models and scenarios. This ensures compatibility and reliability of the feature.

By following these steps and utilizing the existing structure for tools and datasources, you can effectively integrate function calling support for models that offer this capability, enhancing the flexibility and power of the agents.
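Steps 1 and 2 above could be sketched as follows. This is a hedged, self-contained illustration: the real `LangchainAgent` lives in `/libs/superagent/app/agents/langchain.py`, and the provider set and return shape here are assumptions, not the actual implementation.

```python
import asyncio
import json

class LangchainAgentSketch:
    """Stand-in for LangchainAgent showing a function-calling method (illustrative)."""

    # Assumed set of providers with native function calling.
    FUNCTION_CALLING_PROVIDERS = {"openai", "anthropic", "cohere", "fireworks"}

    def __init__(self, llm_provider: str):
        self.llm_provider = llm_provider

    def supports_function_calling(self) -> bool:
        return self.llm_provider.lower() in self.FUNCTION_CALLING_PROVIDERS

    async def call_function(self, function_name: str, **kwargs) -> dict:
        """Dispatch a function call to the underlying LLM provider."""
        if not self.supports_function_calling():
            raise ValueError(f"{self.llm_provider} does not support function calling")
        # A real implementation would build the provider-specific request here.
        return {"name": function_name, "arguments": json.dumps(kwargs)}
```

Usage: `asyncio.run(LangchainAgentSketch("cohere").call_function("get_weather", city="Oslo"))` returns the assembled call; unsupported providers raise early rather than sending a malformed request.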


To integrate function calling support for models like LiteLLM, Cohere, Anthropic, and Fireworks, follow these steps:

  1. Extend LLM Mapping:
    Update libs/superagent/app/utils/llm.py to include mappings for the new models, ensuring that each model's specific API requirements and capabilities are accounted for.

  2. Modify LangchainAgent:
    In libs/superagent/app/agents/langchain.py, adapt the _get_llm method to instantiate the correct LLM client based on the provider. This may involve creating new client classes similar to ChatOpenAI and AzureChatOpenAI for each new model, handling their unique function calling specifications.

  3. Tool Integration:
    Ensure that the tool creation and invocation processes in /fern/mdx/sdk/local_tools.mdx and libs/superagent/app/agents/langchain.py can accommodate the function calling syntax and parameters required by the new models. This might include extending the create_tool function and the tool dispatch mechanism to handle new types of function calls.

  4. Update Agent Configuration:
    Allow agent configurations to specify which LLM model to use, including those that support function calling. This involves updating the agent creation scripts and possibly the agent configuration UI to allow users to select and configure function-calling models.

  5. Documentation and Examples:
    Update the documentation in /fern/mdx/sdk/local_tools.mdx to include examples and guidance on using the new function calling models. Provide clear examples of how to configure agents to use these models and how to structure function calls.

These steps require a deep understanding of both the Superagent framework and the specific capabilities and requirements of the models being integrated. It's crucial to thoroughly test the integration with each new model to ensure compatibility and performance.
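Step 1 above (extending the LLM mapping) could look something like this. The mapping name and entry shape are hypothetical, not the actual contents of `libs/superagent/app/utils/llm.py`:

```python
# Hypothetical provider mapping of the kind step 1 describes.
LLM_PROVIDER_MAPPING: dict[str, dict] = {
    "GPT_4": {"provider": "openai", "supports_function_calling": True},
}

def register_provider(model: str, provider: str, supports_function_calling: bool) -> None:
    """Add a model entry so agents can check function-calling support later."""
    LLM_PROVIDER_MAPPING[model] = {
        "provider": provider,
        "supports_function_calling": supports_function_calling,
    }

# Register the new function-calling models (model IDs are examples).
register_provider("command-r-plus", "cohere", True)
register_provider("claude-3-opus", "anthropic", True)
```

Keeping capability flags in one mapping lets `_get_llm` and the agent config both consult a single source of truth.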

References

greptileai/superagent/fern/mdx/sdk/local_tools.mdx
greptileai/superagent/libs/superagent/app/agents/langchain.py
greptileai/superagent/libs/superagent/app/utils


@greptileai greptileai deleted a comment from greptile-apps-local bot May 7, 2024
@greptileai greptileai deleted a comment from greptile-apps bot May 7, 2024
@vkameswaran vkameswaran reopened this May 7, 2024

To implement function calling support for models like Cohere, Anthropic, and Fireworks using the LiteLLM function calling spec, you'll need to make several updates across the codebase. Here's a concise implementation plan:

  1. Update the LLMProvider Enum: Modify the LLMProvider enum in the migration scripts to ensure it supports all required models. For Anthropic, this has been done in the 20240306005619_add_anthropic migration. Similar updates may be needed for Cohere and Fireworks.

  2. Extend the Agent API: In /libs/superagent/app/api/agents.py, enhance the invoke method to handle function calls. This involves parsing the request body for function call specifications and adjusting the invocation logic to accommodate these.

  3. Modify the Assistant Classes: Within the same file, update the Assistant and OpenAIAssistantSdk classes (or create equivalent classes for other models) to support function calling. This includes adding methods that specifically handle the function call logic.

  4. Update Callback Handlers: In /libs/superagent/app/utils/callbacks.py, ensure that the callback handlers can process responses from function calls. This might involve adding new methods or modifying existing ones in CustomAsyncIteratorCallbackHandler and CostCalcAsyncHandler to handle the specifics of function call responses.

  5. Documentation: Update the SDK documentation in /fern/mdx/sdk to include examples and guidance on making function calls using the updated API.

This plan targets the unique aspects of adding function calling support, leveraging the existing structure of the SuperAgent codebase.
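Step 2 (parsing function-call specifications from the `invoke` request body) might be sketched like this. The field names (`functions`, `name`, `parameters`) are assumptions for illustration, not the actual request schema in `/libs/superagent/app/api/agents.py`:

```python
from dataclasses import dataclass, field

@dataclass
class FunctionCallSpec:
    """Hypothetical function-call fields carried in an invoke request body."""
    name: str
    parameters: dict = field(default_factory=dict)

def parse_invoke_body(body: dict) -> list[FunctionCallSpec]:
    """Extract optional function-call specs; absent key means no function calls."""
    return [
        FunctionCallSpec(name=f["name"], parameters=f.get("parameters", {}))
        for f in body.get("functions", [])
    ]
```

With the specs parsed up front, the invocation logic can branch cleanly between the function-calling path and the plain completion path.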

References

greptileai/superagent/fern/mdx/sdk
greptileai/superagent/libs/superagent/app/api/agents.py
greptileai/superagent/libs/superagent/prisma/migrations/20240306005619_add_anthropic
greptileai/superagent/libs/superagent/app/utils/callbacks.py


@greptileai greptileai deleted a comment from greptile-apps bot May 7, 2024
@vkameswaran vkameswaran reopened this May 7, 2024

To add function calling support for models that support it, follow these steps:

  1. Update the LLMProvider enum in prisma/schema.prisma to ensure all relevant models (Cohere, Anthropic, Fireworks) are included. Refer to migrations like /libs/superagent/prisma/migrations/20240420075553_add_cohere for examples of adding new providers.
  2. In app/api/agents.py, extend the invoke function to handle function calling based on the LLM model. Use the model's capabilities to determine if function calling is supported.
  3. Modify the LangchainAgent class in /libs/superagent/app/agents/langchain.py to include logic for function calling, leveraging the LiteLLM function calling spec.
  4. Ensure that the database migrations for adding new LLM providers (e.g., 20240420075553_add_cohere) are applied to support the new models.
  5. Update any relevant frontend components, if necessary, to allow users to utilize function calling features through the UI.

References

/libs/superagent/app/api/agents.py
/libs/superagent/app/agents/langchain.py
/libs/superagent/prisma/migrations/20240306005619_add_anthropic
/libs/superagent/prisma/migrations/20240420075553_add_cohere
/libs/superagent/prisma/migrations/20240418181431_add_mistral
/libs/superagent/prisma/migrations/20240418183001_add_groq
/libs/superagent/prisma/migrations/20240412075016_add_aws_bedrock
/libs/superagent/prisma/migrations/20240213050512_llm_agent
/libs/superagent/prisma/migrations/20240201161130_add_gpt_4_turbo_preview
/libs/superagent/prisma/migrations/20231106194639_gpt_4_1106_preview



greptile-apps bot commented May 7, 2024

To implement function calling support for models like Cohere, Anthropic, Fireworks, etc., using the LiteLLM function calling spec, follow these steps:

  1. Update the Database Schema: Ensure the database schema can accommodate function calling capabilities. This might involve adding new fields to existing tables or creating new tables. Refer to the migration scripts in /libs/superagent/prisma/migrations/ for guidance on how to add new values to enums or alter tables.

  2. Modify the LangchainAgent Class: In /libs/superagent/app/agents/langchain.py, extend the LangchainAgent class to include a method for function calling. This method should be capable of determining if the selected model supports function calling and, if so, use the LiteLLM function calling spec.

  3. Integrate LiteLLM Spec: Implement the LiteLLM function calling specification within the new method. This involves constructing the function call request according to the LiteLLM spec and handling the response.

  4. Update LLM Provider Enum: If necessary, update the LLMProvider enum to include any new providers that support function calling but are not yet listed. This can be done by adding migration scripts similar to those found in /libs/superagent/prisma/migrations/.

  5. Test Function Calling: Ensure thorough testing with each LLM provider that supports function calling to verify that the integration works as expected.

This approach leverages the existing infrastructure while introducing the necessary enhancements to support function calling across different models.
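Step 3 (constructing the request per the LiteLLM spec and handling the response) might look like the following. This builds plain dicts rather than calling `litellm.completion()` directly, and the model ID and response shape are assumptions based on LiteLLM's OpenAI-compatible format:

```python
def build_function_call_request(model: str, messages: list[dict], tools: list[dict]) -> dict:
    """Assemble the kwargs that would be passed to litellm.completion().
    `tools` and `tool_choice` are the OpenAI-compatible fields that LiteLLM
    translates for each provider (sketch)."""
    return {
        "model": model,
        "messages": messages,
        "tools": tools,
        "tool_choice": "auto",
    }

def extract_tool_calls(response: dict) -> list[dict]:
    """Pull tool calls out of an OpenAI-shaped response dict; LiteLLM
    normalizes provider responses toward this shape (illustrative)."""
    message = response["choices"][0]["message"]
    return message.get("tool_calls") or []
```

An empty list from `extract_tool_calls` means the model answered directly, so the handler can treat it as an ordinary completion.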

References

greptileai/superagent/libs/superagent/app/agents/langchain.py
greptileai/superagent/libs/superagent/prisma/migrations/20240306005619_add_anthropic
greptileai/superagent/libs/superagent/prisma/migrations/20240420075553_add_cohere
greptileai/superagent/libs/superagent/prisma/migrations/20240418181431_add_mistral
greptileai/superagent/libs/superagent/prisma/migrations/20240418183001_add_groq
greptileai/superagent/libs/superagent/prisma/migrations/20240412075016_add_aws_bedrock
greptileai/superagent/libs/superagent/prisma/migrations/20240311200421_tavily_tool
greptileai/superagent/libs/superagent/prisma/migrations/20240213050512_llm_agent
greptileai/superagent/libs/superagent/prisma/migrations/20240213051312_add_perplexity
greptileai/superagent/libs/superagent/prisma/migrations/20240213203005_add_togetherai

