modelmerge

English | Chinese

modelmerge is a powerful library designed to simplify and unify the use of different large language models, including GPT-3.5/4/4 Turbo/4o, o1-preview/o1-mini, DALL-E 3, Claude 2/3/3.5, Gemini 1.5 Pro/Flash, Vertex AI (Claude, Gemini), DuckDuckGo, and Groq. The library supports GPT-format function calls and has built-in Google search and URL summarization, which greatly enhance the practicality and flexibility of the models.

✨ Features

  • Multi-model support: Integrates a wide range of the latest large language models behind one interface.
  • Real-time interaction: Supports streaming queries, so model responses can be retrieved as they are generated.
  • Function expansion: Built-in function calling makes it easy to extend what the models can do; current plugins include DuckDuckGo and Google search, content summarization, DALL-E 3 drawing, arXiv paper summaries, current time, a code interpreter, and more.
  • Simple interface: Provides a concise, unified API that makes it easy to call and manage models.

Quick Start

The following is a guide on how to quickly integrate and use modelmerge in your Python project.

Install

First, you need to install modelmerge. It can be installed directly via pip:

pip install modelmerge

Usage example

The following is a simple example demonstrating how to use modelmerge to request the GPT-4o model and handle the returned streaming data:

from ModelMerge import chatgpt

# Initialize the model, set the API key and the selected model
bot = chatgpt(api_key="{YOUR_API_KEY}", engine="gpt-4o")

# Get response
result = bot.ask("python list use")

# Send request and get streaming response in real-time
for text in bot.ask_stream("python list use"):
    print(text, end="")

# Disable all plugins
bot = chatgpt(api_key="{YOUR_API_KEY}", engine="gpt-4o", use_plugins=False)
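
Because the library also manages conversation history internally (see the plugin section below), reusing one bot instance lets follow-up questions build on earlier turns. The sketch below uses only the calls shown above; the assumption that ask() and ask_stream() share the bot's conversation history is inferred from the library description rather than from a documented guarantee.

from ModelMerge import chatgpt

# Minimal multi-turn sketch (assumption: ask()/ask_stream() append to the
# conversation history kept by the bot instance)
bot = chatgpt(api_key="{YOUR_API_KEY}", engine="gpt-4o")

first = bot.ask("Give me three common Python list operations.")
print(first)

# Follow-up question that relies on the previous turn being remembered
for chunk in bot.ask_stream("Show a one-line example of the second one."):
    print(chunk, end="")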

🍃 Environment Variables

The following is a list of environment variables related to plugin settings:

Variable Name | Description | Required?
SEARCH | Whether to enable the search plugin. Default: True. | No
URL | Whether to enable the URL summary plugin. Default: True. | No
ARXIV | Whether to enable the arXiv paper abstract plugin. Default: False. | No
CODE | Whether to enable the code interpreter plugin. Default: False. | No
IMAGE | Whether to enable the image generation plugin. Default: False. | No
DATE | Whether to enable the date plugin. Default: False. | No
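
These variables can be set in the shell or from Python before the bot is created. The snippet below is a minimal sketch: the variable names come from the table above, while the string values "True"/"False" and the exact moment the library reads them are assumptions, so set them as early as possible in the process.

import os

# Enable the arXiv and code interpreter plugins and disable URL summaries
# for this process (string values are an assumption about how they are parsed)
os.environ["ARXIV"] = "True"
os.environ["CODE"] = "True"
os.environ["URL"] = "False"

from ModelMerge import chatgpt

bot = chatgpt(api_key="{YOUR_API_KEY}", engine="gpt-4o")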

Supported models

  • GPT-3.5/4/4 Turbo/4o
  • o1-preview/o1-mini
  • DALL-E 3
  • Claude 2/3/3.5
  • Gemini 1.5 Pro/Flash
  • Vertex AI (Claude, Gemini)
  • Groq
  • DuckDuckGo (gpt-4o-mini, claude-3-haiku, Meta-Llama-3.1-70B, Mixtral-8x7B)

🧩 Plugin

This project supports multiple plugins, including DuckDuckGo and Google search, URL summarization, arXiv paper summarization, DALL-E 3 drawing, and a code interpreter. You can enable or disable these plugins by setting the environment variables listed above.

  • How to develop a plugin?

All plugin-related code lives in the git submodule ModelMerge within this repository. ModelMerge is an independent repository that I developed to handle API requests, conversation history management, and other functions. When you clone this repository with git clone --recurse-submodules, ModelMerge is downloaded automatically. All plugin code in this repository is located at the relative path ModelMerge/src/ModelMerge/plugins, and you can add your own plugins in this directory. The plugin development process is as follows:

  1. Create a new Python file in the ModelMerge/src/ModelMerge/plugins directory, for example, myplugin.py. Import your plugin in the ModelMerge/src/ModelMerge/plugins/__init__.py file, for example, from .myplugin import MyPlugin.

  2. Add your plugin's OpenAI-format tool request body to the function_call_list variable in ModelMerge/src/ModelMerge/tools/chatgpt.py. Claude and Gemini tools do not need to be written separately: you only fill in the tool request body in the OpenAI format, and the program automatically converts it to the Claude/Gemini tool format when requesting the Gemini or Claude API. function_call_list is a dictionary where the key is the name of the plugin and the value is the plugin's request body. Please make sure the key names in the function_call_list dictionary are unique and do not duplicate existing plugin key names.

  3. Add a key-value pair to the PLUGINS dictionary in ModelMerge/src/ModelMerge/plugins/config.py. The key is the name of the plugin, and the value is the plugin's environment variable together with its default value. The default value acts as the plugin's switch: if it is True, the plugin is enabled by default; if it is False, the plugin is disabled by default and must be manually enabled by the user via the /info command.

  4. Finally, add the plugin invocation code to the get_tools_result_async function in ModelMerge/src/ModelMerge/plugins/config.py. When the bot needs to call a plugin, it calls this function, so your plugin's invocation code must go inside it. A minimal end-to-end sketch of these four steps follows the list.
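
To make the four steps concrete, here is a single runnable sketch that mirrors them in one file. Only the paths and names already mentioned above (function_call_list, PLUGINS, get_tools_result_async) come from the project; the plugin name get_city_time, the environment variable GET_CITY_TIME, the exact value format of the PLUGINS entry, and the dispatch arguments are all hypothetical, so treat this as a template rather than code to copy verbatim.

# Step 1: plugin implementation, to live in
# ModelMerge/src/ModelMerge/plugins/myplugin.py and be imported in __init__.py
def get_city_time(city: str) -> str:
    # A real plugin would look this up; the return value is sent back to the model
    return f"The current time in {city} is 12:00 (placeholder)."

# Step 2: OpenAI-format tool request body, to be added under a unique key
# in function_call_list in ModelMerge/src/ModelMerge/tools/chatgpt.py
get_city_time_tool = {
    "name": "get_city_time",
    "description": "Get the current time in a given city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

# Step 3: switch entry for the PLUGINS dictionary in plugins/config.py;
# the (environment variable, default value) pairing shown here is an assumption
get_city_time_switch = {"GET_CITY_TIME": False}

# Step 4: dispatch branch of the kind that goes inside get_tools_result_async
# in plugins/config.py (argument names are illustrative)
def dispatch(function_call_name: str, function_args: dict) -> str:
    if function_call_name == "get_city_time":
        return get_city_time(**function_args)
    return ""

if __name__ == "__main__":
    # Quick local check of the sketch
    print(dispatch("get_city_time", {"city": "Paris"}))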

After completing the above steps, your plugin will be ready to use. 🎉

License

This project is licensed under the MIT License.

Contribution

Contributions are welcome: please submit issues or pull requests on GitHub.

Contact Information

If you have any questions or need assistance, please contact us at [email protected].
