Add the option to use a custom OpenAI endpoint and model. #44

Open
NekoMirra opened this issue Dec 5, 2024 · 2 comments

Comments

@NekoMirra

No description provided.

@h-siyuan
Collaborator

h-siyuan commented Dec 6, 2024

We understand this feature is a significant need for many users in specific regions; we will consider implementing it in a future update.

@sosacrazy126

@NekoMirra If you are referring to an OpenAI-compatible endpoint, it is already implemented. To use a custom OpenAI endpoint, a proxy server can be set up to ensure it matches OpenAI's function-calling conventions. This allows the OpenAI API client to interact with the specified endpoint effectively.
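
For illustration only, here is a minimal sketch of that approach using the official openai Python SDK (v1.x). The base URL, API key, and model name are placeholders for whatever the OpenAI-compatible proxy or server actually exposes, not values from this project:

```python
# Minimal sketch: point the OpenAI client at an OpenAI-compatible endpoint.
# The URL, key, and model below are placeholders, not values from this repo.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical OpenAI-compatible proxy/server
    api_key="sk-local-placeholder",       # many local servers accept any non-empty key
)

response = client.chat.completions.create(
    model="my-local-model",  # whatever model name the endpoint serves
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```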

@InfernalDread

> @NekoMirra If you are referring to an OpenAI-compatible endpoint, it is already implemented. To use a custom OpenAI endpoint, a proxy server can be set up to ensure it matches OpenAI's function-calling conventions. This allows the OpenAI API client to interact with the specified endpoint effectively.

How can you modify the files here to look for the locally run URL?
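
One hypothetical way to do that (not the repository's actual code) is to have the client construction read the endpoint from environment variables instead of hardcoding the OpenAI default, so a locally running server can be targeted without editing source files:

```python
# Hypothetical sketch: read the endpoint and key from environment variables,
# falling back to the standard OpenAI values when they are not set.
import os
from openai import OpenAI

base_url = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
api_key = os.environ.get("OPENAI_API_KEY", "sk-placeholder")

client = OpenAI(base_url=base_url, api_key=api_key)
```

With something like that in place, setting OPENAI_BASE_URL to the local server's address (for example, http://localhost:8000/v1) would route requests there without further changes.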
