
Openrouter does not work with Cody #6109

Open
githubdebugger opened this issue Nov 12, 2024 · 1 comment
Labels
bug Something isn't working repo/cody

Comments

@githubdebugger

githubdebugger commented Nov 12, 2024

Version

v1.41.1731027960

Describe the bug

Openrouter does not work with Cody

Added this config in settings.json:

{
  "provider": "groq", // keep groq as provider
  "model": "qwen/qwen-2.5-coder-32b-instruct",
  "inputTokens": 128000,
  "outputTokens": 8192,
  "apiKey": "<api_key>",
  "apiEndpoint": "https://openrouter.ai/api/v1/chat/completions"
},

I'm using groq as the provider with the OpenRouter endpoint, since groq is the provider that must be used for OpenAI-compatible APIs.

And this is the response:

Request Failed: HTTP 400 Bad Request: {"error":{"message":"qwen-2.5-coder-32b-instruct is not a valid model ID","code":400}}

It seems Cody is sending the model as qwen-2.5-coder-32b-instruct instead of qwen/qwen-2.5-coder-32b-instruct.
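A minimal sketch of what the error message suggests is happening (this is a hypothesis, not Cody's actual code): if the configured model reference is naively split on "/" to separate a namespace from a model name, an OpenRouter ID that itself contains a slash loses its `qwen/` prefix.

```python
def naive_model_name(model_ref: str) -> str:
    # Assumed parsing: keep only the segment after the last "/".
    # This is a guess at the bug, not Cody's real implementation.
    return model_ref.split("/")[-1]

full_id = "qwen/qwen-2.5-coder-32b-instruct"
print(naive_model_name(full_id))  # prints "qwen-2.5-coder-32b-instruct"
```

The truncated result matches the model ID that OpenRouter rejects in the HTTP 400 response below.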

Expected behavior

Cody should send the full model ID, qwen/qwen-2.5-coder-32b-instruct, and a successful response should be returned from the endpoint.

Additional context

No response

@githubdebugger githubdebugger added bug Something isn't working repo/cody labels Nov 12, 2024

linear bot commented Nov 12, 2024
