Version
v1.41.1731027960
Describe the bug
OpenRouter does not work with Cody
Added this config in settings.json:
{
  "provider": "groq", // keep groq as provider
  "model": "qwen/qwen-2.5-coder-32b-instruct",
  "inputTokens": 128000,
  "outputTokens": 8192,
  "apiKey": "<api_key>",
  "apiEndpoint": "https://openrouter.ai/api/v1/chat/completions"
},
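For reference, here is a minimal sketch of the chat-completions request body that an OpenAI-compatible endpoint such as OpenRouter expects; the model field must carry the full provider-prefixed ID from the config above. This is an illustrative payload in the standard OpenAI chat-completions shape, not Cody's actual request code:

```python
import json

def build_request_body(model: str, prompt: str) -> str:
    """Build an OpenAI-compatible chat-completions request body.

    OpenRouter model IDs keep the "provider/model" form; truncating
    the prefix yields the 400 "not a valid model ID" error shown below.
    """
    body = {
        "model": model,  # must remain "qwen/qwen-2.5-coder-32b-instruct"
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

payload = build_request_body("qwen/qwen-2.5-coder-32b-instruct", "Hello")
assert json.loads(payload)["model"] == "qwen/qwen-2.5-coder-32b-instruct"
```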
Using groq as the provider with the OpenRouter endpoint (since groq is the provider that must be used for OpenAI-compatible APIs).
And this is the response:
Request Failed: HTTP 400 Bad Request: {"error":{"message":"qwen-2.5-coder-32b-instruct is not a valid model ID","code":400}}
It seems Cody is sending the model as qwen-2.5-coder-32b-instruct instead of qwen/qwen-2.5-coder-32b-instruct, dropping the qwen/ prefix.
Expected behavior
The expected behaviour is to get a response from the endpoint. Instead, Cody appears to send the model ID without its qwen/ prefix (qwen-2.5-coder-32b-instruct rather than qwen/qwen-2.5-coder-32b-instruct), which OpenRouter rejects.
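A hypothetical illustration of the suspected bug: if the client normalizes model IDs by keeping only the segment after the last slash, the provider prefix is lost. This is an assumption inferred from the error message, not Cody's actual implementation:

```python
def strip_provider_prefix(model_id: str) -> str:
    # Hypothetical normalization that would reproduce the observed
    # error: keeping only the last path segment drops the "qwen/"
    # provider prefix that OpenRouter requires.
    return model_id.split("/")[-1]

sent = strip_provider_prefix("qwen/qwen-2.5-coder-32b-instruct")
# OpenRouter rejects this truncated ID with HTTP 400.
assert sent == "qwen-2.5-coder-32b-instruct"
```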
Additional context
No response