Cannot use OpenRouter's free models #245
Comments
Same issue on my side (Mac): when using OpenRouter I cannot select any model other than claude-3.5-sonnet.
Hmm, it seems like it might be an issue with this API call failing. Any chance you have a firewall or something that would be blocking it? (Roo-Cline/src/core/webview/ClineProvider.ts, line 1047 in c30e9c6)
In some cases IPv6 is not set up correctly in WSL; you can try disabling IPv6 and testing again.
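For reference, a common way to disable IPv6 temporarily inside a WSL distribution (these are standard Linux sysctl keys; the change reverts on restart, and this is only a diagnostic step, not a fix endorsed by the maintainers):

```shell
# Temporarily disable IPv6 in the running WSL instance (reverts on reboot).
sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1
```

If the model list loads after this, the problem is likely IPv6 resolution rather than the extension itself.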
Yes, I have a VPN on my side, but I tried `curl "https://openrouter.ai/api/v1/models"` in the terminal and it returns a correct response.
@tonygao2k anything interesting in the network tab of the developer tools?
It's blank.
Which editor are you using? |
I am using VS Code, version 1.96.2 (Universal).
@tonygao2k can you give us a screenshot of the response?
Please find the attachment; these two commands return correct results in the VS Code terminal. As I said, I am using a VPN on my side; otherwise https://openrouter.ai/ is unreachable. Can you please check whether Roo-Cline applies "http_proxy" correctly from the environment variables or settings.json?
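The question above is whether the extension honors the standard proxy environment variables. A minimal sketch of what that check conventionally looks like — note that `resolveProxyUrl` is a hypothetical helper for illustration, not Roo-Cline's actual code:

```typescript
// Hypothetical sketch: resolving a proxy URL from the standard
// environment variables before making an outbound API request.
// Conventionally, HTTPS requests prefer https_proxy over http_proxy,
// and both lowercase and uppercase spellings are checked because
// different tools set different cases.
function resolveProxyUrl(
  env: Record<string, string | undefined>
): string | undefined {
  return (
    env.https_proxy ??
    env.HTTPS_PROXY ??
    env.http_proxy ??
    env.HTTP_PROXY
  );
}
```

If the extension skips this lookup (or ignores VS Code's `http.proxy` setting), requests from inside the extension would fail even though `curl` in the terminal, which does read these variables, succeeds.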
Which API Provider are you using?
OpenRouter
Which Model are you using?
google/gemini-2.0-flash-exp:free
What happened?
The extension displays this model as invalid. Please allow custom model names in the OpenRouter API provider.
Steps to reproduce
Relevant API REQUEST output
No response
Additional context
No response