Zed AI: OpenRouter Support #16576
19 comments · 9 replies
-
It would be nice to have this support.
-
It would be awesome :)
-
Would love to have this too. Zed AI currently works with Anthropic only. Since OpenRouter has access to Anthropic models, I hope Zed AI can be used with OpenRouter too, so we can make use of prompt caching as well!
-
We need this!
-
Isn't this already supported? See #13276.
-
Still awaiting an OpenRouter config option?
-
OpenRouter's API is OpenAI compatible, so you can just point Zed's existing OpenAI provider at it, e.g.:
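For reference, a minimal settings.json sketch along the lines of the config shared later in this thread (the model entry is just an example, and since this reuses Zed's openai provider, OPENAI_API_KEY has to be set to the OpenRouter key):
"language_models": {
  "openai": {
    "api_url": "https://openrouter.ai/api/v1",
    "available_models": [
      {
        "name": "deepseek/deepseek-chat",
        "display_name": "DeepSeek V3 (OpenRouter)",
        "max_tokens": 64000
      }
    ],
    "version": "1"
  }
}
Presumably the display_name is just the label that shows up in Zed's model picker.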
-
Is there an example of configuring multiple models, like openai/chatgpt-4o-latest, o1-mini, and o1-preview?
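A rough sketch of what that could look like, assuming the OpenRouter model IDs are openai/chatgpt-4o-latest, openai/o1-mini, and openai/o1-preview, with placeholder max_tokens values (check the actual context limits on OpenRouter):
"language_models": {
  "openai": {
    "api_url": "https://openrouter.ai/api/v1",
    "available_models": [
      {
        "name": "openai/chatgpt-4o-latest",
        "display_name": "ChatGPT-4o (OpenRouter)",
        "max_tokens": 128000
      },
      {
        "name": "openai/o1-mini",
        "display_name": "o1-mini (OpenRouter)",
        "max_tokens": 128000
      },
      {
        "name": "openai/o1-preview",
        "display_name": "o1-preview (OpenRouter)",
        "max_tokens": 128000
      }
    ],
    "version": "1"
  }
}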
-
In addition to the existing LLM providers (e.g., the OpenAI API Key option), could there be an option for OpenRouter? (Run assistant: show configuration via shift+command+p.)
-
Does it work for anyone else?
-
Seems as if right now we can't "mix" multiple OpenAI(-compatible) APIs, because they would all have to share the same API key. It would be great if Zed removed this restriction so that you could use several different OpenAI-compatible APIs simultaneously. :)
-
+1, is the Zed team on this? It would massively open up the ability to use any model at token-level pricing.
-
Waiting on this as well. I just discovered Zed, and this missing integration is a deal-breaker: I have an OpenRouter API key but no Anthropic API key.
-
+1 for this. I want to throw deepseek-r1 at my problems!
-
After configuring this method in settings and replacing the key in OPENAI_API_KEY in .zshrc with the OpenRouter key (keeping OPENAI_API_KEY=sk-or-v1- … of the OpenRouter key), I tested a prompt; the OpenRouter activity page shows that the OpenAI o1-mini model was selected, not DeepSeek R1.
Has anyone else tried this and seen different activity?
On Jan 21, 2025, at 4:47 AM, Bálint Fülöp wrote:
I'm using this. Not perfect (it overrides openai) but works:
"language_models": {
"openai": {
"api_url": "https://openrouter.ai/api/v1",
"available_models": [
{
"name": "deepseek/deepseek-r1",
"display_name": "DeepSeek R1",
"max_tokens": 64000
},
{
"name": "deepseek/deepseek-chat",
"display_name": "DeepSeek V3",
"max_tokens": 64000
}
],
"version": "1"
}
},
Then in my .zshrc I replaced the OPENAI_API_KEY with the openrouter key.
-
~/.config/zed/settings.json updated … but how did you configure the 'openai' env variable via the Zed UI? DeepSeek R1 is not an option under my config options like in your screenshot.
On Jan 21, 2025 at 12:41:25 PM, Peter Conerly wrote:
Thanks! That worked. FYI @drivencompassion, I put these settings into ~/.config/zed/settings.json, and then configured the 'openai' env variable via the Zed UI.
(screenshot: Screenshot 2025-01-21 at 9.30.36 AM)
-
Tried rebooting and restarting the app, but the 3-dot options menu does not have an option for DeepSeek R1, even though settings has:
"language_models": {
"openai": {
"api_url": "https://openrouter.ai/api/v1",
"available_models": [
{
"name": "deepseek/deepseek-r1",
"display_name": "DeepSeek R1",
"max_tokens": 64000
},
{
"name": "deepseek/deepseek-chat",
"display_name": "DeepSeek V3",
"max_tokens": 64000
}
],
"version": "1"
}
}
On Jan 21, 2025 at 1:40:15 PM, Peter Conerly wrote:
In the context window there's a 3-dot button to open options, and one of those options is "Configure". I also gave Zed a restart, but I don't know if that was necessary.
(screenshot: Screenshot 2025-01-21 at 10.38.43 AM)
-
I've tried both OpenRouter and the experimental Gemini model via the "language_models" settings, but neither worked for me.
-
Hi, I'm from OpenRouter and a fan of Zed, and I would love to see OpenRouter integration as a new assistant provider.
OpenRouter supports dozens of the best coding models and hundreds of different models with one API key and an OpenAI-style schema.
I saw that some Zed community users have requested this feature.
Is there anything I can do to help? Thanks! 💙