"Model schema not found" error only in agents #12655
Comments
Hello @profonline! I'm here to help you with bug reports, answer questions, and assist you in becoming a contributor. Let's work together to resolve any issues you may be experiencing. The "Model schema not found" error in Dify's agent module typically occurs when the […]. To resolve this issue, verify that the […].
I got the same error, in agents only. Is this a new bug?
This error is raised by these lines: dify/api/core/app/apps/agent_chat/app_runner.py, lines 205 to 206 in 56e15d0.
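For readers without access to the source, here is a minimal sketch of the kind of check those lines perform; the function and field names below are illustrative assumptions, not the exact Dify implementation.

```python
# Minimal sketch (not the actual Dify code): the runner looks up a schema for
# the configured model and raises when the provider does not return one.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ModelSchema:
    model: str
    features: list = field(default_factory=list)


def get_model_schema(model: str, credentials: dict) -> Optional[ModelSchema]:
    # Placeholder provider lookup: returns None when the provider plugin does
    # not expose a schema entry for this model name.
    known = {"llama-3-70b-instruct": ModelSchema("llama-3-70b-instruct", ["tool-call"])}
    return known.get(model)


def run_agent(model: str, credentials: dict) -> str:
    model_schema = get_model_schema(model, credentials)
    if not model_schema:
        # The condition that surfaces in the UI as "Model schema not found".
        raise ValueError("Model schema not found")
    return f"running agent with {model_schema.model}"


print(run_agent("llama-3-70b-instruct", {}))  # ok
# run_agent("some-unregistered-model", {})    # would raise the error above
```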
Can you provide the settings of the model on the orchestration page and the settings page?
How can I solve this problem via the UI? Dify does not allow me to debug it at the code level.
Please provide the necessary information as I mentioned above. |
same error |
Hello @crazywoola, I'm encountering the same error specifically with my Llama models. I am using Together AI as the provider for these models, and I have kept the default settings. |
I encountered the same error, "Model schema not found", in an agent with Qwen and Mistral models. I did not try others, but I expect the result would be the same. Thanks!
Hello @crazywoola, I am a beginner, so I do not know the exact settings you want. I copied the settings file here.
Same error when I use an agent with Ollama.
same |
When I enabled function call support in the Ollama model configuration, this issue no longer occurred.
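If you want to sanity-check whether the underlying Ollama model itself accepts tool definitions (independent of the Dify setting), a request like the one below against Ollama's /api/chat endpoint can help; the model name and local URL here are assumptions to adjust, and a model without tool support will typically return an error instead of a tool call.

```python
# Rough sanity check (assumes Ollama is running locally on the default port)
# that a model accepts a tool definition via Ollama's /api/chat endpoint.
import json
import urllib.request

payload = {
    "model": "qwen2.5",  # replace with the model you registered in Dify
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))
```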
Hello @JiweiZh, you saved me! Thank you very much! 😘💕
If it is a newly added agent, you cannot select a model that has function calling disabled at all.
Function calling is enabled directly in the model settings; it is not related to the agent.
Hi, could you please tell me how to do it? |
The agent needs the LLM to have an inference mode, and function calling is one of those inference modes, so it is related.
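As a rough illustration of that point (a hypothetical sketch, not Dify's actual API): an agent runner can only pick a function-calling inference mode when the model schema declares that capability, and otherwise has to fall back to a prompt-based (ReAct-style) mode.

```python
# Hypothetical sketch: choosing an agent inference mode from the capabilities
# a model schema declares. Names are illustrative, not Dify's real classes.
from enum import Enum


class Strategy(Enum):
    FUNCTION_CALLING = "function-calling"   # native tool/function calls
    REACT = "react"                         # prompt-based tool-use fallback


def choose_strategy(declared_features: list) -> Strategy:
    if "tool-call" in declared_features or "multi-tool-call" in declared_features:
        return Strategy.FUNCTION_CALLING
    return Strategy.REACT


print(choose_strategy(["tool-call"]))  # Strategy.FUNCTION_CALLING
print(choose_strategy([]))             # Strategy.REACT
```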
Self Checks
Dify version
0.15.0
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
✔️ Expected Behavior
I was expecting to get the results from the agent.
❌ Actual Behavior
A "Model schema not found" error occurred instead of the actual answer from the agent. And this happened only with agents. No problem at all with chatbots or completions.