
"Model schema not found" error only in agents #12655

Open
5 tasks done
profonline opened this issue Jan 12, 2025 · 21 comments · May be fixed by #12760
Labels
🐞 bug Something isn't working 🌚 invalid This doesn't seem right

Comments

@profonline

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.0

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. Install the smollm model in Ollama
  2. Create an agent with this model
  3. Start a conversation with the agent

✔️ Expected Behavior

I was expecting to get the results from the agent.

❌ Actual Behavior

A "Model schema not found" error occurred instead of the actual answer from the agent. This happened only with agents; there is no problem at all with chatbots or completions.

@dosubot dosubot bot added 🐞 bug Something isn't working 🤖 feat:agent Agent related topics labels Jan 12, 2025

dosubot bot commented Jan 12, 2025

Hello @profonline! I'm here to help you with bug reports, answer questions, and assist you in becoming a contributor. Let's work together to resolve any issues you may be experiencing.

The "Model schema not found" error in Dify's agent module typically occurs when the model_schema is not retrieved or lacks necessary features. This can happen if the get_model_schema method of the LargeLanguageModel class returns None or an empty schema for the specified model and credentials. Additionally, ensure that the model instance is of type LargeLanguageModel, as the error can also be raised if it is not [1][2].

To resolve this issue, verify that the get_model_schema method is correctly implemented for the smollm model and that it returns a valid schema. Also, ensure that the model instance is properly configured as a LargeLanguageModel.

To continue talking to Dosu, mention @dosu.



@vincentvc168

vincentvc168 commented Jan 13, 2025

I got the same error, in agents only. Is it a new bug?

@crazywoola
Member

This error is raised by these lines:

if not model_schema or not model_schema.features:
    raise ValueError("Model schema not found")

Could you provide the model settings from both the orchestration page and the settings page?
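
For illustration, here is a minimal, self-contained mock of when that guard fires (the ModelSchema class and validate_agent_model function below are stand-ins invented for this sketch, not Dify's actual source): the error is raised both when no schema is returned and when the schema advertises an empty features list, which is why models without function-calling support fail only in agent apps.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ModelSchema:
    """Stand-in for the schema object returned by get_model_schema()."""
    model: str
    features: List[str] = field(default_factory=list)  # e.g. ["tool-call", "vision"]


def validate_agent_model(model_schema: Optional[ModelSchema]) -> None:
    # Same condition as the guard quoted above: missing schema OR empty features.
    if not model_schema or not model_schema.features:
        raise ValueError("Model schema not found")


try:
    # A model whose provider entry enables no features reproduces the error.
    validate_agent_model(ModelSchema(model="smollm"))
except ValueError as err:
    print(err)  # -> Model schema not found

# Enabling function-call support in the provider settings populates `features`,
# so the same model then passes the check.
validate_agent_model(ModelSchema(model="smollm", features=["tool-call"]))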

@vincentvc168

How can I solve this problem via the UI? Dify does not let me debug it at the code level.

@crazywoola
Member

Please provide the necessary information as I mentioned above.

@crazywoola crazywoola added 🌚 invalid This doesn't seem right and removed 🤖 feat:agent Agent related topics labels Jan 13, 2025
@hsoftxl

hsoftxl commented Jan 13, 2025

same error

@zhuzhe1983

same error

@MitraSafarinejad

MitraSafarinejad commented Jan 13, 2025

Hello @crazywoola,

I'm encountering the same error specifically with my Llama models. I am using Together AI as the provider for these models, and I have kept the default settings.
Dify version: 0.14.2

(screenshot attached)

@vincentvc168

I encountered the same error ("Model schema not found") in an agent with the Qwen and Mistral models. I did not try others, but I expect the result would be the same. Thanks!
Remark: I upgraded to 0.15.1, but the error is the same.

@vincentvc168

vincentvc168 commented Jan 13, 2025

This error is raised by these lines:

if not model_schema or not model_schema.features:
    raise ValueError("Model schema not found")

Could you provide the model settings from both the orchestration page and the settings page?

Hello @crazywoola, I am a beginner and do not know exactly which settings you mean, so I have attached my settings file here: .zip Thank you!

@JiweiZh

JiweiZh commented Jan 14, 2025

Same error when I use an agent with Ollama.

@hansheng654

same

@suntao2015005848

The Spark model has this problem too:
(Screenshot 2025-01-14 13:13:58 attached)

@JiweiZh

JiweiZh commented Jan 14, 2025

When I enabled function call support in the Ollama model configuration, this issue no longer occurred.

@vincentvc168

When I enabled function call support in the Ollama model configuration, this issue no longer occurred.

Hello @JiweiZh, you saved me! Thank you very much! 😘💕

@jiandanfeng

If it is a newly created agent, you cannot even select a model that has function calling disabled.

@joe-ouyang

Edit the file '/app/api/core/app/apps/agent_chat/app_runner.py' in the docker-api container and delete the code in the red box (see the screenshot below). When using an OpenAI-API-compatible LLM that does not support Function calling, Stream function calling, or Vision, the features array will be empty.
(screenshot of the code to delete attached)
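
If you take this route, the change amounts to removing or softening the guard quoted earlier in the thread. A sketch of the softer variant, assuming the same two-line check shown above (the exact surrounding code in app_runner.py may differ between Dify versions):

# Before (strict): rejects a model whose schema has an empty `features` list.
# if not model_schema or not model_schema.features:
#     raise ValueError("Model schema not found")

# After (relaxed): only reject a truly missing schema, so models without
# function calling or vision can still be selected by agents.
if not model_schema:
    raise ValueError("Model schema not found")

Note that, as jiandanfeng points out below, the agent still expects the model to support some inference mode, so loosening the check may simply move the failure to a later step.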

@vincentvc168

If it is a newly created agent, you cannot even select a model that has function calling disabled.

Function calling is enabled directly in the model settings. It is not related to the agent.

@profonline
Author

If it is a newly created agent, you cannot even select a model that has function calling disabled.

Function calling is enabled directly in the model settings. It is not related to the agent.

Hi, could you please tell me how to do it?

@jiandanfeng

jiandanfeng commented Jan 15, 2025

If it is a newly created agent, you cannot even select a model that has function calling disabled.

Function calling is enabled directly in the model settings. It is not related to the agent.

The agent needs the LLM to have an inference mode; function calling is one of those inference modes, so it is relevant.

@zhangsong8888

(screenshot attached)
Let me solve this error directly.
