
Unable to correctly use Ollama-deployed models with the official Agent strategy nodes #368

Open
Beeovan opened this issue Mar 3, 2025 · 1 comment

Comments

Beeovan commented Mar 3, 2025

I. Problem Description

In a Dify 1.0.0 workflow (chatflow), calling a model deployed with Ollama through Dify's official Agent strategy nodes fails. The workflow is expected to invoke the Ollama-deployed model via the Agent strategy node for conversational interaction, but an error occurs during execution and the model cannot complete the task. The same type of model runs the strategy node normally when served by a provider other than Ollama. The problem persists in the newly released Agent strategy plugin, version 0.0.9.


II. Error Message

Run failed: Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"ValidationError","message":"1 validation error for FunctionCallingParams\nmodel.entity\n Input should be a valid dictionary or instance of AIModelEntity [type=model_type, input_value=None, input_type=NoneType]\n For further information visit https://errors.pydantic.dev/2.8/v/model_type"}
This error message indicates a validation failure in FunctionCallingParams: model.entity must be a valid dictionary or an instance of AIModelEntity, but the actual input is None. In other words, the plugin appears to receive no model entity metadata at all when the provider is Ollama.
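The error shape can be reproduced with a minimal Pydantic v2 sketch. The class definitions below are illustrative stand-ins, not Dify's actual implementation; they only show that a nested model field typed as `AIModelEntity` rejects `None` with exactly the `model_type` error seen in the log.

```python
# Minimal sketch (NOT Dify's real classes) reproducing the Pydantic v2
# error from the log: a required nested-model field receives None.
from pydantic import BaseModel, ValidationError


class AIModelEntity(BaseModel):
    """Stand-in for the model metadata the provider is expected to supply."""
    model: str


class ModelConfig(BaseModel):
    entity: AIModelEntity  # required; None is rejected by validation


class FunctionCallingParams(BaseModel):
    model: ModelConfig


try:
    # Simulate the provider returning no entity for the model.
    FunctionCallingParams(model={"entity": None})
except ValidationError as e:
    err = e.errors()[0]
    # Matches the log: type=model_type, location model.entity,
    # "Input should be a valid dictionary or instance of AIModelEntity"
    print(err["type"], err["loc"])
```

This suggests the bug is upstream of validation: whatever assembles FunctionCallingParams is passing `None` where the Ollama provider's model entity should be.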

III. Environment Information

Operating System: WSL2 Ubuntu 22.04 LTS
Dify Version: 1.0.0
Ollama Version: 0.15.2

IV. Supplementary Notes

The Ollama service has been checked and responds normally to standalone requests, so the Ollama deployment itself appears to be working.
The model parameters of the Agent strategy node have been reconfigured, but the problem persists.
I hope the Dify maintainers can help troubleshoot and resolve this. Thank you very much, and please let me know if you need more information.
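The standalone-request check mentioned above can be scripted. The sketch below queries Ollama's documented `GET /api/tags` endpoint (default port 11434), which lists the locally installed models; the function name and return convention are my own choices for illustration.

```python
# Quick health check for a local Ollama server, stdlib only:
# GET /api/tags returns the installed models as JSON.
import json
import urllib.error
import urllib.request


def ollama_models(base_url: str = "http://localhost:11434",
                  timeout: float = 5.0):
    """Return the model names Ollama reports, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags",
                                    timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None


if __name__ == "__main__":
    models = ollama_models()
    if models is None:
        print("Ollama is not reachable")
    else:
        print("Ollama is up; models:", models)
```

If this prints the expected model list while the Agent strategy node still fails, the fault is in how the plugin builds its model parameters rather than in the Ollama deployment.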

@EcoleKeine (Contributor) commented:

Same problem here.
Model: volcengine_maas + deepseek-v3
