DeepSeek API Error with response format active (text and json_object) #12713

Open
juniorsereno opened this issue Jan 14, 2025 · 4 comments · May be fixed by #12747
Labels
🐞 bug Something isn't working

Comments

@juniorsereno

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Create a chatflow
Add an LLM node
Choose deepseek-chat as the model
Enable the response format parameter
Try both text and json_object
The run then fails with the error below

✔️ Expected Behavior

A normal response from DeepSeek

❌ Actual Behavior

Run failed: [deepseek] Error: API request failed with status code 422: Failed to deserialize the JSON body into the target type: response_format.type: invalid type: map, expected variant identifier at line 1 column 152
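
For context, DeepSeek's OpenAI-compatible endpoint expects response_format as a flat {"type": ...} object, and the 422 above suggests the "type" field arrived as a nested map instead of a plain string. A minimal sketch of a direct call outside Dify (the placeholder key and prompt are only illustrative, not what Dify actually sends):

import requests

resp = requests.post(
    "https://api.deepseek.com/chat/completions",  # OpenAI-compatible endpoint
    headers={"Authorization": "Bearer <DEEPSEEK_API_KEY>"},  # placeholder key
    json={
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": "Reply with a JSON object."}],
        # A flat object is what the API expects; a nested map under "type"
        # would trigger the deserialization error shown above.
        "response_format": {"type": "json_object"},
    },
)
print(resp.status_code, resp.text)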

@dosubot dosubot bot added 🐞 bug Something isn't working 🤔 cant-reproduce We can not reproduce it or the information is limited labels Jan 14, 2025

dosubot bot commented Jan 14, 2025

I found a similar closed issue regarding the DeepSeek API error when using the response format. The issue was resolved by ensuring that the response_format parameter in model_parameters is correctly structured. Specifically, the response_format should be a dictionary with a key "type".

Here's the relevant code snippet that was used to fix the issue:

# {"response_format": "xx"} need convert to {"response_format": {"type": "xx"}}
if "response_format" in model_parameters:
    model_parameters["response_format"] = {"type": model_parameters.get("response_format")}

Make sure that when you set the response_format in model_parameters, it follows this structure:

model_parameters = {
    "response_format": {"type": "text"}  # or {"type": "json_object"}
}

This should help in resolving the JSON deserialization failure by ensuring the API receives the expected format [1][2].
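
As a quick sanity check (a sketch, not Dify's actual code path, and the helper name is only for illustration), applying that conversion once produces the shape DeepSeek accepts, while applying it to a value that is already a dict double-wraps it, which matches the "invalid type: map" shape in the 422 above:

def normalize_response_format(model_parameters: dict) -> dict:
    # {"response_format": "xx"} becomes {"response_format": {"type": "xx"}}
    if "response_format" in model_parameters:
        model_parameters["response_format"] = {"type": model_parameters["response_format"]}
    return model_parameters

print(normalize_response_format({"response_format": "json_object"}))
# {'response_format': {'type': 'json_object'}}  <- the shape DeepSeek accepts

print(normalize_response_format({"response_format": {"type": "json_object"}}))
# {'response_format': {'type': {'type': 'json_object'}}}  <- "type" is now a map,
# matching "response_format.type: invalid type: map, expected variant identifier"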

To continue talking to Dosu, mention @dosu.



@crazywoola crazywoola removed the 🤔 cant-reproduce We can not reproduce it or the information is limited label Jan 14, 2025
@crazywoola crazywoola reopened this Jan 14, 2025
@Designerxsh
Contributor

Same problem on Dify version 0.15.0.

@suntao2015005848

The issue has not been fixed, and the same problem persists.

@k99k5

k99k5 commented Jan 15, 2025

+1
