Constantly loses connection to LLM (both local and cloud) on long responses #4373

d00mus opened this issue Feb 26, 2025 · 0 comments
d00mus commented Feb 26, 2025


Relevant environment info

- OS: Windows 10
- Continue version: 0.8.68 for VS Code and 0.0.92 for IDEA
- IDE version: VS Code 1.96.4 and JetBrains IDEA 2023.1.7
- Model: Qwen 2.5 Coder 32B on local LM Studio; the same issue occurs with cloud DeepSeek V3
- config:
  
{
  "models": [
    {
      "apiBase": "http://127.0.0.1:1234/v1/",
      "title": "Lmstudio",
      "model": "qwen2.5-coder-32b",
      "contextLength": 65536,
      "provider": "lmstudio",
      "timeout": 6000000
    },
    {
      "title": "DeepSeek Coder",
      "model": "deepseek-coder",
      "contextLength": 128000,
      "apiKey": "xxx",
      "provider": "deepseek",
      "timeout": 6000000
    },
    {
      "title": "DeepSeek",
      "model": "deepseek-chat",
      "contextLength": 128000,
      "apiKey": "xxx",
      "provider": "deepseek",
      "timeout": 6000000
    },
    {
      "title": "DeepSeek R1",
      "model": "deepseek-reasoner",
      "contextLength": 128000,
      "apiKey": "xxx",
      "provider": "deepseek",
      "timeout": 60000
    }
  ]
}

Description

It just stops receiving the streamed response from the LLM after about 400 lines of generated code, and there are no related errors in the plugin logs.

But in the LM Studio server logs I'm seeing this:

2025-02-26 13:48:11 [INFO]
Finished streaming response
2025-02-26 13:48:11 [INFO]
[LM STUDIO SERVER] Client disconnected. Stopping generation... (If the model is busy processing the prompt, it will finish first.)

It looks like the plugin just closes the connection for some reason.

I tried increasing the timeout; it didn't help.

As a result, it just returns half of the file that was being generated at that moment.
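
To check whether the endpoint itself can finish a long stream when the extension is not in the loop, a standalone client can be pointed at the same LM Studio endpoint. This is a rough sketch assuming Python with the requests package; the URL and model name are taken from the config above, and the prompt is just a placeholder:

import requests

# Standalone streaming check against the same OpenAI-compatible LM Studio
# endpoint as in the config above. LM Studio streams Server-Sent Events:
# lines prefixed with "data: ", terminated by "data: [DONE]".
resp = requests.post(
    "http://127.0.0.1:1234/v1/chat/completions",
    json={
        "model": "qwen2.5-coder-32b",
        "messages": [{"role": "user", "content": "Generate an 800-line Java class."}],
        "stream": True,
    },
    stream=True,  # read the body incrementally instead of buffering it
    timeout=600,  # generous read timeout so this client doesn't give up first
)
resp.raise_for_status()

chunks = 0
for raw in resp.iter_lines():
    if not raw:
        continue  # skip the blank lines that separate SSE events
    if raw.decode("utf-8").strip() == "data: [DONE]":
        print(f"stream completed normally after {chunks} chunks")
        break
    chunks += 1

If this script also stops partway through, the problem is below the extension; if it reliably runs to [DONE] while the plugin still cuts off around 400 lines, the disconnect is coming from the extension's HTTP client. The same check would work for the cloud case by swapping in the DeepSeek base URL and an Authorization header.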

To reproduce

  1. Configure the extension to use cloud DeepSeek V3, or a local LLM via LM Studio with Qwen 2.5 Coder 32B (I didn't try other models).
  2. Ask it to fix a problem in a relatively big file, e.g. about 800 lines of Java code, and provide all the context (about 5 related files).
  3. Wait for it to start receiving the streamed response from the LLM.
  4. It will stop receiving generated code somewhere in the middle, leaving you with half of the fixed file.

The problem is reproducible with both the local LLM and the cloud DeepSeek, so I don't think it is related to my local LLM or to my internet connection. And I don't have any similar issues with any other tool I use.

Log output

This is the tail of the log. I don't think these lines relate to this particular problem (the timestamps differ), but the log does not contain anything else.
[2025-02-26T10:02:37] Error: Connection error. 
[2025-02-26T10:02:37] Error running handler for "llm/streamChat":  Error: Connection error.
[2025-02-26T10:05:10] Error: Connection error. 
[2025-02-26T10:05:10] Error running handler for "llm/streamChat":  Error: Connection error.
[2025-02-26T10:06:18] Error: Connection error. 
[2025-02-26T10:06:18] Error running handler for "llm/streamChat":  Error: Connection error.
[2025-02-26T10:06:42] Error: Connection error. 
[2025-02-26T10:06:42] Error running handler for "llm/streamChat":  Error: Connection error.
@dosubot dosubot bot added labels area:chat, ide:jetbrains, ide:vscode, kind:bug, priority:high on Feb 26, 2025
@d00mus changed the title from "Constantly loses connection to LLM (either local and cloud) on long responses" to "Constantly loses connection to LLM (both local and cloud) on long responses" on Feb 26, 2025