The way sampleLLM in server-everything assembles results is incompatible with the defined client response format #381

Closed
aiqlcom opened this issue Dec 19, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@aiqlcom
Contributor

aiqlcom commented Dec 19, 2024

Describe the bug

const result = await requestSampling(
  prompt,
  ToolName.SAMPLE_LLM,
  maxTokens,
);
return {
  content: [{ type: "text", text: `LLM sampling result: ${result}` }],
};

In the sampleLLM tool handler within @modelcontextprotocol/server-everything, the result returned by the client is interpolated directly into the response text.

However, according to the official definition, this result should be an object, such as:

{
  model: string,  // Name of the model used
  stopReason?: "endTurn" | "stopSequence" | "maxTokens" | string,
  role: "user" | "assistant",
  content: {
    type: "text" | "image",
    text?: string,
    data?: string,
    mimeType?: string
  }
}
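This shape appears to correspond to the CreateMessageResult schema exported by the TypeScript SDK's types module; a minimal sketch of validating and unwrapping a result under that assumption (extractText is a hypothetical helper, not part of the SDK or the server):

import { CreateMessageResultSchema } from "@modelcontextprotocol/sdk/types.js";
import { z } from "zod";

type CreateMessageResult = z.infer<typeof CreateMessageResultSchema>;

// Hypothetical helper: validate an untyped value and pull out the text, if any.
function extractText(raw: unknown): string | undefined {
  const result: CreateMessageResult = CreateMessageResultSchema.parse(raw);
  return result.content.type === "text" ? result.content.text : undefined;
}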

As a consequence, the final content returned displays as [object Object] instead of the actual text of the sampling result.
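A minimal illustration of why this happens: interpolating an object into a template literal calls its default toString(), which yields "[object Object]" (the object literal below is just a stand-in for the client's response):

const result = { content: { type: "text", text: "hello" } };
console.log(`LLM sampling result: ${result}`);               // LLM sampling result: [object Object]
console.log(`LLM sampling result: ${result.content.text}`);  // LLM sampling result: hello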

To Reproduce

Steps to reproduce the behavior:

  1. Initialize a client
  2. Define a request handler for the client, such as:
import { CreateMessageRequestSchema } from "@modelcontextprotocol/sdk/types.js";

client.setRequestHandler(CreateMessageRequestSchema, async (request) => {
    console.log(request);
    // Return a canned result in the format defined above.
    return {
        model: "test-sampling-model",
        stopReason: "endTurn",
        role: "assistant",
        content: {
            type: "text",
            text: "This is a test message from the client, used as the sampling LLM response",
        }
    };
});
  3. Trigger the sampleLLM tool call (a minimal end-to-end sketch follows below)
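For reference, a minimal end-to-end sketch of steps 1-3, assuming the server is launched over stdio via npx and that the sampleLLM tool takes prompt and maxTokens arguments (the client name and the prompt value here are arbitrary):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { CreateMessageRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Step 1: initialize a client that declares the sampling capability.
const client = new Client(
  { name: "sampling-repro-client", version: "1.0.0" },
  { capabilities: { sampling: {} } },
);

// Step 2: register the CreateMessageRequestSchema handler shown above.
client.setRequestHandler(CreateMessageRequestSchema, async () => ({
  model: "test-sampling-model",
  stopReason: "endTurn",
  role: "assistant",
  content: { type: "text", text: "canned sampling response" },
}));

// Connect to server-everything over stdio.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-everything"],
});
await client.connect(transport);

// Step 3: trigger the sampleLLM tool call; the returned text contains "[object Object]".
const toolResult = await client.callTool({
  name: "sampleLLM",
  arguments: { prompt: "Hello", maxTokens: 100 },
});
console.log(JSON.stringify(toolResult, null, 2));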

Expected behavior

After the server retrieves the sampling result, it should extract the relevant content and insert that, rather than embedding the entire result object in the string.

Logs
Console log:

{
    "content": [
        {
            "type": "text",
            "text": "LLM sampling result: [object Object]"
        }
    ]
}

Additional context
Possible fixes:
Proposal 1

            const result = await requestSampling(prompt, ToolName.SAMPLE_LLM, maxTokens);
            return {
                content: [{ type: "text", text: `LLM sampling result: ${result.content.text}` }],
            };

Proposal 2

            const result = await requestSampling(prompt, ToolName.SAMPLE_LLM, maxTokens);
            return {
                content: [result.content],
            };
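A more defensive variant (a sketch, not present in the current source) that also handles a non-text sampling result such as an image block:

            const result = await requestSampling(prompt, ToolName.SAMPLE_LLM, maxTokens);
            // Only text content can be interpolated directly; otherwise note the content type.
            const text =
                result.content.type === "text"
                    ? result.content.text
                    : `[non-text sampling result: ${result.content.type}]`;
            return {
                content: [{ type: "text", text: `LLM sampling result: ${text}` }],
            };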

If you confirm that this is an issue and agree with the proposed fix, I can submit a PR.

Alternatively, feel free to share any other suggestions or perspectives you might have.

aiqlcom added the bug label on Dec 19, 2024
aiqlcom closed this as completed on Jan 14, 2025