
Feature Request: Automatic Tracing? #113

Open
TylerLeonhardt opened this issue Oct 16, 2024 · 0 comments
@TylerLeonhardt (Member)
I wrote this fun helper:

import {
    BasePromptElementProps,
    HTMLTracer,
    IChatEndpointInfo,
    PromptElementCtor,
    PromptRenderer,
    RenderPromptResult,
    toVsCodeChatMessages
} from "@vscode/prompt-tsx";
import { AnyTokenizer } from "@vscode/prompt-tsx/dist/base/tokenizer/tokenizer";
import {
    CancellationToken,
    ExtensionMode,
    LanguageModelChat,
    LanguageModelChatMessage,
    ProgressLocation,
    window
} from "vscode";

let tracingEnabled = false;
// Only allow toggling tracing while the extension runs in development mode.
export function toggleTracing(extensionMode: ExtensionMode) {
    if (extensionMode === ExtensionMode.Development) {
        tracingEnabled = !tracingEnabled;
    }
    return tracingEnabled;
}

export async function renderPrompt<P extends BasePromptElementProps>(endpoint: IChatEndpointInfo, ctor: PromptElementCtor<P, any>, props: P, model: LanguageModelChat, token: CancellationToken) {
    // Wrap countTokens in an arrow function so it keeps `model` as its `this`.
    const renderer = new PromptRenderer(endpoint, ctor, props, new AnyTokenizer((text, tok) => model.countTokens(text, tok)));
    let result: RenderPromptResult;
    if (tracingEnabled) {
        // Attach an HTML tracer and serve the rendered trace locally.
        const tracer = new HTMLTracer();
        renderer.tracer = tracer;
        result = await renderer.render(undefined, token);
        const server = await tracer.serveHTML();
        // Show a cancellable notification with the server address; the server
        // (and this function) stays alive until the notification is dismissed.
        await window.withProgress({ location: ProgressLocation.Notification, title: `Tracer server running at ${server.address}`, cancellable: true }, async (_progress, progressToken) => {
            await new Promise<void>(resolve => {
                const listener = progressToken.onCancellationRequested(() => {
                    listener.dispose();
                    server.dispose();
                    resolve();
                });
            });
        });
    } else {
        result = await renderer.render(undefined, token);
    }

    return {
        messages: toVsCodeChatMessages(result.messages) as LanguageModelChatMessage[],
        references: result.references
    };
}

I use this while debugging my extension: after running my Toggle Prompt Tracing command, it starts a server every time I render a prompt so I can inspect it.

It would be cool to have something like this baked into prompt-tsx. Maybe it could remember the last X prompts and just kick in automatically when debugging an extension. It could log the address to the debug console or something.
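The "remember the last X prompts" part could be as simple as a bounded history that the renderer pushes each trace into. A minimal sketch (`TraceHistory` is a hypothetical name invented here, not an existing prompt-tsx API):

```typescript
// Hypothetical sketch of a bounded trace history: keeps only the most
// recent `capacity` entries, evicting the oldest as new ones arrive.
export class TraceHistory<T> {
    private readonly items: T[] = [];

    constructor(private readonly capacity: number) {
        if (capacity < 1) {
            throw new Error("capacity must be at least 1");
        }
    }

    /** Record a new trace, dropping the oldest once capacity is exceeded. */
    push(item: T): void {
        this.items.push(item);
        if (this.items.length > this.capacity) {
            this.items.shift();
        }
    }

    /** Return the retained traces, oldest first. */
    entries(): readonly T[] {
        return [...this.items];
    }
}
```

A built-in tracer could push into something like this on every render while the extension host is under a debugger, then serve the retained traces on demand.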

In short, let's make tracing really easy to enable so debugging these prompts doesn't have to be so hard. It's a big selling point for prompt-tsx.

@connor4312 connor4312 added the feature-request Request for new features or functionality label Dec 11, 2024