flexGrow behaviour - token limit reached #156
Comments
I would ensure that you're using the correct tokenizer for your model. Note that the default tokenizer this library ships with is the GPT 3 O100k tokenizer. Also, simply wrapping a string in a component with …
Thank you, @connor4312. I mistakenly thought it would be automatic. I'll check out …
That's right …
@connor4312 I have made some progress, thanks for your advice. Since I am sending a JSON, I am breaking the data up with custom logic. I do have a quick question, though: I have a prompt that consists of child elements, and I add a JSON payload to one of those children. The JSON has a budget of 7000 tokens.

```tsx
// dataPrompt.tsx
async render(state: DataPromptState, sizing: PromptSizing) {
  return (
    ...
    <ChunkedJsonPrompt data={this.props.data} />
  );
}
```
```tsx
// chunkedJsonPrompt.tsx
export class ChunkedJsonPrompt extends PromptElement<ChunkedJsonPromptProps, ChunkedJsonPromptState> {
  override async prepare(sizing: PromptSizing): Promise<ChunkedJsonPromptState> {
    // tokenCount is approx. 7000 here
    const tokenCount = await sizing.countTokens(JSON.stringify(summary, null, 2));
  }
  ...
}
```
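For context, the "custom logic" for breaking the JSON up might look something like the sketch below. This is a hypothetical illustration only: `chunkJsonArray` and `estimateTokens` are names I've made up, and the chars/4 heuristic is a crude stand-in for a real call to `sizing.countTokens`.

```typescript
// Rough heuristic: ~4 characters per token for English/JSON text.
// A real implementation would use the model's tokenizer instead.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Split a large JSON array into chunks that each stay under a token
// budget, so each chunk can be rendered (or dropped) independently.
function chunkJsonArray<T>(items: T[], tokenBudget: number): T[][] {
  const chunks: T[][] = [];
  let current: T[] = [];
  let currentTokens = 0;
  for (const item of items) {
    const itemTokens = estimateTokens(JSON.stringify(item));
    // Start a new chunk when adding this item would exceed the budget.
    if (current.length > 0 && currentTokens + itemTokens > tokenBudget) {
      chunks.push(current);
      current = [];
      currentTokens = 0;
    }
    current.push(item);
    currentTokens += itemTokens;
  }
  if (current.length > 0) {
    chunks.push(current);
  }
  return chunks;
}
```

Note that an over-budget single item still produces a chunk of its own here; a production version would need to decide whether to truncate or drop it.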
```ts
// extension.ts
const urtaPrompt = await renderPrompt(
  MyPrompt,
  { data: data },
  { modelMaxPromptTokens: maxTokensAllowed },
  request.model,
  undefined,
  token
);
// !!urtaPrompt.tokenCount is 39 here!! WHY?
log.info(
  `Analyzing data... Used Token: ${urtaPrompt.tokenCount}, Max Tokens: ${maxTokensAllowed}`
);
```

However, when I call …
The token count is from the …
My Copilot extension requires me to send a rather large JSON (1000 elements) to the model.
I frequently encountered token limits with string prompts, which is why I've made the switch to prompt-tsx. You can see my new component here: https://gist.github.com/onlyutkarsh/b972c5a0942d757de1c599e927069e3a.
I call the prompt as below.
Although I have `flexGrow`, I still see that all of the JSON data is sent, and I keep getting a "message exceeds token limit" error. I searched the prompt-tsx repo for more examples, but most use `flexGrow`, and I am unclear about what I am doing wrong.
… `DataPrompt`)? `dataPrompt.tokenCount` returns 24, which seems wrong given I have a large JSON.
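As a rough mental model of why `flexGrow` alone does not shrink anything (my own illustration, not prompt-tsx's actual implementation): siblings without `flexGrow` consume budget first, and a `flexGrow` element is then told, via its `PromptSizing`, how many tokens remain. It is still that element's job to emit less than the leftover.

```typescript
// Toy model of budget distribution, for illustration only.
// Fixed-size elements claim their tokens first; the flexGrow
// element receives whatever budget is left over.
interface Elem {
  tokens: number;
  flexGrow?: number;
}

function leftoverBudget(total: number, elems: Elem[]): number {
  // Sum the token cost of all non-flex siblings.
  const fixed = elems
    .filter((e) => !e.flexGrow)
    .reduce((sum, e) => sum + e.tokens, 0);
  // The flexGrow element is offered only what remains.
  return Math.max(0, total - fixed);
}
```

In this model, if the flex element ignores its offered budget and emits its full content anyway, the overall prompt exceeds the limit, which matches the behaviour described in this issue.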