
Add more settings to CompletionOptions #13

Open
sestinj opened this issue Sep 4, 2023 · 4 comments
Labels
enhancement New feature or request good first issue Good for newcomers

Comments

@sestinj
Contributor

sestinj commented Sep 4, 2023

Every LLM completion is passed a set of parameters in the CompletionOptions object.

We currently support common settings like max_tokens, temperature, top_p, top_k, frequency_penalty, and presence_penalty, but are missing things like tail-free sampling or certain mirostat parameters.

Some model providers, like llama.cpp, will accept these parameters, so supporting them is only a matter of allowing them to be passed through.

  1. Update CompletionOptions to have the parameter
  2. Many model providers (all are in the core/llm/llms folder) have a function called _convertArgs that turns the CompletionOptions object into the request body expected by their API. For the providers that support this parameter, make sure that it gets passed to the request. For the other providers, check that this extraneous parameter does not get sent in the request.
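The two steps above can be sketched roughly as follows. This is a hedged illustration, not the actual Continue code: the field names (`tfsZ`, `maxTokens`) and the request-body keys are assumptions, and the real `_convertArgs` implementations live per-provider in core/llm/llms.

```typescript
// Illustrative sketch of step 1 (extend CompletionOptions) and step 2
// (gate the new parameter in a provider's _convertArgs). Field names
// are assumptions, not taken verbatim from the Continue codebase.
interface CompletionOptions {
  maxTokens?: number;
  temperature?: number;
  topP?: number;
  topK?: number;
  // Step 1: new parameter, e.g. tail-free sampling.
  tfsZ?: number;
}

// Step 2: a provider that supports the parameter forwards it;
// the `undefined` check keeps it out of requests when unset.
function convertArgs(options: CompletionOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {
    max_tokens: options.maxTokens,
    temperature: options.temperature,
  };
  if (options.tfsZ !== undefined) {
    body.tfs_z = options.tfsZ;
  }
  return body;
}
```

Providers whose APIs reject unknown fields would simply omit the `tfs_z` branch, which is why the issue asks contributors to audit each provider's `_convertArgs`.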
@sestinj sestinj converted this from a draft issue Sep 4, 2023
@sestinj sestinj added the enhancement New feature or request label Sep 4, 2023
@TyDunn TyDunn added the good first issue Good for newcomers label Jan 3, 2024
@ffshreyansh

Hey @TyDunn, can I move forward with this one? Please let me know.

@TyDunn

TyDunn commented Sep 8, 2024

@ffshreyansh We'd love for you to contribute! Let's check with @sestinj about how to scope this one, if it still makes sense now.

@sestinj
Contributor Author

sestinj commented Sep 8, 2024

@ffshreyansh This is definitely still open for contribution! I think one of the last parameters we'll want to add is tfs_z, for tail-free sampling. It is supported only in Ollama and Llama.cpp as far as I know, as shown in their docs.

@gordonwu66

Hi @TyDunn, @sestinj, I'm interested in contributing and picking this up if it's still open. If it's already being worked on, I'd love to take a look at any other good first issues you may have!

Projects
Status: Good First Issues (Code)
Development

No branches or pull requests

4 participants