
Allow setting default params in config? #82

Closed

vkryukov opened this issue Nov 21, 2024 · 1 comment

vkryukov commented Nov 21, 2024

Right now, there is an asymmetry between params and config in Instructor.chat_completion, which makes it harder to switch between different providers.

For example, to send the same message to Claude, Gemini, and ChatGPT, I would need slightly different code for each:

# Claude
Instructor.chat_completion(
    max_tokens: 4096, # Unique for Claude
    model: "claude-3-5-sonnet-latest",
    messages: [...]
)

# Gemini
Instructor.chat_completion(
    mode: :json_schema, # Unique for Gemini
    model: "gemini-exp-1114",
    messages: [...]
)

# ChatGPT
Instructor.chat_completion(
    model: "gpt-4o",
    messages: [...]
)

Ideally, I would just set these parameters (including even the default model) in my config.exs:

# config/config.exs
config :instructor,
  # adapter: Instructor.Adapters.Anthropic,
  adapter: Instructor.Adapters.Gemini,
  anthropic: [
    api_key: System.fetch_env!("ANTHROPIC_API_KEY"),
    max_tokens: 4096,
    model: "claude-3-5-sonnet-latest"
  ],
  gemini: [
    api_key: System.fetch_env!("GOOGLE_API_KEY"),
    mode: :json_schema,
    model: "gemini-exp-1114"
  ],
  openai: [
    api_key: System.fetch_env!("OPENAI_API_KEY"),
    model: "gpt-4o-mini"
  ]

Then my application code could simply become Instructor.chat_completion(messages: [...]), and I could switch to a different provider by changing only :adapter.
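
For illustration, if this proposal were implemented, every provider call could collapse to something like the following (MySchema is a placeholder for whatever response model the call uses):

{:ok, result} =
  Instructor.chat_completion(
    response_model: MySchema, # placeholder schema
    messages: [%{role: "user", content: "..."}]
  )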

@thmsmlr, if you are aligned with this approach, I can send a pull request.

thmsmlr commented Feb 9, 2025

I'm torn on this.

On the one hand, I totally see your point. It would be nice to set default params as part of the config, where you could set defaults for the mode, the model, really any set of parameters.

However, in practice, when you're switching between different models, you often have to tweak your prompts as well. So while it's superficially useful, in practice it won't yield the results you actually want.

My recommendation is to just create a wrapper function for the time being. When more concrete use cases arise, maybe we can revisit.
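
A minimal sketch of such a wrapper (the MyApp.LLM module name and the :instructor_defaults config key are illustrative conventions for this sketch, not part of Instructor):

defmodule MyApp.LLM do
  # Thin wrapper that merges application-level defaults into every Instructor call.
  # Defaults could live under an app-specific key, e.g. in config/config.exs:
  #
  #   config :my_app, :instructor_defaults,
  #     model: "gpt-4o-mini",
  #     max_tokens: 4096
  #
  def chat_completion(params) when is_list(params) do
    defaults = Application.get_env(:my_app, :instructor_defaults, [])

    # Explicit params win over the configured defaults.
    defaults
    |> Keyword.merge(params)
    |> Instructor.chat_completion()
  end
end

Call sites then stay uniform, e.g. MyApp.LLM.chat_completion(response_model: MySchema, messages: [...]), and switching providers means touching only the config.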

thmsmlr closed this as completed Feb 9, 2025