
Add module config for module_example and module_text_llm #44

Merged
97 commits from feature/llm-config merged into develop on Jul 17, 2023

Conversation

@FelixTJDietrich (Collaborator) commented on Jul 4, 2023

Motivation and Context

We want to test out multiple module configurations, mostly for evaluation purposes, but the LMS could also use this to have more control over the module. For my LLM modules I want to be able to configure multiple approaches, multiple LLMs, and also the prompts for the LLMs.

Another thing is that we might want to configure the module to run in debug mode or evaluation mode and send back debug or evaluation data. For the LLMs, for example, we would like to track the token usage, the exact prompts used, etc.

Description

Small illustration (see attached image)

List of changes

  • Add config_schema_provider decorator to athena (a GET /config_schema endpoint through which the module sends back its possible config options); see the sketch after this list
  • Also proxy GET requests in assessment_module_manager to modules
  • Add example config_schema_provider to module_example
  • Fix error that made module_example not run
  • Playground updates
    • Display /config_schema request in Module Requests as Get Config Schema
    • Display configs using react-jsonschema-forms in the BaseInfoHeader (it tries to render the schema if one is found; the config component can be customized)
    • Send module config with request headers to Athena (if enabled)
  • Add example of how to consume the config in module_example
  • Update module_text_llm to support Azure and non-Azure credentials at the same time
  • Add config_schema_provider to module_text_llm
    • Available models (LLMs)
    • Available approaches
    • Available prompts and their plain text (using the Monaco editor)
  • Rework OpenAI model config
  • Use config in module_text_llm
    • Use model
    • Use approach
    • Use prompts
  • Add emit_meta and get_meta to athena and include the emitted metadata with the request response (also shown in the sketch below)
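
The interplay of the config schema endpoint, the header-based config override, and emit_meta can be illustrated with a small self-contained FastAPI sketch. This is only a stand-in that mimics the behaviour described in the list, not the actual athena implementation: the decorator signature, the X-Module-Config header name, and the endpoint paths are assumptions.

```python
# Self-contained stand-in for the mechanism described above (NOT the real athena API).
# It mimics: the config_schema_provider decorator with its GET /config_schema endpoint,
# a header-based config override, and emit_meta. Pydantic v1-style calls are used.
from typing import Optional, Type

from fastapi import FastAPI, Header
from pydantic import BaseModel, Field

app = FastAPI()
_config_model: Optional[Type[BaseModel]] = None
_meta: dict = {}


def config_schema_provider(cls: Type[BaseModel]) -> Type[BaseModel]:
    """Register the pydantic model whose JSON schema is served at GET /config_schema."""
    global _config_model
    _config_model = cls
    return cls


def emit_meta(key, value):
    """Collect metadata that is sent back together with the module's response."""
    _meta[key] = value


@config_schema_provider
class Configuration(BaseModel):
    debug: bool = Field(False, description="Send debug metadata back with responses")
    model: str = Field("gpt-3.5-turbo", description="LLM to use")
    prompt: str = Field("Give feedback on the following submission:",
                        description="Prompt template")


@app.get("/config_schema")
def get_config_schema():
    # The assessment module manager proxies this GET request to the module; the
    # playground renders a form from the returned JSON schema.
    return _config_model.schema() if _config_model else {}


@app.post("/feedback_suggestions")
def feedback_suggestions(submission: dict,
                         x_module_config: Optional[str] = Header(default=None)):
    # An overridden config arrives as JSON in a request header (the header name
    # "X-Module-Config" is an assumption for this sketch).
    config = (Configuration.parse_raw(x_module_config)
              if x_module_config else Configuration())
    if config.debug:
        emit_meta("prompt", config.prompt)
    suggestions: list = []  # ... call the chosen LLM with config.model / config.prompt ...
    return {"data": suggestions, "meta": dict(_meta)}
```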

Steps for Testing

Prerequisites:
Update the environment variables for module_text_llm, or alternatively use module_example for testing (it is less cool, though).

  1. Do all requests in the playground without overriding the config
  2. Do all requests in the playground with overriding the config for module_text_llm in the BaseInfoHeader
    • Setting debug to true sends back metadata (prompt & result) with the feedback suggestions request
    • The prompt from the metadata should be the same as the one in the config (see the request sketch below)
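
To make step 2 concrete, a request with an overridden config could look roughly like the following. The header name, port, and payload are assumptions mirroring the sketch above; the playground sends this for you when the override is enabled.

```python
# Hypothetical client-side view of step 2 (the playground does this automatically).
# Header name and endpoint path mirror the sketch above and are assumptions,
# not the exact playground implementation.
import json

import requests

override = {"debug": True, "model": "gpt-4", "prompt": "Grade this text submission:"}

response = requests.post(
    "http://localhost:5001/feedback_suggestions",  # example module URL
    json={"id": 1, "text": "An example student submission."},
    headers={"X-Module-Config": json.dumps(override)},
)

body = response.json()
# With debug set to true, the emitted metadata comes back with the response, so the
# prompt in the metadata should match the prompt from the overridden config.
print(body.get("meta", {}).get("prompt"))
```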

Screenshots

Screenshot: localhost_3000_playground (2)

Screenshot cut off....

Screenshot: localhost_3000_playground (5)

@FelixTJDietrich FelixTJDietrich changed the title Add switch LLM model config to playground Add switch LLM config to playground Jul 4, 2023
@FelixTJDietrich FelixTJDietrich changed the title Add switch LLM config to playground Add LLM config to playground Jul 4, 2023
@FelixTJDietrich FelixTJDietrich mentioned this pull request Jun 30, 2023
@FelixTJDietrich FelixTJDietrich changed the title Add LLM config to playground Add LLM and prompt config to playground Jul 4, 2023
@FelixTJDietrich FelixTJDietrich changed the title Add LLM and prompt config to playground Add module config for LLM modules for changing approach, model, and prompts Jul 7, 2023
@FelixTJDietrich FelixTJDietrich changed the title Add module config for LLM modules for changing approach, model, and prompts Add module config for module_example and module_text_llm Jul 9, 2023
@FelixTJDietrich (Collaborator, Author) commented:

The error you experienced with my module is still on my todo list. I will replace the CSV parsing with JSON parsing and improve error handling so this does not happen anymore :S

@FelixTJDietrich FelixTJDietrich requested a review from pal03377 July 14, 2023 20:03

@pal03377 (Contributor) left a comment:

Nice changes! The only larger problems I still have left are in request_to_module.py. I also tested it locally again.

Also, I notice that endpoints.py is getting more and more complex (which is not really your fault). We might need to think of some refactoring in the future.

@FelixTJDietrich FelixTJDietrich requested a review from pal03377 July 17, 2023 09:37

@pal03377 (Contributor) left a comment:

Code looks good & I tested it again locally. Thanks for the contribution, very cool changes! I'll merge it right now.

@pal03377 pal03377 merged commit 77fa249 into develop Jul 17, 2023
@pal03377 pal03377 deleted the feature/llm-config branch August 14, 2023 15:45