Switch to llama-simple-chat #454
Conversation
Reviewer's Guide by Sourcery

This PR switches the chat implementation from llama-cli to llama-simple-chat.

Sequence diagram for switching to llama-simple-chat

```mermaid
sequenceDiagram
    actor User
    participant Model as Model
    participant LlamaSimpleChat as llama-simple-chat
    User->>Model: Run chat
    Model->>LlamaSimpleChat: Execute with common_params
    LlamaSimpleChat-->>Model: Processed chat response
    Model-->>User: Return chat response
```
Updated class diagram for model execution

```mermaid
classDiagram
    class Model {
        -exec_model_path
        -exec_args
        +run(args)
    }
    note for Model "Updated to use llama-simple-chat with common_params"
```
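For context, here is a minimal sketch of what the updated `Model.run` path might look like after the switch. The flag values and the `exec_cmd` helper are assumptions for illustration, not the PR's actual code:

```python
# Hypothetical sketch; argument values and the exec_cmd helper are
# assumed for illustration, not taken from the PR.
import os


def exec_cmd(args):
    # Replace the current process with the chat program (assumed helper).
    os.execvp(args[0], args)


class Model:
    def __init__(self, model_path):
        self.exec_model_path = model_path
        # llama-simple-chat needs little more than a model path and a
        # context size, which is part of why it is simpler to drive.
        self.exec_args = ["llama-simple-chat", "-m", self.exec_model_path,
                          "-c", "2048"]

    def run(self, args):
        # args from the CLI would be folded into exec_args here.
        exec_cmd(self.exec_args)
```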
Hey @ericcurtin - I've reviewed your changes - here's some feedback:
Overall Comments:
- Please document the functional differences between llama-cli and llama-simple-chat, particularly how features like prompt handling and prefix/suffix parameters are handled in the new implementation.
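To make that difference concrete, here is a hedged sketch of how the two invocations might compare. The llama-cli flags shown are standard upstream options, but the exact argument lists ramalama builds are assumptions for illustration:

```python
# Illustrative comparison only; these argument lists are assumed,
# not copied from ramalama's code.
model_path = "/path/to/model.gguf"

# Old approach: llama-cli exposes prompt and prefix/suffix controls
# directly, so they were passed on the command line.
llama_cli_args = [
    "llama-cli", "-m", model_path,
    "--in-prefix", "", "--in-suffix", "",
    "-cnv",  # conversation mode
]

# New approach: llama-simple-chat takes only a model and context size;
# chat templating is handled internally, so the prefix/suffix
# parameters have no direct equivalent.
llama_simple_chat_args = [
    "llama-simple-chat", "-m", model_path, "-c", "2048",
]
```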
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
This is a new chat program in llama.cpp which is much simpler than the existing one we were using. It doesn't have the debug/verbose output problem and seems higher quality in general for a simple chatbot; it's only a few hundred lines of code.

Signed-off-by: Eric Curtin <[email protected]>
Force-pushed from f573427 to 1db401d
This recently added llama-simple-chat program is better suited to our needs. Maybe we can actively start contributing to it; I already started:
LGTM
Summary by Sourcery
Switch to llama-simple-chat for a more streamlined, higher-quality chat experience; remove verbose options and update container configurations.