SGLang Llama 3.2 11b Multimodal Model #119
base: main
Conversation
model:
  id: "llama3_2-11b-vision-instruct"
  user_id: "meta"
It is better to leave user_id as user_id and app_id as app_id.
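A minimal sketch of what that could look like in config.yaml, assuming the app_id field sits next to the ones shown above (the placeholder values are illustrative and not part of this PR):

model:
  id: "llama3_2-11b-vision-instruct"
  user_id: "user_id"   # left as a placeholder for the uploader's own Clarifai user ID
  app_id: "app_id"     # left as a placeholder for the uploader's own Clarifai app ID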
See my tiny ask; fix and merge.
temperature=temperature,
max_tokens=max_tokens,
top_p=top_p,
extra_body={"stop_token_ids": [151645, 151643]},
This one is mysterious to me. If I change the model, do I need to change these? You may want to leave a comment here.
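One way to address this (a sketch only, not part of the PR, assuming the transformers tokenizer for the same checkpoint is available locally; get_stop_token_ids is a hypothetical helper name) is to derive the stop token IDs from the model's own tokenizer instead of hard-coding them, so they track whatever checkpoint is configured:

from transformers import AutoTokenizer

def get_stop_token_ids(checkpoint: str) -> list[int]:
    """Collect end-of-sequence / end-of-turn token IDs for the given checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    stop_ids = {tokenizer.eos_token_id}
    # Llama 3.x chat templates end turns with <|eot_id|>; include it if the vocab has it.
    eot_id = tokenizer.convert_tokens_to_ids("<|eot_id|>")
    if eot_id is not None and eot_id != tokenizer.unk_token_id:
        stop_ids.add(eot_id)
    return sorted(tid for tid in stop_ids if tid is not None)

# Usage at the call site, instead of the hard-coded list:
# extra_body={"stop_token_ids": get_stop_token_ids(checkpoints)}

That also documents, in code, which stop tokens are in play whenever someone swaps the model.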
# If a checkpoints section is present in the config.yaml file, checkpoints will be downloaded to this path during model upload.
# checkpoints = os.path.join(os.path.dirname(__file__), "checkpoints")

checkpoints = "unsloth/Llama-3.2-11B-Vision-Instruct"
Maybe we want to do this really high up in the code, as a variable, so it serves better as an example.
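For example (a sketch of the suggestion only; CHECKPOINT is a hypothetical constant name, not something already in this PR):

# Near the top of the module, so the one thing a reader is likely to change is easy to find.
CHECKPOINT = "unsloth/Llama-3.2-11B-Vision-Instruct"  # swap in another HF repo id to try a different model

# ... later, at the current assignment site:
checkpoints = CHECKPOINT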