-
Oh, thank you!
-
Hi,
I recently saw a great video showing how to use Hugging Face large language models with PyTorch.
They used very simple Python code like the snippet below.
Is it possible to do this with TorchSharp?
I would really love to have a go at this if it is possible.
Thank you!
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, GenerationConfig

line = 'What color is the undoubtedly beautiful sky?'
model_name = 'google/flan-t5-xl'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
config = GenerationConfig(max_new_tokens=200)
...
The snippet is from this excellent video: https://www.youtube.com/watch?v=tL1zltXuHO8
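For context, the elided part of the example typically continues along these lines. This is a minimal sketch assuming the standard Hugging Face transformers `generate()` API; the `inputs`/`outputs` variable names are mine, not from the video:

```python
# Minimal sketch (assumption: standard transformers generate() usage;
# variable names are illustrative, not from the original example).
inputs = tokenizer(line, return_tensors='pt')                   # tokenize the prompt
outputs = model.generate(**inputs, generation_config=config)    # run seq2seq generation
print(tokenizer.decode(outputs[0], skip_special_tokens=True))   # decode tokens back to text
```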