keras_nlp option to store and load from the local drive? #1050
-
Hi, for example, when I call the following code, I see that it downloads from the network. Instead, I would prefer to store the files and load them from the local drive. Is there such an option in keras_nlp?
Replies: 1 comment
-
Yes, @ramkumarkoppu! It's a little involved, but the easiest workflow would be this (colab version). The same workflow applies if you finetune the weights with a training loop: just save them and load them back into the backbone later (see the sketch after the example below).

import keras
import keras_nlp

# Download the weights and tokenizer assets directly, or save the
# relevant attributes off the model.
weights_file = keras.utils.get_file(
    "model.h5",
    "https://storage.googleapis.com/keras-nlp/models/gpt2_base_en/v1/model.h5",
)
vocab_file = keras.utils.get_file(
    "vocab.json",
    "https://storage.googleapis.com/keras-nlp/models/gpt2_base_en/v1/vocab.json",
)
merges_file = keras.utils.get_file(
    "merges.txt",
    "https://storage.googleapis.com/keras-nlp/models/gpt2_base_en/v1/merges.txt",
)

# Build the tokenizer and preprocessor from the local files.
gpt2_tokenizer = keras_nlp.models.GPT2Tokenizer(
    vocabulary=vocab_file,
    merges=merges_file,
)
gpt2_preprocessor = keras_nlp.models.GPT2CausalLMPreprocessor(
    tokenizer=gpt2_tokenizer,
    sequence_length=256,
    add_end_token=True,
)

# Instantiate the backbone architecture without downloading preset
# weights, then load the local weights file into it.
gpt2_backbone = keras_nlp.models.GPT2Backbone.from_preset(
    "gpt2_base_en",
    load_weights=False,
)
gpt2_backbone.load_weights(weights_file)

# Assemble the full causal LM from the locally loaded pieces.
gpt2_lm = keras_nlp.models.GPT2CausalLM(
    backbone=gpt2_backbone,
    preprocessor=gpt2_preprocessor,
)
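For the fine-tuning case mentioned above, a minimal sketch of the save-and-reload step would look like the following. The path "finetuned_gpt2.h5" and the names new_backbone / new_lm are placeholders, not part of the keras_nlp API; the save_weights / load_weights calls are standard Keras model methods.

# After fine-tuning (e.g. with a training loop over gpt2_lm), persist the
# backbone weights to a local path of your choosing (placeholder name).
gpt2_backbone.save_weights("finetuned_gpt2.h5")

# Later, rebuild the architecture without downloading preset weights and
# load your fine-tuned weights from disk instead.
new_backbone = keras_nlp.models.GPT2Backbone.from_preset(
    "gpt2_base_en",
    load_weights=False,
)
new_backbone.load_weights("finetuned_gpt2.h5")
new_lm = keras_nlp.models.GPT2CausalLM(
    backbone=new_backbone,
    preprocessor=gpt2_preprocessor,
)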