chore: temporarily remove embedding/lm_head from remote
RomanBredehoft committed Aug 7, 2024
1 parent f20697d commit 3537f2e
Showing 1 changed file with 2 additions and 1 deletion.

use_case_examples/lora_finetune/gpt2_finetune_hybrid.ipynb (+2 −1)
```diff
@@ -244,7 +244,8 @@
     "    # parameters. Side note: \"lm_head\" does not appear in model.parameters() because the weights\n",
     "    # are directly tied to the embedding ones, but we still need to remove both modules in\n",
     "    # order to get rid of the weights\n",
-    "    if isinstance(module, (Conv1D, Embedding)) or \"lm_head\" in name:\n",
+    "    if isinstance(module, (Conv1D)):\n",
+    "    # if isinstance(module, (Conv1D, Embedding)) or \"lm_head\" in name:\n",
     "        remote_names.append(name)\n",
     "\n",
     "    elif isinstance(module, CustomConv1D):\n",
```
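The hunk above changes which layers the notebook marks for remote execution: after this commit, only `Conv1D` layers are collected, while the embedding and the weight-tied `lm_head` are (temporarily) kept local. A minimal standalone sketch of that selection loop is below. It assumes `Conv1D` is `transformers.pytorch_utils.Conv1D` (the layer type GPT-2 uses for its attention and MLP projections) and uses a tiny untrained GPT-2 config so nothing needs downloading; the notebook's surrounding `CustomConv1D` handling is not reproduced.

```python
# Hypothetical standalone version of the post-commit selection loop.
from transformers import GPT2Config, GPT2LMHeadModel
from transformers.pytorch_utils import Conv1D

# Tiny untrained config: no pretrained weights required.
config = GPT2Config(n_layer=1, n_head=2, n_embd=8, vocab_size=50)
model = GPT2LMHeadModel(config)

remote_names = []
for name, module in model.named_modules():
    # Post-commit rule: only Conv1D layers go remote. The previous rule,
    # `isinstance(module, (Conv1D, Embedding)) or "lm_head" in name`, is
    # commented out in the diff above.
    if isinstance(module, Conv1D):
        remote_names.append(name)

print(remote_names)
```

With one transformer block this yields the four `Conv1D` projections (`attn.c_attn`, `attn.c_proj`, `mlp.c_fc`, `mlp.c_proj`); notably, `lm_head` and the embeddings no longer appear in `remote_names`, which is exactly what the commit message describes.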
