Commit 1f7abed: Update readme
EricLBuehler committed Feb 12, 2024
1 parent b16bb00
Showing 1 changed file with 1 addition and 1 deletion.
src/xlora/xlora_insertion.py (2 changes: 1 addition & 1 deletion)
@@ -230,7 +230,7 @@ def generate(self, *args, **kwargs):
 
     def set_global_scaling_weight(self, weight: float):
         """
-        Set the global LoRA weight, a scalar to multiply the output of each LoRA adapter by. This is reflected in the config.
+        Set the global LoRA weight, a scalar to multiply the output of each LoRA adapter by. This is by default 1. This is reflected in the config.
         """
        classifier: xLoRAClassifier = self.model.internal_xlora_classifier  # type: ignore
        classifier.config.global_scaling_weight = weight
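For context, a minimal usage sketch of the setter this diff documents. This is hedged: `model` stands in for a hypothetical xLoRA-wrapped model object; only `set_global_scaling_weight` and the classifier config attribute come from the code above.

    # Hypothetical usage sketch, assuming `model` is an xLoRA-wrapped model.
    # Scale every LoRA adapter's output by 0.5 instead of the default of 1.
    model.set_global_scaling_weight(0.5)

    # Per the diff, the setter stores the value on the internal classifier's
    # config, i.e. model.internal_xlora_classifier.config.global_scaling_weight.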
