diff --git a/README.md b/README.md
index 9f5d89a..69ae97a 100644
--- a/README.md
+++ b/README.md
@@ -50,7 +50,7 @@ Some notes about the `train` command:
 - The `--data` flag is used to pass your dataset to axolotl. This dataset is then written to the `datasets.path` as specified in your config file. If you already have a dataset at `datasets.path`, you must be careful to also pass the same path to `--data` to ensure the dataset is correctly loaded.
 - Unlike axolotl, you cannot pass additional flags to the `train` command. However, you can specify all your desired options in the config file instead.
-- The `--no-mergre-lora` will prevent the Lora adapter weights from being merged into the base model weights.
+- `--no-merge-lora` will prevent the LoRA adapter weights from being merged into the base model weights.
 - This example training script is opinionated in order to make it easy to get started. For example, a LoRA adapter is used and merged into the base model after training.
@@ -219,4 +219,4 @@ Make sure your `modal` client >= 0.55.4164 (upgrade to the latest version using
 > AttributeError: 'Accelerator' object has no attribute 'deepspeed_config'
-Try removing the `wandb_log_model` option from your config. See [#4143](https://github.com/microsoft/DeepSpeed/issues/4143).
\ No newline at end of file
+Try removing the `wandb_log_model` option from your config. See [#4143](https://github.com/microsoft/DeepSpeed/issues/4143).