
Commit

fix doc in readme (#303)
Co-authored-by: Clémentine Fourrier <[email protected]>
NathanHB and clefourrier authored Sep 16, 2024
1 parent 6b943ec commit 7295c78
Showing 1 changed file with 3 additions and 3 deletions.
README.md (3 additions, 3 deletions)
@@ -207,7 +207,7 @@ lighteval accelerate \
An alternative to launching the evaluation locally is to serve the model on a TGI-compatible server/container and then run the evaluation by sending requests to the server. The command is the same as before, except you specify a path to a yaml config file (detailed below):

```shell
-python run_evals_accelerate.py \
+lighteval accelerate \
    --model_config_path="/path/to/config/file" \
--tasks <task parameters> \
--output_dir output_dir
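
The yaml model config referenced in the hunk above might look roughly like this. The field names below are an assumption for illustration only — they are not taken from this commit, so check the example configs in the lighteval repository for the exact schema:

```yaml
# Hypothetical sketch of a TGI model config — field names are assumed,
# not verified against lighteval's shipped example configs.
model:
  instance:
    # Address of the running TGI-compatible server to send requests to
    inference_server_address: "http://localhost:8080"
    # Auth token for the server, if it requires one (null = no auth)
    inference_server_auth: null
```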
@@ -262,7 +262,7 @@ lighteval accelerate \
### Using the dummy model
To debug or obtain random baseline scores for a given set of tasks, you can use the `dummy` model:
```shell
-python run_evals_accelerate.py \
+lighteval accelerate \
    --model_args "dummy" \
--tasks <task parameters> \
--output_dir output_dir
@@ -279,7 +279,7 @@ However, we are very grateful to the Harness and HELM teams for their continued

## How to navigate this project
`lighteval` is supposed to be used as a standalone evaluation library.
-- To run the evaluations, you can use `run_evals_accelerate.py` or `run_evals_nanotron.py`.
+- To run the evaluations, you can use `lighteval accelerate` or `lighteval nanotron`.
- [src/lighteval](https://github.com/huggingface/lighteval/tree/main/src/lighteval) contains the core of the lib itself
  - [lighteval](https://github.com/huggingface/lighteval/tree/main/src/lighteval) contains the core of the library, divided into the following sections
    - [main_accelerate.py](https://github.com/huggingface/lighteval/blob/main/src/lighteval/main_accelerate.py) and [main_nanotron.py](https://github.com/huggingface/lighteval/blob/main/src/lighteval/main_nanotron.py) are our entry points to run evaluation

