Adjusted Readme
AlexejPenner committed Oct 22, 2024
1 parent e7d66ed commit a27bd7d
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion llm-complete-guide/README.md
@@ -43,7 +43,8 @@ environment and install the dependencies using the following command:
pip install -r requirements.txt
```

-blah blah if it fails FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE pip install flash-attn --no-build-isolation
+Depending on your setup, you may run into issues when running the pip install command with the
+`flash_attn` package. In that case, running `FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE pip install flash-attn --no-build-isolation` could help.

In order to use the default LLM for this query, you'll need an account and an
API key from OpenAI specified as another environment variable:
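The README lines above reference an OpenAI API key supplied via an environment variable. The diff does not show the variable's name, but a minimal sketch, assuming the conventional `OPENAI_API_KEY` name read by the OpenAI client libraries, could look like:

```shell
# Hypothetical setup sketch: export the API key before running the project.
# OPENAI_API_KEY is the conventional variable name; check the project's docs
# for the exact name it expects.
export OPENAI_API_KEY="<your-api-key>"
```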
