Commit

fix
AkshitaB committed Jul 17, 2024
1 parent eca4972 commit 74cc618
Showing 2 changed files with 4 additions and 2 deletions.
2 changes: 0 additions & 2 deletions .github/actions/setup-venv/action.yml
@@ -45,7 +45,6 @@ runs:
pip install 'torch${{ inputs.torch-version }}' --extra-index-url https://download.pytorch.org/whl/cpu
pip install -e .[all]
pip install -e hf_olmo
-        pip install --no-cache-dir triton==2.0.0 https://storage.googleapis.com/ai2-python-wheels/flash_attn/flash_attn-0.2.8%2Bcu118torch2.0.0-cp310-cp310-linux_x86_64.whl
- if: steps.virtualenv-cache.outputs.cache-hit == 'true'
@@ -55,7 +54,6 @@ runs:
. .venv/bin/activate
pip install --no-deps -e .[all]
-        pip install --no-deps --no-cache-dir triton==2.0.0 https://storage.googleapis.com/ai2-python-wheels/flash_attn/flash_attn-0.2.8%2Bcu118torch2.0.0-cp310-cp310-linux_x86_64.whl
pip install --no-deps --no-cache-dir triton==2.0.0 https://storage.googleapis.com/ai2-python-wheels/flash_attn/flash_attn-0.2.8%2Bcu118torch2.0.0-cp310-cp310-linux_x86_64.whl
- shell: bash
run: |
4 changes: 4 additions & 0 deletions .github/workflows/main.yml
@@ -128,6 +128,10 @@ jobs:
run: |
echo "COMMIT_SHA=$GITHUB_SHA" >> $GITHUB_ENV
+      - name: Install flash attn
+        run: |
+          pip install --no-cache-dir triton==2.0.0 https://storage.googleapis.com/ai2-python-wheels/flash_attn/flash_attn-0.2.8%2Bcu118torch2.0.0-cp310-cp310-linux_x86_64.whl
- name: GPU Tests
uses: allenai/[email protected]
if: env.BEAKER_TOKEN != ''
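
For reference, a minimal sketch of how the relocated step reads once it sits inside the GPU-tests job in .github/workflows/main.yml. The job id, runner label, and checkout step below are illustrative assumptions, not taken from the actual workflow; only the "Install flash attn" and "GPU Tests" steps come from the diff above.

jobs:
  gpu_tests:                          # hypothetical job id
    runs-on: ubuntu-latest            # assumed runner label
    env:
      BEAKER_TOKEN: ${{ secrets.BEAKER_TOKEN }}   # assumed secret wiring for the `if:` check below
    steps:
      - uses: actions/checkout@v4     # assumed checkout step
      - name: Install flash attn      # step added by this commit
        run: |
          pip install --no-cache-dir triton==2.0.0 https://storage.googleapis.com/ai2-python-wheels/flash_attn/flash_attn-0.2.8%2Bcu118torch2.0.0-cp310-cp310-linux_x86_64.whl
      - name: GPU Tests
        uses: allenai/[email protected]
        if: env.BEAKER_TOKEN != ''

The net effect of the commit is that the CUDA-specific flash-attn wheel (built against cu118 and torch 2.0.0) is no longer installed by the shared setup-venv composite action, which only installs the CPU build of torch; the wheel is now installed directly in main.yml, immediately before the GPU Tests step.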
