Commit

groq content
sraskar committed Oct 29, 2024
1 parent 751f217 commit 5f3f538
Showing 7 changed files with 370 additions and 0 deletions.
126 changes: 126 additions & 0 deletions aiTestbeds/Groq/README.md
@@ -0,0 +1,126 @@
# Groq

## Connection to Groq

![Groq connection diagram](./groqrack_system_diagram.png)

Log in to the Groq login node from your local machine.
Once you are on the login node, SSH to one of the Groq nodes.

```bash
local > ssh [email protected]
```
```bash
groq-login > ssh groq-r01-gn-01.ai.alcf.anl.gov
# or
groq-login > ssh groq-r01-gn-09.ai.alcf.anl.gov
# or any node with hostname of form groq-r01-gn-0[1-9].ai.alcf.anl.gov
```

## Create Virtual Environment

### Install Miniconda

```bash
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh
```
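After the installer finishes, you may need to start a new shell (or source your shell startup file) so that `conda` is on your `PATH`. A quick check that it is available:

```bash
conda --version
```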

### PyTorch virtual environment

```bash
export PYTHON_VERSION=3.10.12
conda create -n groqflow python=$PYTHON_VERSION
conda activate groqflow
```

### Install Groqflow

```bash
git clone https://github.com/groq/groqflow.git
cd groqflow
pip install --upgrade pip
pip install -e .
pushd .
cd demo_helpers
pip install -e .
popd
```
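A quick sanity check that the GroqFlow install succeeded (this assumes the `groqflow` environment is still active):

```bash
python -c "from groqflow import groqit; print('GroqFlow import OK')"
```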


## Job Queuing and Submission

Groq jobs on the AI Testbed's GroqRack are managed by the PBS job scheduler. The basic commands are listed here; a typical workflow is sketched below.

* `qsub`: submit a batch job using a script
* `qstat`: display queue information
* `qdel`: delete (cancel) a job
* `qhold`: hold a job
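A minimal sketch of that workflow (the script name and job ID are placeholders):

```bash
# Submit a batch job script; PBS prints the assigned job ID
qsub run_minilmv2.sh

# Check the status of your queued and running jobs
qstat -u $USER

# Hold or cancel a job by its ID (12345 is a placeholder)
qhold 12345
qdel 12345
```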

### Schedule a Batch Job

<details>
<summary>Sample run_minilmv2.sh script</summary>

```bash
#!/bin/bash
# >>> conda initialize >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$(${HOME}'/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)"
if [ $? -eq 0 ]; then
eval "$__conda_setup"
else
if [ -f "${HOME}/miniconda3/etc/profile.d/conda.sh" ]; then
. "${HOME}/miniconda3/etc/profile.d/conda.sh"
else
export PATH="${HOME}/miniconda3/bin:$PATH"
fi
fi
unset __conda_setup
# <<< conda initialize <<<
conda activate groqflow
cd ~/groqflow/proof_points/natural_language_processing/minilm
pip install -r requirements.txt
python minilmv2.py

```

</details>

Then run the script as a batch job with PBS:
```bash
qsub run_minilmv2.sh
```
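Once submitted, you can monitor the job and, after it finishes, inspect its output. By default PBS writes the job's stdout and stderr to `<jobname>.o<jobid>` and `<jobname>.e<jobid>` in the submission directory (the job ID below is a placeholder):

```bash
qstat -u $USER
cat run_minilmv2.sh.o12345
```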


### Schedule Interactive Job

The following command requests a single Groq node interactively for 1 hour:
```bash
qsub -I -l walltime=1:00:00
```
Other flags that can be added to the request (see the combined example below):
```bash
-l ncpus=1
-l groq_accelerator=1
```
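For example, a single interactive request combining these flags might look like this (a sketch; the resource names follow the flags listed above):

```bash
qsub -I -l walltime=1:00:00 -l ncpus=1 -l groq_accelerator=1
```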

## Run Examples

Refer to the respective instructions below:

* [GPT2](./GPT2.md)


## Next Steps

* [MiniLM](./minilm.md)
* [ResNet50](./resnet50.md)

* Explore other examples under [Proof Points](https://github.com/groq/groqflow/tree/main/proof_points)

## Useful Resources

* [ALCF Groq Documentation](https://docs.alcf.anl.gov/ai-testbed/groq/system-overview/)
* [Groq Documentation](https://support.groq.com/#/login)
* [Groq Examples Repository](https://github.com/groq/groqflow/tree/main/proof_points)
43 changes: 43 additions & 0 deletions aiTestbeds/Groq/gpt2.md
@@ -0,0 +1,43 @@
# GPT2 On Groq

GPT-2 is a transformer model pretrained on a very large corpus of English data in a self-supervised fashion. It was pretrained on raw text only, with no human labelling (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.

Inputs are sequences of continuous text of a certain length, and the targets are the same sequences shifted one token (a word or piece of a word) to the right. The model internally uses a masking mechanism to ensure that the prediction for token *i* uses only the inputs from 1 to *i*, never the future tokens.

In this way the model learns an inner representation of the English language that can be used to extract features for downstream tasks. The model is, however, best at what it was pretrained for: generating text from a prompt. A minimal sketch of this next-token objective follows.
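The sketch below illustrates the causal, next-token prediction described above using the Hugging Face `transformers` API. It is CPU-only and not Groq-specific (it downloads the pretrained `gpt2` weights) and is meant only to make the objective concrete:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Groq accelerators are designed for"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits  # shape: (batch, seq_len, vocab_size)

# Because of causal masking, the logits at position i depend only on tokens 1..i,
# so the last position scores candidates for the *next* token of the prompt.
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode([next_token_id]))
```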

#### Get a Groq node interactively

```bash
qsub -I -l walltime=1:00:00
```

#### Go to the directory with the GPT2 example
```bash
git clone [email protected]:argonne-lcf/ALCF_Hands_on_HPC_Workshop.git
cd ALCF_Hands_on_HPC_Workshop/aiTestbeds/Groq
```

#### Activate the groqflow virtual environment
```bash
conda activate groqflow
```

#### Install Requirements

Install the Python dependencies using the following command:
```bash
pip install transformers
```

#### Run Inference Job

```bash
python gpt2.py
```

<!-- #### Run end-to-end Inference Job with WikiText dataset
```bash
python GPT2-wiki.py
``` -->
22 changes: 22 additions & 0 deletions aiTestbeds/Groq/gpt2.py
@@ -0,0 +1,22 @@
import torch
import transformers

try:
    from groqflow import groqit
except ImportError as exc:
    raise ImportError("GroqFlow module not found! Activate the groqflow environment first.") from exc

# Instantiate the model from the transformers library with its default config
model = transformers.GPT2Model(transformers.GPT2Config())

# Create dummy inputs with static dimensions and the expected data types
inputs = {
    "input_ids": torch.ones(1, 256, dtype=torch.long),
    "attention_mask": torch.ones(1, 256, dtype=torch.float),
}

# Compile the model for Groq with groqit (rebuild="never" avoids rebuilding when a build already exists)
gmodel = groqit(model, inputs, rebuild="never")

# Run inference on Groq with the GroqIt runtime
groq_output = gmodel(**inputs)

# Print the raw outputs; in a real application these would be decoded with a tokenizer
print(groq_output)
Binary file added aiTestbeds/Groq/groqrack_system_diagram.png
