diff --git a/aiTestbeds/Groq/README.md b/aiTestbeds/Groq/README.md
new file mode 100644
index 0000000..dd3d51c
--- /dev/null
+++ b/aiTestbeds/Groq/README.md
@@ -0,0 +1,126 @@
+# Groq

+## Connection to Groq

+![Groq connection diagram](./groqrack_system_diagram.png)

+Log in to the Groq login node from your local machine.
+Once you are on the login node, ssh to one of the Groq nodes.

+```bash
+local > ssh ALCFUserID@groq.ai.alcf.anl.gov
+```
+```bash
+groq-login > ssh groq-r01-gn-01.ai.alcf.anl.gov
+# or
+groq-login > ssh groq-r01-gn-09.ai.alcf.anl.gov
+# or any node with a hostname of the form groq-r01-gn-0[1-9].ai.alcf.anl.gov
+```

+## Create Virtual Environment

+### Install Miniconda

+```bash
+wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
+bash Miniconda3-latest-Linux-x86_64.sh
+```

+### PyTorch virtual environment

+```bash
+export PYTHON_VERSION=3.10.12
+conda create -n groqflow python=$PYTHON_VERSION
+conda activate groqflow
+```

+### Install GroqFlow

+```bash
+git clone https://github.com/groq/groqflow.git
+cd groqflow
+pip install --upgrade pip
+pip install -e .
+pushd .
+cd demo_helpers
+pip install -e .
+popd
+```


+## Job Queuing and Submission

+Groq jobs in the AI Testbed's GroqRack are managed by the PBS job scheduler.

+* `qsub`: submit a batch job using a script
+* `qstat`: display queue information
+* `qdel`: delete (cancel) a job
+* `qhold`: hold a job

+### Schedule Batch Job

+ Sample run_minilmv2.sh script + + ```bash + #!/bin/bash + # >>> conda initialize >>> + # !! Contents within this block are managed by 'conda init' !! + __conda_setup="$(${HOME}'/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)" + if [ $? -eq 0 ]; then + eval "$__conda_setup" + else + if [ -f "${HOME}/miniconda3/etc/profile.d/conda.sh" ]; then + . "${HOME}/miniconda3/etc/profile.d/conda.sh" + else + export PATH="${HOME}/miniconda3/bin:$PATH" + fi + fi + unset __conda_setup + # <<< conda initialize <<< + conda activate groqflow + cd ~/groqflow/proof_points/natural_language_processing/minilm + pip install -r requirements.txt + python minilmv2.py + + ``` + +
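
The sample script above requests no resources explicitly; the `qsub` flags shown in this README (`walltime`, `ncpus`, `groq_accelerator`) can also be embedded in the script as `#PBS` directives. A minimal sketch of that variant; the exact resource names accepted by this PBS installation are an assumption here:

```bash
#!/bin/bash
#PBS -l walltime=1:00:00
#PBS -l ncpus=1
#PBS -l groq_accelerator=1

# Conda must be initialized in the batch shell first,
# as in the conda-init block of the sample script above.
conda activate groqflow

cd ~/groqflow/proof_points/natural_language_processing/minilm
python minilmv2.py
```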
+
+Then run the script as a batch job with PBS:
+```bash
+qsub run_minilmv2.sh
+```


+### Schedule Interactive Job

+The following command allocates a single Groq node interactively for 1 hour:
+```bash
+qsub -I -l walltime=1:00:00
+```
+Other flags that can be used:
+```bash
+-l ncpus=1
+-l groq_accelerator=1
+```

+## Run Examples

+Refer to the respective instructions below:

+* [GPT2](./gpt2.md)


+## Next Steps

+* [MiniLM](./minilm.md)
+* [ResNet50](./resnet50.md)

+* Explore other examples under [Proof Points](https://github.com/groq/groqflow/tree/main/proof_points)

+## Useful Resources

+* [ALCF Groq Documentation](https://docs.alcf.anl.gov/ai-testbed/groq/system-overview/)
+* [Groq Documentation](https://support.groq.com/#/login)
+* [Groq Examples Repository](https://github.com/groq/groqflow/tree/main/proof_points)
diff --git a/aiTestbeds/Groq/gpt2.md b/aiTestbeds/Groq/gpt2.md
new file mode 100644
index 0000000..bd684ac
--- /dev/null
+++ b/aiTestbeds/Groq/gpt2.md
@@ -0,0 +1,43 @@
+# GPT2 On Groq

+GPT-2 is a transformer model pretrained on a very large corpus of English data in a self-supervised fashion. It was pretrained on raw text only, with no human labelling (which is why it can use so much publicly available data), using an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.

+Concretely, the inputs are sequences of continuous text of a certain length, and the targets are the same sequences shifted one token (a word or piece of a word) to the right. The model internally uses a masking mechanism to ensure that the prediction for token i uses only the inputs from 1 to i, and not the future tokens.

+This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks. The model is best at what it was pretrained for, however, which is generating text from a prompt.
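
The shifted-target setup described above can be sketched in a few lines of plain Python (an illustration only; the token strings are made up and no model is involved):

```python
# Toy "tokenized" text: causal language modeling pairs each position
# with the next token as its prediction target.
tokens = ["The", "quick", "brown", "fox", "jumps"]

inputs = tokens[:-1]   # what the model sees
targets = tokens[1:]   # the same sequence shifted one token to the right

for i, target in enumerate(targets):
    # The causal mask hides positions after i, so predicting `target`
    # only uses inputs[0..i].
    visible = inputs[: i + 1]
    print(f"{visible} -> {target!r}")
```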
+
+#### Get a Groq node interactively

+```bash
+qsub -I -l walltime=1:00:00
+```

+#### Go to the directory with the GPT2 example
+```bash
+git clone git@github.com:argonne-lcf/ALCF_Hands_on_HPC_Workshop.git
+cd ALCF_Hands_on_HPC_Workshop/aiTestbeds/Groq
+```

+#### Activate the groqflow virtual environment
+```bash
+conda activate groqflow
+```

+#### Install Requirements

+Install the Python dependencies with the following command:
+```bash
+pip install transformers
+```

+#### Run Inference Job

+```bash
+python gpt2.py
+```

+
\ No newline at end of file
diff --git a/aiTestbeds/Groq/gpt2.py b/aiTestbeds/Groq/gpt2.py
new file mode 100644
index 0000000..cc2f84d
--- /dev/null
+++ b/aiTestbeds/Groq/gpt2.py
@@ -0,0 +1,22 @@
+import torch
+import transformers
+
+try:
+    from groqflow import groqit
+except ImportError as e:
+    raise ImportError("GroqFlow module not found!") from e
+
+# Instantiate the model from the transformers library with the corresponding config
+model = transformers.GPT2Model(transformers.GPT2Config())
+
+# Create dummy inputs with static dimensions and the expected data types
+inputs = {
+    "input_ids": torch.ones(1, 256, dtype=torch.long),
+    "attention_mask": torch.ones(1, 256, dtype=torch.float),
+}
+
+# Compile the model with groqit; rebuild="never" reuses a cached build when one exists
+gmodel = groqit(model, inputs, rebuild="never")
+groq_output = gmodel(**inputs)  # Run inference on Groq with the GroqFlow runtime
+
+print(groq_output)  # Raw outputs (hidden states); a real application would decode generated text with a language-modeling head and a tokenizer
diff --git a/aiTestbeds/Groq/groqrack_system_diagram.png b/aiTestbeds/Groq/groqrack_system_diagram.png
new file mode 100644
index 0000000..368828a
Binary files /dev/null and b/aiTestbeds/Groq/groqrack_system_diagram.png differ
diff --git a/aiTestbeds/Groq/minilm.md b/aiTestbeds/Groq/minilm.md
new file mode 100644
index 0000000..7248ec5
--- /dev/null
+++ b/aiTestbeds/Groq/minilm.md
@@ -0,0 +1,89 @@
+# MiniLM v2 On Groq

+MiniLM v2 is a distilled model that employs a generalization of the deep 
self-attention distillation method that its authors introduced in the original MiniLM paper.

+#### Get a Groq node interactively

+```bash
+qsub -I -l walltime=1:00:00
+```

+#### Go to the directory with the MiniLM v2 example
+```bash
+cd ~/groqflow/proof_points/natural_language_processing/minilm
+```

+#### Activate the groqflow virtual environment
+```bash
+conda activate groqflow
+```

+#### Install Requirements

+Install the Python dependencies from the requirements.txt file included with this proof point:
+```bash
+pip install -r requirements.txt
+```

+#### Run Inference Job

+```bash
+python minilmv2.py
+```
+
+ Sample Output + + ```bash + $ python minilmv2.py + Downloading (…)okenizer_config.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 350/350 [00:00<00:00, 1.64MB/s] + Downloading (…)solve/main/vocab.txt: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 232k/232k [00:00<00:00, 1.72MB/s] + Downloading (…)/main/tokenizer.json: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 466k/466k [00:00<00:00, 26.5MB/s] + Downloading (…)cial_tokens_map.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 112/112 [00:00<00:00, 502kB/s] + Downloading (…)lve/main/config.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 612/612 [00:00<00:00, 7.19MB/s] + Downloading pytorch_model.bin: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 90.9M/90.9M [00:00<00:00, 177MB/s] + + + + Building "minilmv2" + ✓ Exporting PyTorch to ONNX + ✓ Optimizing ONNX file + ✓ Checking for Op support + ✓ Converting to FP16 + ✓ Compiling model + ✓ Assembling model + + Woohoo! Saved to ~/.cache/groqflow/minilmv2 + Preprocessing data. 
+ Downloading builder script: 100%|██████████████████████████| 7.43k/7.43k [00:00<00:00, 29.5MB/s]
+ Downloading metadata: 100%|██████████████████████████| 19.0k/19.0k [00:00<00:00, 70.8MB/s]
+ Downloading readme: 100%|██████████████████████████| 9.98k/9.98k [00:00<00:00, 63.3MB/s]
+ ...
+ ```
\ No newline at end of file
diff --git a/aiTestbeds/Groq/resnet50.md b/aiTestbeds/Groq/resnet50.md
new file mode 100644
index 0000000..a21f866
--- /dev/null
+++ b/aiTestbeds/Groq/resnet50.md
@@ -0,0 +1,71 @@
+# ResNet50 On Groq

+ResNet50 is a Convolutional Neural Network (CNN) model used for image classification. Kaiming He et al. first introduced ResNet models and the revolutionary residual connection (also known as a skip connection) in their 2015 paper, Deep Residual Learning for Image Recognition.

+#### Get a Groq node interactively

+```bash
+qsub -I -l walltime=1:00:00
+```

+#### Go to the directory with the ResNet50 example
+```bash
+cd ~/groqflow/proof_points/computer_vision/resnet50
+```

+#### Activate the groqflow virtual environment
+```bash
+conda activate groqflow
+```

+#### Install Requirements

+Install the Python dependencies from the requirements.txt file included with this proof point:
+```bash
+pip install -r requirements.txt
+```

+#### Run Inference Job

+```bash
+python resnet50.py
+```
+
+ Sample Output + + ```bash + $ python resnet50.py + Downloading: "https://github.com/pytorch/vision/zipball/v0.10.0" to /home/sraskar/.cache/torch/hub/v0.10.0.zip + Downloading: "https://download.pytorch.org/models/resnet50-0676ba61.pth" to /home/sraskar/.cache/torch/hub/checkpoints/resnet50-0676ba61.pth + 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 97.8M/97.8M [00:00<00:00, 254MB/s] + + + + Building "resnet50" + ✓ Exporting PyTorch to ONNX + ✓ Optimizing ONNX file + ✓ Checking for Op support + ✓ Converting to FP16 + ✓ Compiling model + ✓ Assembling model + + Woohoo! Saved to ~/.cache/groqflow/resnet50 + Preprocessing data. + Downloading: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 342M/342M [00:10<00:00, 31.1MB/s] + 100% [..........................................................................] 2568145 / 2568145 + Info: No inputs received for benchmark. Using the inputs provided during model compilation. + /projects/datascience/sraskar/groq/groqflow/groqflow/groqmodel/execute.py:87: DeprecationWarning: `product` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `prod` instead. + return tsp_runner(**example) + Running inference on GroqChip. + /projects/datascience/sraskar/groq/groqflow/groqflow/groqmodel/execute.py:87: DeprecationWarning: `product` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `prod` instead. + return tsp_runner(**example) + Running inference using PyTorch model (CPU). 
+ 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3925/3925 [06:43<00:00, 9.73it/s] + +--------+----------+-------------------------+----------------+----------------------+-------------+ + | Source | Accuracy | end-to-end latency (ms) | end-to-end IPS | on-chip latency (ms) | on-chip IPS | + +--------+----------+-------------------------+----------------+----------------------+-------------+ + | cpu | 84.54% | 102.76 | 9.73 | -- | -- | + | groq | 84.51% | 0.40 | 2515.15 | 0.33 | 2985.40 | + +--------+----------+-------------------------+----------------+----------------------+-------------+ + ``` +
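
As a quick sanity check, the throughput (IPS) columns in the sample output above are consistent with the reciprocal of the corresponding latency columns (back-of-the-envelope arithmetic only; the 10% tolerance is an arbitrary choice):

```python
# Check that images/second ~= 1000 / latency_ms for the numbers in the table.
results = {
    "cpu end-to-end": (102.76, 9.73),     # (latency in ms, images per second)
    "groq end-to-end": (0.40, 2515.15),
    "groq on-chip": (0.33, 2985.40),
}

for name, (latency_ms, ips) in results.items():
    implied_ips = 1000.0 / latency_ms
    # The printed values are rounded, so allow a loose relative tolerance.
    rel_err = abs(implied_ips - ips) / ips
    print(f"{name}: implied {implied_ips:.1f} IPS vs reported {ips} IPS")
    assert rel_err < 0.1
```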
\ No newline at end of file diff --git a/aiTestbeds/Groq/run_gpt2.sh b/aiTestbeds/Groq/run_gpt2.sh new file mode 100644 index 0000000..16c4e31 --- /dev/null +++ b/aiTestbeds/Groq/run_gpt2.sh @@ -0,0 +1,19 @@ +#!/bin/bash +# >>> conda initialize >>> +# !! Contents within this block are managed by 'conda init' !! +__conda_setup="$(${HOME}'/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)" +if [ $? -eq 0 ]; then + eval "$__conda_setup" +else + if [ -f "${HOME}/miniconda3/etc/profile.d/conda.sh" ]; then + . "${HOME}/miniconda3/etc/profile.d/conda.sh" + else + export PATH="${HOME}/miniconda3/bin:$PATH" + fi +fi +unset __conda_setup +# <<< conda initialize <<< +conda activate groqflow + + +python gpt2.py \ No newline at end of file
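
Note that `run_gpt2.sh` activates the environment but runs `python gpt2.py` without changing directory, and PBS batch jobs do not necessarily start in the directory they were submitted from. One way to submit it (a sketch; the clone path assumes the checkout step from gpt2.md, and `$PBS_O_WORKDIR` is the standard PBS variable holding the submission directory, which the script could `cd` into first):

```bash
cd ALCF_Hands_on_HPC_Workshop/aiTestbeds/Groq   # directory containing gpt2.py
qsub run_gpt2.sh
qstat                                           # check job status
```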