
Commit

reqs update
alibillalhammoud committed Aug 13, 2024
1 parent 55171b7 commit 5142004
Showing 4 changed files with 62 additions and 41 deletions.
11 changes: 7 additions & 4 deletions openfasoc/MLoptimization/README.md
@@ -1,8 +1,11 @@
# Machine Learning Optimization
Code for a reinforcement learning loop with OpenFASoC generators for optimizing metrics.

## Supported Versions
Please note that this program has only been tested with python3.11.

## Quick Start
run `bash quickstart.bash` to get an example RL run optimizing opamps at room temperature.
run `bash quickstart.bash` to get an example RL run optimizing opamps.

## Code Setup
The code is set up as follows:
@@ -18,13 +21,13 @@ Make sure that you have OpenAI Gym and Ray installed. To do this, run the following

To generate the design specifications that the agent trains on, run:
```
python3.10 gen_specs.py
python3.11 gen_specs.py
```
The result is a yaml file dumped to the ../generators/gdsfactory-gen/ directory.
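
For a quick look at what the agent will train on, the generated file can be loaded with a few lines of Python. This is only an illustrative sketch: the file name and the structure of its contents are assumptions, not taken from gen_specs.py.

```
# Illustrative sketch: inspect the specs yaml written by gen_specs.py.
# The file name and layout below are assumed; check gen_specs.py for the real ones.
import yaml  # assumes pyyaml is installed

with open("../generators/gdsfactory-gen/train_specs.yaml") as f:  # assumed file name
    specs = yaml.safe_load(f)

print(type(specs))  # the README only says "a yaml file"; structure depends on gen_specs.py
print(specs)
```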

To train the agent, open ipython from the top level directory and then:
```
python3.10 model.py
python3.11 model.py
```
The training checkpoints will be saved in your home directory under `ray_results`. TensorBoard can be used to load reward and loss plots using the command:

@@ -36,7 +39,7 @@ tensorboard --logdir path/to/checkpoint
The evaluation script takes the trained agent and gives it new specs that the agent has never seen before. To generate new design specs, run the gen_specs.py file again with your desired number of specs to validate on. To run validation:

```
python3.10 eval.py
python3.11 eval.py
```

The evaluation result will be saved to the ../generators/gdsfactory-gen/ directory.
36 changes: 18 additions & 18 deletions openfasoc/MLoptimization/quickstart.bash
@@ -1,7 +1,7 @@
#!/bin/bash

# this script will recreate the ICCAD paper RL results (using the default seed)
echo "This script has been verified to run with python3.10 and all package versions provided"
echo "This script has been verified to run with python3.11 and package versions provided"



@@ -10,28 +10,28 @@ echo "This script has been verified to run with python3.10 and all package versions provided"
# find most recent version of python
#
# Find all installed Python 3 versions and sort them in descending order
PYTHON_VERSIONS=$(compgen -c | grep -E '^python3\.[0-9]+$' | sort -V | tail -n 1)
#PYTHON_VERSIONS=$(compgen -c | grep -E '^python3\.[0-9]+$' | sort -V | tail -n 1)
# Extract the most recent version
MOST_RECENT_PYTHON=$(echo "$PYTHON_VERSIONS" | tail -n 1)
#MOST_RECENT_PYTHON=$(echo "$PYTHON_VERSIONS" | tail -n 1)
# Check if a Python version was found
if [ -z "$MOST_RECENT_PYTHON" ]; then
echo "No Python 3 versions found."
exit 1
fi
#if [ -z "$MOST_RECENT_PYTHON" ]; then
# echo "No Python 3 versions found."
# exit 1
#fi
# Print the most recent Python version
echo
echo "Currently using Python version: $MOST_RECENT_PYTHON"
echo
#echo
#echo "Currently using Python version: $MOST_RECENT_PYTHON"
#echo
# Check if the most recent version is at least 3.10
MINIMUM_VERSION="3.10"
if [[ "$(echo $MOST_RECENT_PYTHON | cut -d '.' -f2)" -lt "$(echo $MINIMUM_VERSION | cut -d '.' -f2)" ]]; then
echo "The most recent Python version ($MOST_RECENT_PYTHON) is less than $MINIMUM_VERSION. Please update your Python installation."
echo
exit 1
fi
#MINIMUM_VERSION="3.10"
#if [[ "$(echo $MOST_RECENT_PYTHON | cut -d '.' -f2)" -lt "$(echo $MINIMUM_VERSION | cut -d '.' -f2)" ]]; then
# echo "The most recent Python version ($MOST_RECENT_PYTHON) is less than $MINIMUM_VERSION. Please update your Python installation."
# echo
# exit 1
#fi
# Save the command to run the most recent Python version into a variable
PY_RUN=$MOST_RECENT_PYTHON

#PY_RUN=$MOST_RECENT_PYTHON
PY_RUN="python3.11"



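The quickstart script now pins the interpreter to python3.11 (the commented-out block above used to auto-detect the newest installed Python 3). A minimal, illustrative guard that any of the Python entry points could use to fail fast on a different version; this snippet is not part of the repository:

```
# Illustrative only: fail fast unless the interpreter is the 3.11 series
# that quickstart.bash now pins via PY_RUN="python3.11".
import sys

if sys.version_info[:2] != (3, 11):
    raise SystemExit(f"python3.11 is required, but this is {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])
```
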
28 changes: 12 additions & 16 deletions openfasoc/MLoptimization/requirements.txt
@@ -3,27 +3,11 @@ prettyprint
prettyprinttree
nltk
torch
transformers
langchain
langchain_community
chromadb
ollama
klayout
unstructured
unstructured[md]
sentence-transformers
peft
accelerate
bitsandbytes
safetensors
requests
datasets
optimum
trl
langchain_huggingface
tensorboard
ray==2.7.1
ray[default]
gym==0.26.2
gymnasium==0.28.1
scikit-learn
@@ -33,3 +17,15 @@ seaborn
matplotlib
lz4
async-timeout
dm_tree
pyarrow
aiohttp>=3.7
aiohttp_cors
colorful
py-spy>=0.2.0
grpcio>=1.42.0
opencensus
virtualenv>=20.0.24, !=20.21.1
memray
smart_open
prometheus_client >= 0.7.1
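
The entries added above appear to be the runtime dependencies pulled in by `ray[default]` (dashboard, metrics, and serialization support). A quick, illustrative import check for the pinned RL stack, assuming the requirements file has been installed:

```
# Illustrative sanity check that the pinned versions import and Ray starts locally.
import gym
import gymnasium
import ray

print("ray", ray.__version__)              # expected 2.7.1 per requirements.txt
print("gym", gym.__version__)              # expected 0.26.2
print("gymnasium", gymnasium.__version__)  # expected 0.28.1

ray.init(num_cpus=1, include_dashboard=False)  # dashboard extras come from ray[default]
ray.shutdown()
```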
@@ -1,6 +1,6 @@
import sys
from os import path, rename, environ

environ['OPENBLAS_NUM_THREADS'] = '1'
# path to glayout
sys.path.append(path.join(path.dirname(__file__), '../../'))

@@ -41,6 +41,7 @@
import pickle
import tempfile
import subprocess
import traceback

global _TAPEOUT_AND_RL_DIR_PATH_
global _GET_PARAM_SET_LENGTH_
@@ -54,7 +55,7 @@
_TAKE_OUTPUT_AT_SECOND_STAGE_ = True

if 'PDK_ROOT' in environ:
PDK_ROOT = Path(environ['PDK_ROOT']).resolve()
PDK_ROOT = str(Path(environ['PDK_ROOT']).resolve())
else:
PDK_ROOT = "/usr/bin/miniconda3/share/pdk/"
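
The change above converts the resolved PDK_ROOT Path back into a plain string. The exact downstream reason is not shown in this hunk, but the usual culprits are string concatenation and os.environ, both of which reject Path objects; a small self-contained illustration:

```
# Illustrative: why a str PDK_ROOT is safer than a pathlib.Path here.
from os import environ
from pathlib import Path

environ.setdefault("PDK_ROOT", "/usr/bin/miniconda3/share/pdk/")  # same fallback as above

pdk_root = str(Path(environ["PDK_ROOT"]).resolve())  # str(...), as in this commit
print(pdk_root + "/sky130A")        # fine: str + str

pdk_path = Path(environ["PDK_ROOT"]).resolve()
# pdk_path + "/sky130A"             # TypeError: Path does not support '+'
# environ["PDK_ROOT"] = pdk_path    # TypeError: os.environ values must be str
environ["PDK_ROOT"] = pdk_root      # works with the plain string
```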

@@ -1169,7 +1170,28 @@ def execute(self):

# same as calling single_build_and_simulation, but runs in a subprocess
def safe_single_build_and_simulation(*args, **kwargs) -> dict:
    return safe_single_build_and_simulation_helperclass(*args,**kwargs).results
    def get_parameter_value(param_name: str, *args, **kwargs):
        # Check if the parameter is in kwargs
        if param_name in kwargs:
            return kwargs[param_name]
        # Check if the parameter is in args
        try:
            # Find the index of the param_name in args and return the next item as its value
            index = args.index(param_name)
            return args[index + 1]
        except (ValueError, IndexError):
            # ValueError if param_name is not in args
            # IndexError if param_name is the last item and has no value after it
            return None
    try:
        return safe_single_build_and_simulation_helperclass(*args,**kwargs).results
    except Exception as e_LorA:
        if bool(get_parameter_value("hardfail",*args,**kwargs)):
            raise e_LorA
        results = opamp_results_serializer()
        with open('get_training_data_ERRORS.log', 'a') as errlog:
            errlog.write("\nopamp run "+str(get_parameter_value("index",*args,**kwargs))+" with the following params failed: \n"+str(get_parameter_value("params",*args,**kwargs)))
        return results
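
The new wrapper keeps long training and data-collection sweeps alive: unless a truthy `hardfail` argument is supplied, a failed build-and-simulation run is logged to get_training_data_ERRORS.log together with its `index` and `params`, and a default `opamp_results_serializer()` is returned instead of re-raising. A self-contained sketch of the same fallback pattern (the names below are illustrative, not the repository's):

```
import traceback

def risky_simulation(params):
    """Stand-in for a real build-and-simulate call that may throw."""
    raise RuntimeError("simulation failed")

def safe_run(params, index=0, hardfail=False):
    try:
        return risky_simulation(params)
    except Exception:
        if hardfail:
            raise  # surface the error when debugging a single run
        with open("errors.log", "a") as errlog:
            errlog.write(f"\nrun {index} failed with params {params}\n")
            errlog.write(traceback.format_exc())
        return None  # placeholder result so the sweep keeps going

print(safe_run([1.0, 2.0], index=3))  # logs the failure and returns None
```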



