 Installation

The following instructions will guide you through setting up the project and running a workflow.

Prerequisites

We use Poetry for dependency management, so you must install it before running any part of the Python package.

pip install poetry

Note for Linux users

If the pip installation does not work on your Linux machine, install Poetry through your distribution's package manager instead:

apt install python3-poetry

Alternatively, install Poetry with the official installer and update it:

curl -sSL https://install.python-poetry.org | python3 -
poetry self-update
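
In either case, you can verify that Poetry is available on your PATH with:

poetry --version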

Then, install the dependencies (run this in the root directory of the project) with:

poetry install

Run any executable in the project's environment with the following command:

poetry run <executable> <args>
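
For example, to print the framework's command-line help (the caribou CLI is covered in more detail below):

poetry run caribou --help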

Why do we use poetry?

Poetry is a tool for dependency management and packaging in Python. It allows you to declare the libraries your project depends on, and it will manage (install/update) them for you.

For more information, see the poetry documentation.

Install dependencies

To install the dependencies, run the following command:

poetry install

This will install all the dependencies required to run the framework. To list the installed dependencies, you can run:

poetry show

To open a shell with the dependencies installed, you can run:

poetry shell

If you opened a shell, you can run all the caribou commands without the poetry run prefix.

caribou --help

Ensure the poetry environment is running Python 3.12+

Poetry will ensure that you are using Python 3.12 or higher based on the configuration in the pyproject.toml file. If you do not have the correct Python version, Poetry will notify you and you will need to install Python 3.12 or higher.

To verify your Python version manually, you can use:

python --version

If your version is lower than 3.12, follow the instructions for your operating system to install Python 3.12 or higher.
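
If several Python versions are installed, you can also check which interpreter Poetry's environment uses and, if necessary, point it at a 3.12+ interpreter explicitly. The interpreter name python3.12 below is only an example and depends on how Python is installed on your system:

poetry run python --version
poetry env use python3.12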

AWS Account Access

To run the framework, you need an AWS account and the necessary permissions to create and manage the required resources. In IAM Policies we list the required permissions for any user wanting to interact with a deployed framework.

The fastest way to set up the necessary permissions is to create a new AWS user under your account with the required permissions, then configure the AWS CLI with that user's access key and secret key to interact with the framework.
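
One way to do this is with the standard aws configure command, which prompts for the access key, secret key, default region, and output format:

aws configure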

Setup AWS Environment

First of all, make sure you have the AWS CLI installed. To set up the tables in AWS that the framework requires, run:

poetry run caribou setup_tables

Note: The bucket that Caribou uses to store the resources (a feature for future provider compatibility) needs to be created manually. Since AWS bucket names must be globally unique, the currently configured bucket might already exist and be used by another version of the framework deployed somewhere else. In this case, adapt the bucket name in the DEPLOYMENT_RESOURCES_BUCKET variable in the caribou/common/constants.py file.
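
For illustration, assuming you choose your own globally unique bucket name (the placeholder below is not the framework's configured name), the bucket can be created with the AWS CLI:

aws s3 mb s3://<your-unique-bucket-name>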

Docker

The Deployment Utility has an additional dependency on Docker. To install it, follow the instructions on the Docker website, and ensure the Docker daemon is running before using the Deployment Utility.

To verify that Docker is installed correctly, you can try running:

docker --version
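
To confirm that the Docker daemon itself is reachable (not just the client), you can also run:

docker info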

Google Maps API Key

For the server-side component, more specifically the data collectors, we use the Google Maps Geocoding API to resolve the locations of the data centers. To use this API, you need a Google Maps API key. You can get one by following the instructions on the Google Maps Platform website.
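
How the key is supplied to the data collectors depends on your configuration; as an illustration only, you might export it as an environment variable (the variable name below is a hypothetical placeholder, not necessarily what the framework reads):

export GOOGLE_MAPS_API_KEY="<your-google-maps-api-key>"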

Electricity Map API Key

For the server-side component, more specifically the data collectors, we use the Electricity Map API to get the carbon intensity of the electricity in the regions. To use this API, you need an Electricity Map API key. You can get one by following the instructions on the Electricity Map website.
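
As with the Google Maps key, the exact configuration depends on your setup; the variable name below is a hypothetical placeholder:

export ELECTRICITY_MAP_API_KEY="<your-electricity-map-api-key>"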

Other dependencies

Since the AWS Lambda environment restricts us from using Docker, we have to migrate the workflows using crane. If you plan on running the framework locally instead of deploying it to the cloud, please install crane as described in the crane documentation.
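
One common way to install crane, per its documentation, is through Go (this assumes a working Go toolchain); you can then confirm the installation:

go install github.com/google/go-containerregistry/cmd/crane@latest
crane version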