eidos is an API for validating and executing AI functions. It aims to provide a generic, common interface through which LLMs can execute functions.
- Installation
From source:
git clone [email protected]:KhaosResearch/eidos.git
cd eidos
python -m pip install -e .
Or directly from GitHub:
python -m pip install "eidos @ git+ssh://[email protected]/KhaosResearch/eidos.git"
- Development
Run the API with the following command:
uvicorn eidos.api:app --host 0.0.0.0 --port 8090 --reload
You can override the default configuration by setting environment variables.
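Once the server is running, you can check that it is reachable. The snippet below is a minimal sketch that assumes the host and port from the uvicorn command above, and that eidos.api:app is a standard FastAPI application serving its interactive docs at /docs (an assumption; adjust the URL if your setup differs):

```python
import urllib.request

# Base URL matches the uvicorn command above (port 8090 on the local machine).
BASE_URL = "http://localhost:8090"

# Assumption: the app is a FastAPI application, so /docs serves the interactive API docs.
with urllib.request.urlopen(f"{BASE_URL}/docs") as response:
    print(response.status)  # expect 200 if the server is up
```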
- Docker
Alternatively, you can use the provided Dockerfile to build a Docker image and run the API in a container:
docker build -t eidos-server:latest .
docker run --rm -v $(pwd)/functions:/functions -p 8090:80 eidos-server:latest
Example:
curl -X POST -H "Content-Type: application/json" -d '{"who": "me"}' http://localhost:8090/api/v1/execution/salute
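The same call can be made from Python. This is a minimal sketch using only the standard library, assuming the server is reachable on localhost:8090 as in the commands above:

```python
import json
import urllib.request

# POST {"who": "me"} to the salute execution endpoint (same call as the curl example above).
url = "http://localhost:8090/api/v1/execution/salute"
payload = json.dumps({"who": "me"}).encode("utf-8")
request = urllib.request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```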
- Kubernetes
To deploy the container in Kubernetes, a reference deployment is available and documented in the manifests directory.
- Serverless in AWS
Another Docker image, for serverless deployment in AWS Lambda, is provided in Dockerfile.lambda. It is based on the official AWS Lambda Python 3.11 image. To extend this image, follow the same process as for the main image.
docker build -t eidos-lambda -f Dockerfile.lambda .
Run the container locally with the following command, or deploy it to AWS Lambda as a container image:
docker run --rm -p 8091:8080 eidos-lambda
Invoke the function locally for testing with a sample query:
curl -XPOST "http://localhost:8091/2015-03-31/functions/function/invocations" -d '{"command": "EXECUTE", "parameters": {"function": "salute", "args": {"who": "me, I am executing serverless"}}}'
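Once the image is deployed as a Lambda function, it can be invoked through the AWS SDK. The sketch below uses boto3 with the same payload as the local invocation above; the function name eidos-lambda is an assumption, so replace it with the name you gave the function in AWS:

```python
import json

import boto3

# Assumption: the Lambda function was created under the name "eidos-lambda".
client = boto3.client("lambda")

# Same payload as the local invocation example above.
payload = {
    "command": "EXECUTE",
    "parameters": {
        "function": "salute",
        "args": {"who": "me, I am executing serverless"},
    },
}

response = client.invoke(
    FunctionName="eidos-lambda",
    Payload=json.dumps(payload).encode("utf-8"),
)

# The response payload is a stream; decode it to inspect the function result.
print(json.loads(response["Payload"].read()))
```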
- Testing
pytest is used for testing. You can run the tests with the following command:
pytest tests/