pip install ollama-haystack
ollama-haystack is distributed under the terms of the Apache-2.0 license.
To run tests, first start a Docker container running Ollama and pull a model for integration testing. It's recommended to use the smallest model possible for testing purposes; see https://ollama.ai/library for a list of models that Ollama supports:
docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
docker exec ollama ollama pull <your model here>
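Before running the test suite, it can help to confirm that the Ollama container is actually reachable. A minimal sketch using only the standard library (the `ollama_is_up` helper is hypothetical, not part of this package; Ollama serves a plain 200 response on its root endpoint by default):

```python
# Hypothetical pre-flight check: verify the Ollama container started by the
# docker command above is listening before kicking off integration tests.
import urllib.request


def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if the Ollama server responds on its root endpoint."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_up())
```

If this prints `False`, check `docker ps` and the container logs before debugging test failures.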
Then run tests:
hatch run test
The default model used for testing is orca-mini.