Commit
add comments on how to set up and expected output
AlistairLR112 committed Jan 4, 2024
1 parent 58ffa56 commit 1759da4
Showing 1 changed file with 12 additions and 0 deletions.
12 changes: 12 additions & 0 deletions integrations/ollama/example/example.py
@@ -1,3 +1,10 @@
# To run this example, you need an instance of Ollama running with the orca-mini model downloaded.
# We suggest using the following commands to serve the orca-mini model from Ollama:
#
# docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
# docker exec ollama ollama pull orca-mini

from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.retrievers import InMemoryBM25Retriever
@@ -41,3 +48,8 @@
response = pipe.run({"prompt_builder": {"query": query}, "retriever": {"query": query}})

print(response["llm"]["replies"])
# An expected response - the output is not deterministic:
# ['Based on the information provided, Super Mario is a successful military leader who fought
# off several invasion attempts by his arch rival - Bowser. He is also an important politician and owns several
# castles where he conducts political business. Therefore, it can be inferred that Super Mario is a combination of
# both a military leader and an important politician.']
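For context, the diff elides the part of the example that builds the pipeline between the imports and the final run call. Below is a minimal, hypothetical sketch of how such a retriever + prompt-builder + Ollama generator pipeline is typically wired in Haystack 2.x. It is not the actual contents of the elided file: the OllamaGenerator import path, the document contents, and the prompt template are illustrative assumptions and may differ by library version.

# Hypothetical sketch, not the actual elided code from example.py.
# The OllamaGenerator import path is an assumption and has changed between
# versions of the ollama-haystack integration.
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.retrievers import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.generators.ollama import OllamaGenerator  # assumed import path

# Index a small document into an in-memory store for BM25 retrieval.
document_store = InMemoryDocumentStore()
document_store.write_documents(
    [Document(content="Super Mario fought off several invasion attempts by his arch rival Bowser.")]
)

# Prompt template with placeholder documents and query (illustrative).
template = """
Given the following context, answer the question.
Context:
{% for document in documents %}
    {{ document.content }}
{% endfor %}
Question: {{ query }}
"""

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", OllamaGenerator(model="orca-mini"))

# Retrieved documents feed the prompt builder; the rendered prompt feeds the LLM.
pipe.connect("retriever.documents", "prompt_builder.documents")
pipe.connect("prompt_builder.prompt", "llm.prompt")

query = "Who is Super Mario?"
response = pipe.run({"prompt_builder": {"query": query}, "retriever": {"query": query}})
print(response["llm"]["replies"])

With this wiring, the run call shown in the diff passes the query to both the retriever and the prompt builder, and the generator's replies are available under response["llm"]["replies"].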
