Fix secret management breaking change (#296)
* Fix secret management breaking change

* Update content/blog/customizing-rag-to-summarize-hacker-news-posts-with-haystack2/index.md

Co-authored-by: Tuana Çelik <[email protected]>

* Update index.md

---------

Co-authored-by: Tuana Çelik <[email protected]>
bilgeyucel and TuanaCelik authored Feb 6, 2024
1 parent 0d42c5b commit 2443ef1
Showing 5 changed files with 17 additions and 12 deletions.
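For context on the pattern these diffs apply: in Haystack 2.x, generator components no longer accept plain-string API keys. Secrets are wrapped in a `Secret`, which is either resolved from an environment variable (the library default) or built from an in-memory token. A minimal sketch of the two options, assuming the library defaults rather than anything defined in this commit (the key value is a placeholder):

```python
import os

from haystack.components.generators import OpenAIGenerator
from haystack.utils import Secret

# Option 1: rely on the component's default, which resolves the key
# from the OPENAI_API_KEY environment variable at runtime.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder value
llm = OpenAIGenerator()

# Option 2: wrap an in-memory string explicitly; token secrets are
# deliberately excluded from pipeline serialization.
llm = OpenAIGenerator(api_key=Secret.from_token("sk-..."))  # placeholder value
```

Each file below switches its example to one of these two options.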
5 changes: 3 additions & 2 deletions content/blog/astradb-haystack-integration/index.md
@@ -48,8 +48,9 @@ Remember earlier when I mentioned you were going to need your credentials? I hop
```python
from getpass import getpass
import os

OPENAI_API_KEY = getpass("Enter your openAI key:")
os.environ["OPENAI_API_KEY"] = getpass("Enter your openAI key:")
ASTRA_DB_ID = getpass("Enter your Astra database ID:")
ASTRA_DB_APPLICATION_TOKEN = getpass("Enter your Astra application token (e.g.AstraCS:xxx ):")
ASTRA_DB_REGION = getpass("Enter your AstraDB Region: ")
@@ -140,7 +141,7 @@ rag_pipeline.add_component(
)
rag_pipeline.add_component(instance=AstraRetriever(document_store=document_store), name="retriever")
rag_pipeline.add_component(instance=PromptBuilder(template=prompt_template), name="prompt_builder")
rag_pipeline.add_component(instance=OpenAIGenerator(api_key=OPENAI_API_KEY), name="llm")
rag_pipeline.add_component(instance=OpenAIGenerator(), name="llm")
rag_pipeline.add_component(instance=AnswerBuilder(), name="answer_builder")
rag_pipeline.connect("embedder", "retriever")
rag_pipeline.connect("retriever", "prompt_builder.documents")
7 changes: 4 additions & 3 deletions content/blog/customizing-rag-to-summarize-hacker-news-posts-with-haystack2/index.md
@@ -117,8 +117,9 @@ First, we initialize all of the components we will need for the pipeline:
```python
from haystack import Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.generators import OpenAIGenerator
from haystack.utils import Secret
prompt_template = """
You will be provided a few of the latest posts in HackerNews, followed by their URL.
For each post, provide a brief summary followed by the URL the full post can be found at.
@@ -131,7 +132,7 @@ Posts:
"""
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator(mode="gpt-4", api_key='YOUR_API_KEY')
llm = OpenAIGenerator(mode="gpt-4", api_key=Secret.from_token('YOUR_API_KEY'))
fetcher = HackernewsNewestFetcher()
```
Next, we add the components to a Pipeline:
5 changes: 3 additions & 2 deletions content/blog/mixtral-8x7b-healthcare-chatbot/index.md
@@ -89,11 +89,12 @@ So now our flow is as follows:
First, initialize the LLMs and warm them up.
```python
from haystack.components.generators import HuggingFaceTGIGenerator
from haystack.utils import Secret

keyword_llm = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1", token=huggingface_token)
keyword_llm = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1", token=Secret.from_token(huggingface_token))
keyword_llm.warm_up()

llm = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1", token=huggingface_token)
llm = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1", token=Secret.from_token(huggingface_token))
llm.warm_up()
```

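The Mixtral chatbot post above wraps the Hugging Face token the same way, via `Secret.from_token`. A minimal, self-contained sketch of that pattern; the token value and the prompt are placeholders, not taken from the post:

```python
from haystack.components.generators import HuggingFaceTGIGenerator
from haystack.utils import Secret

huggingface_token = "hf_..."  # placeholder; the post collects this from the user earlier

# Wrap the in-memory token string in a Secret before handing it to the generator.
llm = HuggingFaceTGIGenerator(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    token=Secret.from_token(huggingface_token),
)
llm.warm_up()  # prepare the generator before the first call

result = llm.run("What is retrieval-augmented generation?")
print(result["replies"][0])
```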
9 changes: 5 additions & 4 deletions content/blog/using-jina-embeddings-haystack/index.md
@@ -55,10 +55,11 @@ pip install jina-haystack chroma-haystack pypdf
Then let's input our credentials. Or you can set them as environment variables instead if you're feeling fancy.

```python
import getpass
from getpass import getpass
import os

jina_api_key = getpass.getpass("JINA api key:")
hf_token = getpass.getpass("Enter your HuggingFace api token:")
jina_api_key = getpass("JINA api key:")
os.environ["HF_API_TOKEN"] = getpass("Enter your HuggingFace api token: ")
```

## Building the indexing pipeline
@@ -147,7 +148,7 @@ question: {{question}}
"""

text_embedder = JinaTextEmbedder(api_key=jina_api_key, model="jina-embeddings-v2-base-en")
generator = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1", token=hf_token)
generator = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1")
generator.warm_up()

prompt_builder = PromptBuilder(template=prompt)
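The Jina embeddings post takes the environment-variable route instead: once `HF_API_TOKEN` is exported, the generator is constructed without an explicit `token` argument. A sketch of that variant, with a placeholder token value:

```python
import os

from haystack.components.generators import HuggingFaceTGIGenerator

# With HF_API_TOKEN set, the generator resolves the token from the
# environment and no token argument is needed.
os.environ["HF_API_TOKEN"] = "hf_..."  # placeholder value

generator = HuggingFaceTGIGenerator("mistralai/Mixtral-8x7B-Instruct-v0.1")
generator.warm_up()
```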
3 changes: 2 additions & 1 deletion content/overview/quick-start.md
@@ -49,6 +49,7 @@ Then, index your data to the DocumentStore, build a RAG pipeline, and ask a ques
import os

from haystack import Pipeline, Document
from haystack.utils import Secret
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.generators import OpenAIGenerator
@@ -75,7 +76,7 @@ Answer:

retriever = InMemoryBM25Retriever(document_store=document_store)
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator(api_key=api_key)
llm = OpenAIGenerator(api_key=Secret.from_token(api_key))

rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
