Allow index name customization via env var in rag-conversation (#12315)
jacoblee93 authored Oct 26, 2023
1 parent 869a49a commit 28c3950
Showing 2 changed files with 16 additions and 7 deletions.
4 changes: 2 additions & 2 deletions templates/rag-conversation/README.md
@@ -1,4 +1,4 @@
# Conversational RAG

This template performs [conversational](https://python.langchain.com/docs/expression_language/cookbook/retrieval#conversational-retrieval-chain) [retrieval](https://python.langchain.com/docs/use_cases/question_answering/), which is one of the most popular LLM use cases.

@@ -10,4 +10,4 @@ Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.

## Pinecone

-Be sure that `PINECONE_API_KEY` is set in order to use Pinecone.
+This template uses Pinecone as a vectorstore and requires that `PINECONE_API_KEY`, `PINECONE_ENVIRONMENT`, and `PINECONE_INDEX` are set.
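For example, the three variables might be exported in the shell before serving the template (the values below are placeholders, not real credentials or a guaranteed environment name):

```shell
# Placeholder values -- substitute your real Pinecone credentials.
export PINECONE_API_KEY="your-api-key"
export PINECONE_ENVIRONMENT="your-environment"
# Optional: the template falls back to "langchain-test" when unset.
export PINECONE_INDEX="langchain-test"
```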
19 changes: 14 additions & 5 deletions templates/rag-conversation/rag_conversation/chain.py
@@ -1,3 +1,4 @@
import os
from typing import Tuple, List
from pydantic import BaseModel
from operator import itemgetter
@@ -10,6 +11,14 @@
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough, RunnableBranch, RunnableLambda, RunnableMap

if os.environ.get("PINECONE_API_KEY", None) is None:
    raise Exception("Missing `PINECONE_API_KEY` environment variable.")

if os.environ.get("PINECONE_ENVIRONMENT", None) is None:
    raise Exception("Missing `PINECONE_ENVIRONMENT` environment variable.")

PINECONE_INDEX_NAME = os.environ.get("PINECONE_INDEX", "langchain-test")

### Ingest code - you may need to run this the first time
# Load
# from langchain.document_loaders import WebBaseLoader
@@ -20,14 +29,14 @@
# from langchain.text_splitter import RecursiveCharacterTextSplitter
# text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=0)
# all_splits = text_splitter.split_documents(data)
#

# # Add to vectorDB
# vectorstore = Pinecone.from_documents(
-#     documents=all_splits, embedding=OpenAIEmbeddings(), index_name='langchain-test'
+#     documents=all_splits, embedding=OpenAIEmbeddings(), index_name=PINECONE_INDEX_NAME
# )
# retriever = vectorstore.as_retriever()

-vectorstore = Pinecone.from_existing_index("langchain-test", OpenAIEmbeddings())
+vectorstore = Pinecone.from_existing_index(PINECONE_INDEX_NAME, OpenAIEmbeddings())
retriever = vectorstore.as_retriever()

# Condense a chat history and follow-up question into a standalone question
@@ -62,9 +71,9 @@ def _format_chat_history(chat_history: List[Tuple[str, str]]) -> List:
        buffer.append(AIMessage(content=ai))
    return buffer

# User input
class ChatHistory(BaseModel):
    chat_history: List[Tuple[str, str]]
    question: str


