community[minor]: Add OCI Generative AI integration (#16548)
- **Description:** Adding Oracle Cloud Infrastructure Generative AI integration. Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) that cover a wide range of use cases, and which is available through a single API. Using the OCI Generative AI service you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters. https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm
- **Issue:** None
- **Dependencies:** OCI Python SDK
- **Linting and testing:** `make format`, `make lint`, and `make test` passed.
- **Tests:** Unit tests are provided. Integration tests cannot be provided because Oracle policy prohibits public sharing of API keys.

---------

Co-authored-by: Arthur Cheng <[email protected]>
Co-authored-by: Bagatur <[email protected]>
1 parent b8768bd · commit c4e9c9c · Showing 11 changed files with 819 additions and 6 deletions.
@@ -0,0 +1,191 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Oracle Cloud Infrastructure Generative AI"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) that cover a wide range of use cases, and which is available through a single API.\n",
"Using the OCI Generative AI service you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters. Detailed documentation of the service and API is available __[here](https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm)__ and __[here](https://docs.oracle.com/en-us/iaas/api/#/en/generative-ai/20231130/)__.\n",
"\n",
"This notebook explains how to use OCI's Generative AI models with LangChain."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Prerequisite\n",
"We will need to install the OCI SDK:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install -U oci"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### OCI Generative AI API endpoint\n",
"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Authentication\n",
"The authentication methods supported for this LangChain integration are:\n",
"\n",
"1. API Key\n",
"2. Session token\n",
"3. Instance principal\n",
"4. Resource principal\n",
"\n",
"These follow the standard SDK authentication methods detailed __[here](https://docs.oracle.com/en-us/iaas/Content/API/Concepts/sdk_authentication_methods.htm)__. The first two methods are demonstrated in the code cells below; a sketch of the remaining two follows the session-token example."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.llms import OCIGenAI\n",
"\n",
"# use the default authN method, API key\n",
"llm = OCIGenAI(\n",
"    model_id=\"MY_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
")\n",
"\n",
"response = llm.invoke(\"Tell me one fact about earth\", temperature=0.7)\n",
"print(response)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import LLMChain\n",
"from langchain_core.prompts import PromptTemplate\n",
"\n",
"# use a session token to authN (assumes the integration's auth_type parameter\n",
"# selects the SDK auth method; \"SECURITY_TOKEN\" corresponds to option 2 above)\n",
"llm = OCIGenAI(\n",
"    model_id=\"MY_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
"    auth_type=\"SECURITY_TOKEN\",\n",
")\n",
"\n",
"prompt = PromptTemplate(input_variables=[\"query\"], template=\"{query}\")\n",
"\n",
"llm_chain = LLMChain(llm=llm, prompt=prompt)\n",
"\n",
"response = llm_chain.invoke(\"what is the capital of france?\")\n",
"print(response)"
]
},
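{
"cell_type": "markdown",
"metadata": {},
"source": [
"The two cells above cover the API key (default) and session token methods. The sketch below shows how the remaining two methods from the authentication list, instance principal and resource principal, could be selected. It reuses the `auth_type` parameter from the session-token example; treat the exact parameter values as assumptions and check the SDK authentication documentation for your environment."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch only: assumes auth_type selects the SDK authentication method.\n",
"\n",
"# Instance principal auth is typically used when running on an OCI compute instance.\n",
"llm_instance_principal = OCIGenAI(\n",
"    model_id=\"MY_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
"    auth_type=\"INSTANCE_PRINCIPAL\",  # assumed value, mirroring the method list above\n",
")\n",
"\n",
"# Resource principal auth is typically used from OCI Functions and similar services.\n",
"llm_resource_principal = OCIGenAI(\n",
"    model_id=\"MY_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
"    auth_type=\"RESOURCE_PRINCIPAL\",  # assumed value, mirroring the method list above\n",
")"
]
},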
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain.schema.output_parser import StrOutputParser\n",
"from langchain.schema.runnable import RunnablePassthrough\n",
"from langchain_community.embeddings import OCIGenAIEmbeddings\n",
"from langchain_community.vectorstores import FAISS\n",
"\n",
"embeddings = OCIGenAIEmbeddings(\n",
"    model_id=\"MY_EMBEDDING_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
")\n",
"\n",
"vectorstore = FAISS.from_texts(\n",
"    [\n",
"        \"Larry Ellison co-founded Oracle Corporation in 1977 with Bob Miner and Ed Oates.\",\n",
"        \"Oracle Corporation is an American multinational computer technology company headquartered in Austin, Texas, United States.\",\n",
"    ],\n",
"    embedding=embeddings,\n",
")\n",
"\n",
"retriever = vectorstore.as_retriever()\n",
"\n",
"template = \"\"\"Answer the question based only on the following context:\n",
"{context}\n",
"\n",
"Question: {question}\n",
"\"\"\"\n",
"prompt = PromptTemplate.from_template(template)\n",
"\n",
"llm = OCIGenAI(\n",
"    model_id=\"MY_MODEL\",\n",
"    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
"    compartment_id=\"MY_OCID\",\n",
")\n",
"\n",
"chain = (\n",
"    {\"context\": retriever, \"question\": RunnablePassthrough()}\n",
"    | prompt\n",
"    | llm\n",
"    | StrOutputParser()\n",
")\n",
"\n",
"print(chain.invoke(\"when was oracle founded?\"))\n",
"print(chain.invoke(\"where is oracle headquartered?\"))"
]
},
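{
"cell_type": "markdown",
"metadata": {},
"source": [
"The retrieval example above only exercises `OCIGenAIEmbeddings` indirectly through FAISS. As a minimal sketch, the same embeddings object can also be called directly through the standard LangChain embeddings interface:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: embed a query and a small batch of documents directly,\n",
"# using the standard LangChain Embeddings interface (embed_query / embed_documents).\n",
"query_vector = embeddings.embed_query(\"Who founded Oracle?\")\n",
"doc_vectors = embeddings.embed_documents(\n",
"    [\n",
"        \"Oracle was co-founded in 1977.\",\n",
"        \"Oracle is headquartered in Austin, Texas.\",\n",
"    ]\n",
")\n",
"print(len(query_vector), len(doc_vectors))"
]
},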
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "oci_langchain",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
}
},
"nbformat": 4,
"nbformat_minor": 2
}