docs[patch]: promptlayer pages update (#14416)

Updated the provider page by adding LLM and ChatLLM references; removed content that duplicated text from the referenced LLM page. Updated the callback page.
2 changed files with 34 additions and 33 deletions.
# PromptLayer

>[PromptLayer](https://docs.promptlayer.com/introduction) is a platform for prompt engineering.
> It also helps with LLM observability to visualize requests, version prompts, and track usage.
>
>While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g.
> [`PromptLayerOpenAI`](https://docs.promptlayer.com/languages/langchain)),
> using a callback is the recommended way to integrate `PromptLayer` with LangChain.

This page covers how to use [PromptLayer](https://www.promptlayer.com) within LangChain.
## Installation and Setup

To work with `PromptLayer`, we have to:
- Create a `PromptLayer` account
- Create an API token and set it as an environment variable (`PROMPTLAYER_API_KEY`)

Install the Python package:

```bash
pip install promptlayer
```
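The API key can be set from within Python before any PromptLayer-aware code runs; a minimal sketch (the key value below is a placeholder, not a real key):

```python
import os

# Set the PromptLayer API key for the current process.
# Replace the placeholder with the key from your PromptLayer account.
os.environ["PROMPTLAYER_API_KEY"] = "pl_your_api_key_here"

# Downstream code reads the key back from the environment:
api_key = os.environ["PROMPTLAYER_API_KEY"]
```

Setting it in your shell profile (`export PROMPTLAYER_API_KEY=...`) works equally well and keeps the key out of source code.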
## Callback

See a [usage example](/docs/integrations/callbacks/promptlayer).

```python
import promptlayer  # Don't forget this import!
from langchain.callbacks import PromptLayerCallbackHandler
```
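The callback-based integration works by observing LLM lifecycle events rather than replacing the LLM class itself. A minimal sketch of that pattern in plain Python (the class and method names here are illustrative stand-ins, not the actual LangChain or PromptLayer API):

```python
from typing import List


class BaseCallbackHandler:
    """Receives lifecycle events emitted around each LLM call."""

    def on_llm_start(self, prompts: List[str]) -> None: ...
    def on_llm_end(self, response: str) -> None: ...


class RequestLoggingHandler(BaseCallbackHandler):
    """Records each request/response pair, as a tracking service might."""

    def __init__(self) -> None:
        self.events: List[tuple] = []

    def on_llm_start(self, prompts: List[str]) -> None:
        self.events.append(("start", prompts))

    def on_llm_end(self, response: str) -> None:
        self.events.append(("end", response))


class FakeLLM:
    """A stand-in LLM that fires callbacks around each generation."""

    def __init__(self, callbacks: List[BaseCallbackHandler]) -> None:
        self.callbacks = callbacks

    def generate(self, prompts: List[str]) -> str:
        for cb in self.callbacks:
            cb.on_llm_start(prompts)
        response = "stub response"  # a real LLM would call a model here
        for cb in self.callbacks:
            cb.on_llm_end(response)
        return response


handler = RequestLoggingHandler()
llm = FakeLLM(callbacks=[handler])
llm.generate(["hello world"])
```

Because the handler is attached alongside the model rather than baked into a specific LLM subclass, the same handler can observe any LLM or chat model, which is why the callback is the recommended integration path.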
## LLM

See a [usage example](/docs/integrations/llms/promptlayer_openai).

```python
from langchain.llms import PromptLayerOpenAI
```

To tag your requests, use the argument `pl_tags` when initializing the LLM:

```python
llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"])
```

To get the PromptLayer request ID, use the argument `return_pl_id` when initializing the LLM:

```python
llm = PromptLayerOpenAI(return_pl_id=True)
```

This will add the PromptLayer request ID to the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate`. For example:

```python
llm_results = llm.generate(["hello world"])
for res in llm_results.generations:
    print("pl request id: ", res[0].generation_info["pl_request_id"])
```

You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).

This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai) LLM, except that
- all your requests will be logged to your PromptLayer account
- you can add `pl_tags` when instantiating to tag your requests on PromptLayer
- you can add `return_pl_id` when instantiating to return a PromptLayer request ID to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9)
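The shape of the result being iterated above can be pictured with plain data classes; a minimal sketch (stand-in types, not the real LangChain classes) showing where `pl_request_id` lives, with an illustrative ID value:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class Generation:
    text: str
    # Populated with {"pl_request_id": ...} when return_pl_id=True.
    generation_info: Optional[Dict] = None


@dataclass
class LLMResult:
    # One inner list of candidate generations per input prompt.
    generations: List[List[Generation]] = field(default_factory=list)


result = LLMResult(
    generations=[[Generation(text="hi!", generation_info={"pl_request_id": 12345})]]
)

# Same access pattern as the loop above: first candidate of each prompt.
ids = [res[0].generation_info["pl_request_id"] for res in result.generations]
```

The outer list has one entry per input prompt, so `res[0]` picks the first candidate generation for each prompt before reading its `generation_info` dict.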
## Chat Models

See a [usage example](/docs/integrations/chat/promptlayer_chatopenai).

```python
from langchain.chat_models import PromptLayerChatOpenAI
```

PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/integrations/chat/promptlayer_chatopenai) and `PromptLayerOpenAIChat`.