Community: add new integration Novita AI #27985

Closed
wants to merge 14 commits
218 changes: 218 additions & 0 deletions docs/docs/integrations/chat/novita.ipynb
@@ -0,0 +1,218 @@
{
"cells": [
{
"cell_type": "raw",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: Novita AI\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatNovita\n",
"\n",
"Delivers an affordable, reliable, and simple inference platform for running top LLM models.\n",
"\n",
"You can find all the models we support here: [Novita AI Featured Models](https://novita.ai/model-api/product/llm-api?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) or request the [Models API](https://novita.ai/docs/model-api/reference/llm/models.html?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) to get all available models.\n",
"\n",
"Try the [Novita AI Llama 3 API Demo](https://novita.ai/model-api/product/llm-api/playground#meta-llama-llama-3.1-8b-instruct?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) today!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Overview\n",
"\n",
"### Model features\n",
"| [Tool calling](../../how_to/tool_calling.ipynb) | [Structured output](../../how_to/structured_output.ipynb) | JSON mode | [Image input](../../how_to/multimodal_inputs.ipynb) | Audio input | Video input | [Token-level streaming](../../how_to/chat_streaming.ipynb) | Native async | [Token usage](../../how_to/chat_token_usage_tracking.ipynb) | [Logprobs](../../how_to/logprobs.ipynb) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ❌ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ |"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setup\n",
"\n",
"To access Novita AI models you'll need to create a Novita account and get an API key.\n",
"\n",
"### Credentials\n",
"\n",
"Head to [this page](https://novita.ai/settings#key-management?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) to sign up to Novita AI and generate an API key. Once you've done this set the NOVITA_API_KEY environment variable:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"import os\n",
"\n",
"if \"NOVITA_API_KEY\" not in os.environ:\n",
" os.environ[\"NOVITA_API_KEY\"] = getpass.getpass(\"Enter your Novita API key: \")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The LangChain Novita integration lives in the `langchain-community` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain-community"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions. Try the [Novita AI Llama 3 API Demo](https://novita.ai/model-api/product/llm-api/playground#meta-llama-llama-3.1-8b-instruct?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) today!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models.novita import ChatNovita\n",
"from langchain_core.messages import HumanMessage, SystemMessage\n",
"\n",
"llm = ChatNovita(\n",
" model=\"meta-llama/llama-3.1-8b-instruct\",\n",
" temperature=0,\n",
" max_tokens=None,\n",
" timeout=None,\n",
" max_retries=2,\n",
" # other params...\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"messages = [\n",
" SystemMessage(\n",
" content=\"You are a helpful assistant that translates English to French.\"\n",
" ),\n",
" HumanMessage(\n",
" content=\"Translate this sentence from English to French. I love programming.\"\n",
" ),\n",
"]\n",
"ai_msg = llm.invoke(messages)\n",
"ai_msg"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(ai_msg.content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](../../how_to/sequence.ipynb) our model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [\n",
" (\n",
" \"system\",\n",
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
" ),\n",
" (\"human\", \"{input}\"),\n",
" ]\n",
")\n",
"\n",
"chain = prompt | llm\n",
"chain.invoke(\n",
" {\n",
" \"input_language\": \"English\",\n",
" \"output_language\": \"German\",\n",
" \"input\": \"I love programming.\",\n",
" }\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of Novita AI LLM APIs, head to [Novita AI LLM API reference](https://novita.ai/docs/model-api/reference/llm/llm.html?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link)\n"
]
}
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
34 changes: 34 additions & 0 deletions docs/docs/integrations/providers/novita.md
@@ -0,0 +1,34 @@
# Novita AI

>[Novita AI](https://novita.ai) is a generative AI inference platform to run and
> customize models with industry-leading speed and production-readiness.



## Installation and setup

- Get a Novita AI API key by signing up at [novita.ai](https://novita.ai).
- Authenticate by setting the `NOVITA_API_KEY` environment variable.

### Authentication

There are two ways to authenticate using your Novita API key:

1. Setting the `NOVITA_API_KEY` environment variable.

```python
os.environ["NOVITA_API_KEY"] = "<KEY>"
```

2. Setting the `api_key` field when constructing the chat model.

```python
from langchain_community.chat_models import ChatNovita

chat = ChatNovita(api_key="<KEY>")
```

## Chat models

See a [usage example](/docs/integrations/chat/novita).

```python
from langchain_community.chat_models import ChatNovita
```
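
A minimal end-to-end sketch, mirroring the notebook in this PR (the model name is only an example; any model listed by the Models API should work):

```python
import getpass
import os

from langchain_community.chat_models import ChatNovita

# Prompt for the key if it is not already set in the environment.
if "NOVITA_API_KEY" not in os.environ:
    os.environ["NOVITA_API_KEY"] = getpass.getpass("Enter your Novita API key: ")

# Instantiate the chat model and send a simple prompt.
chat = ChatNovita(model="meta-llama/llama-3.1-8b-instruct", temperature=0)
print(chat.invoke("Translate 'I love programming.' into French.").content)
```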
6 changes: 6 additions & 0 deletions libs/community/langchain_community/chat_models/__init__.py
@@ -128,6 +128,9 @@
from langchain_community.chat_models.naver import (
ChatClovaX,
)
from langchain_community.chat_models.novita import (
ChatNovita,
)
from langchain_community.chat_models.oci_data_science import (
ChatOCIModelDeployment,
ChatOCIModelDeploymentTGI,
@@ -194,6 +197,7 @@
from langchain_community.chat_models.zhipuai import (
ChatZhipuAI,
)

__all__ = [
"AzureChatOpenAI",
"BedrockChat",
@@ -258,6 +262,7 @@
"SolarChat",
"VolcEngineMaasChat",
"ChatYi",
"ChatNovita",
]


@@ -325,6 +330,7 @@
"ChatPremAI": "langchain_community.chat_models.premai",
"ChatLlamaCpp": "langchain_community.chat_models.llamacpp",
"ChatYi": "langchain_community.chat_models.yi",
"ChatNovita": "langchain_community.chat_models.novita",
}


4 changes: 4 additions & 0 deletions libs/community/langchain_community/chat_models/litellm.py
@@ -225,6 +225,7 @@ class ChatLiteLLM(BaseChatModel):
replicate_api_key: Optional[str] = None
cohere_api_key: Optional[str] = None
openrouter_api_key: Optional[str] = None
novita_api_key: Optional[str] = None
streaming: bool = False
api_base: Optional[str] = None
organization: Optional[str] = None
@@ -326,6 +327,9 @@ def validate_environment(cls, values: Dict) -> Dict:
values["together_ai_api_key"] = get_from_dict_or_env(
values, "together_ai_api_key", "TOGETHERAI_API_KEY", default=""
)
values["novita_api_key"] = get_from_dict_or_env(
values, "novita_api_key", "NOVITA_API_KEY", default=""
)
values["client"] = litellm

if values["temperature"] is not None and not 0 <= values["temperature"] <= 1:
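
With the `NOVITA_API_KEY` fallback above, Novita-hosted models should also be reachable through `ChatLiteLLM`. A minimal sketch, assuming LiteLLM exposes Novita models under a `novita/` model prefix (check LiteLLM's provider docs for the exact model naming):

```python
import os

from langchain_community.chat_models import ChatLiteLLM

os.environ["NOVITA_API_KEY"] = "<KEY>"  # picked up by validate_environment above

# The "novita/" prefix is an assumption about LiteLLM's provider routing.
llm = ChatLiteLLM(model="novita/meta-llama/llama-3.1-8b-instruct", temperature=0)
print(llm.invoke("Say hello in French.").content)
```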
65 changes: 65 additions & 0 deletions libs/community/langchain_community/chat_models/novita.py
@@ -0,0 +1,65 @@
"""Wrapper around Novita chat models."""

from typing import Any, Dict, Optional

from langchain_core.utils import (
convert_to_secret_str,
get_from_dict_or_env,
)
from pydantic import Field, SecretStr, model_validator

from langchain_community.chat_models.openai import ChatOpenAI

NOVITA_API_BASE = "https://api.novita.ai/v3/openai"


class ChatNovita(ChatOpenAI): # type: ignore[misc]
"""Novita AI LLM.

To use, you should have the ``openai`` python package installed, and the
environment variable ``NOVITA_API_KEY`` set with your API key.

Example:
.. code-block:: python

from langchain_community.chat_models import ChatNovita

chat = ChatNovita(model="gryphe/mythomax-l2-13b")
"""

novita_api_key: Optional[SecretStr] = Field(default=None, alias="api_key")
model_name: str = Field(default="gryphe/mythomax-l2-13b", alias="model")

@model_validator(mode="before")
@classmethod
def validate_environment(cls, values: Dict) -> Any:
"""Validate that the environment is set up correctly."""
values["novita_api_key"] = convert_to_secret_str(
get_from_dict_or_env(
values,
["novita_api_key", "api_key", "openai_api_key"],
"NOVITA_API_KEY",
)
)

try:
import openai
except ImportError:
raise ImportError(
"Could not import openai python package. "
"Please install it with `pip install openai`."
)

client_params = {
"api_key": values["novita_api_key"].get_secret_value(),
"base_url": values.get("base_url", NOVITA_API_BASE),
}

if not values.get("client"):
values["client"] = openai.OpenAI(**client_params).chat.completions
if not values.get("async_client"):
values["async_client"] = openai.AsyncOpenAI(
**client_params
).chat.completions

return values
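
For completeness, a short sketch of the two credential paths the validator above accepts (environment variable, or the `api_key` constructor alias); the model name is only an example:

```python
import os

from langchain_community.chat_models.novita import ChatNovita

# 1. Key taken from the environment by validate_environment.
os.environ["NOVITA_API_KEY"] = "<KEY>"
chat_from_env = ChatNovita(model="meta-llama/llama-3.1-8b-instruct")

# 2. Key passed explicitly via the `api_key` alias on novita_api_key.
chat_from_arg = ChatNovita(api_key="<KEY>", model="meta-llama/llama-3.1-8b-instruct")
```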