feat(core): install litellm by default #236

Merged
merged 5 commits on Dec 5, 2024
2 changes: 1 addition & 1 deletion .github/workflows/prepare_release.yml
@@ -61,4 +61,4 @@ jobs:
gh pr create -B main --title "$COMMIT_MESSAGE" \
--body 'Update ${{ github.event.inputs.packageName }} version from ${{ steps.packages_update.outputs.old_version }} to ${{ steps.packages_update.outputs.new_version }}'
env:
GH_TOKEN: ${{ secrets.GH_TOKEN }}
GH_TOKEN: ${{ secrets.GH_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/push_release.yml
@@ -48,4 +48,4 @@ jobs:
uv tool run twine upload dist/*
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
4 changes: 2 additions & 2 deletions docs/how-to/document_search/distributed_ingestion.md
@@ -49,7 +49,7 @@ job_id = client.submit_job(
runtime_env={
"working_dir": "./",
"pip": [
"ragbits-core[litellm]",
"ragbits-core",
"ragbits-document-search[distributed]"
]
},
@@ -62,7 +62,7 @@ Ray Jobs is also available as CLI commands. You can submit a job using the follo
```bash
ray job submit \
--address http://<cluster_address>:8265 \
--runtime-env '{"pip": ["ragbits-core[litellm]", "ragbits-document-search[distributed]"]}'\
--runtime-env '{"pip": ["ragbits-core", "ragbits-document-search[distributed]"]}'\
--working-dir . \
-- python script.py
```
4 changes: 2 additions & 2 deletions docs/quickstart/quickstart1_prompts.md
@@ -7,10 +7,10 @@ In this Quickstart guide, you will learn how to define a dynamic prompt in Ragbi
To install Ragbits, run the following command in your terminal:

```bash
pip install ragbits[litellm]
pip install ragbits
```

This command will install all the popular Ragbits packages, along with [LiteLLM](https://docs.litellm.ai/docs/), which we will use in this guide for communicating with LLM APIs.
This command will install all the popular Ragbits packages.

## Defining a Static Prompt
The most standard way to define a prompt in Ragbits is to create a class that inherits from the `Prompt` class and configure it by setting values for appropriate properties. Here is an example of a simple prompt that asks the model to write a song about Ragbits:
2 changes: 1 addition & 1 deletion examples/apps/documents_chat.py
@@ -3,7 +3,7 @@
# dependencies = [
# "gradio",
# "ragbits-document-search",
# "ragbits-core[chroma,litellm]",
# "ragbits-core[chroma]",
# ]
# ///
from collections.abc import AsyncIterator
2 changes: 1 addition & 1 deletion examples/core/llm.py
@@ -1,7 +1,7 @@
# /// script
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-core[litellm]",
# "ragbits-core",
# ]
# ///
import asyncio
2 changes: 1 addition & 1 deletion examples/document-search/basic.py
@@ -26,7 +26,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[litellm]",
# "ragbits-core",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/document-search/chroma.py
@@ -27,7 +27,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[chroma,litellm]",
# "ragbits-core[chroma]",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/document-search/chroma_otel.py
@@ -45,7 +45,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[chroma,litellm,otel]",
# "ragbits-core[chroma,otel]",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/document-search/distributed.py
@@ -25,7 +25,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search[distributed]",
# "ragbits-core[litellm]",
# "ragbits-core",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/document-search/from_config.py
@@ -24,7 +24,7 @@ class to rephrase the query.
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[chroma,litellm]",
# "ragbits-core[chroma]",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/document-search/multimodal.py
@@ -27,7 +27,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[litellm]",
# "ragbits-core",
# ]
# ///
import asyncio
2 changes: 1 addition & 1 deletion examples/document-search/qdrant.py
@@ -27,7 +27,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search",
# "ragbits-core[litellm,qdrant]",
# "ragbits-core[qdrant]",
# ]
# ///

2 changes: 1 addition & 1 deletion examples/evaluation/document-search/evaluate.py
@@ -3,7 +3,7 @@
# dependencies = [
# "ragbits-document-search",
# "ragbits-evaluate[relari]",
# "ragbits-core[litellm,chroma]",
# "ragbits-core[chroma]",
# ]
# ///
import asyncio
2 changes: 1 addition & 1 deletion examples/evaluation/document-search/ingest.py
@@ -2,7 +2,7 @@
# requires-python = ">=3.10"
# dependencies = [
# "ragbits-document-search[huggingface]",
# "ragbits-core[litellm,chroma]",
# "ragbits-core[chroma]",
# "hydra-core~=1.3.2",
# "unstructured[md]>=0.15.13",
# ]
4 changes: 1 addition & 3 deletions packages/ragbits-core/pyproject.toml
@@ -36,6 +36,7 @@ dependencies = [
"pydantic>=2.9.1",
"typer~=0.12.5",
"tomli~=2.0.2",
"litellm~=1.46.0",
]

[project.urls]
@@ -48,9 +49,6 @@ dependencies = [
chroma = [
"chromadb~=0.4.24",
]
litellm = [
"litellm~=1.46.0",
]
local = [
"torch~=2.2.1",
"transformers~=4.44.2",
13 changes: 1 addition & 12 deletions packages/ragbits-core/src/ragbits/core/embeddings/litellm.py
@@ -1,9 +1,4 @@
try:
import litellm

HAS_LITELLM = True
except ImportError:
HAS_LITELLM = False
import litellm

from ragbits.core.audit import trace
from ragbits.core.embeddings import Embeddings
@@ -40,13 +35,7 @@ def __init__(
for more information, follow the instructions for your specific vendor in the\
[LiteLLM documentation](https://docs.litellm.ai/docs/embedding/supported_embedding).
api_version: The API version for the call.
Raises:
ImportError: If the 'litellm' extra requirements are not installed.
"""
if not HAS_LITELLM:
raise ImportError("You need to install the 'litellm' extra requirements to use LiteLLM embeddings models")

super().__init__()
self.model = model
self.options = options or {}
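The `try`/`except ImportError` guard deleted above is a common optional-dependency pattern: import the extra, record whether it succeeded, and raise a friendly error at construction time. A minimal standalone sketch of that pattern (using a placeholder module name, not a real dependency) looks like:

```python
# Sketch of the optional-import guard this PR removes from
# ragbits.core.embeddings.litellm. "some_optional_lib" is a
# hypothetical placeholder module, so the import fails here.
try:
    import some_optional_lib  # noqa: F401

    HAS_LIB = True
except ImportError:
    HAS_LIB = False


class Client:
    """Stand-in for a class that depends on the optional library."""

    def __init__(self) -> None:
        # Fail with an actionable message instead of a bare ImportError
        # deep inside the first call that touches the library.
        if not HAS_LIB:
            raise ImportError(
                "You need to install the 'some_optional_lib' extra to use Client"
            )
```

With `litellm` promoted to a core dependency in this PR, the guard becomes dead code and a plain `import litellm` at module top replaces it.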
3 changes: 2 additions & 1 deletion packages/ragbits-core/src/ragbits/core/llms/__init__.py
@@ -3,8 +3,9 @@
from ragbits.core.utils.config_handling import get_cls_from_config

from .base import LLM
from .litellm import LiteLLM

__all__ = ["LLM"]
__all__ = ["LLM", "LiteLLM"]

module = sys.modules[__name__]

19 changes: 3 additions & 16 deletions packages/ragbits-core/src/ragbits/core/llms/clients/litellm.py
@@ -1,17 +1,10 @@
from collections.abc import AsyncGenerator
from dataclasses import dataclass

import litellm
from litellm.utils import CustomStreamWrapper, ModelResponse
from pydantic import BaseModel

try:
import litellm
from litellm.utils import CustomStreamWrapper, ModelResponse

HAS_LITELLM = True
except ImportError:
HAS_LITELLM = False


from ragbits.core.audit import trace
from ragbits.core.prompt import ChatFormat

@@ -64,13 +57,7 @@ def __init__(
api_key: API key used to authenticate with the LLM API.
api_version: API version of the LLM API.
use_structured_output: Whether to request a structured output from the model. Default is False.

Raises:
ImportError: If the 'litellm' extra requirements are not installed.
"""
if not HAS_LITELLM:
raise ImportError("You need to install the 'litellm' extra requirements to use LiteLLM models")

super().__init__(model_name)
self.base_url = base_url
self.api_key = api_key
@@ -181,7 +168,7 @@ async def _get_litellm_response(
options: LiteLLMOptions,
response_format: type[BaseModel] | dict | None,
stream: bool = False,
) -> "ModelResponse | CustomStreamWrapper":
) -> ModelResponse | CustomStreamWrapper:
try:
response = await litellm.acompletion(
messages=conversation,
13 changes: 1 addition & 12 deletions packages/ragbits-core/src/ragbits/core/llms/litellm.py
@@ -2,12 +2,7 @@
import warnings
from functools import cached_property

try:
import litellm

HAS_LITELLM = True
except ImportError:
HAS_LITELLM = False
import litellm

from ragbits.core.prompt.base import BasePrompt, ChatFormat

@@ -47,13 +42,7 @@ def __init__(
use_structured_output: Whether to request a
[structured output](https://docs.litellm.ai/docs/completion/json_mode#pass-in-json_schema)
from the model. Default is False. Can only be combined with models that support structured output.

Raises:
ImportError: If the 'litellm' extra requirements are not installed.
"""
if not HAS_LITELLM:
raise ImportError("You need to install the 'litellm' extra requirements to use LiteLLM models")

super().__init__(model_name, default_options)
self.base_url = base_url
self.api_key = api_key
2 changes: 1 addition & 1 deletion packages/ragbits-document-search/pyproject.toml
@@ -57,7 +57,7 @@ dev-dependencies = [
"pytest-cov~=5.0.0",
"pytest-asyncio~=0.24.0",
"pip-licenses>=4.0.0,<5.0.0",
"ragbits[litellm,local]"
"ragbits[local]"
]

[tool.uv.sources]
3 changes: 0 additions & 3 deletions packages/ragbits/pyproject.toml
@@ -52,9 +52,6 @@ gcs = [
lab = [
"gradio~=4.44.0",
]
litellm = [
"litellm~=1.46.0",
]
local = [
"torch~=2.2.1",
"transformers~=4.44.2",
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -6,7 +6,7 @@ readme = "README.md"
requires-python = ">=3.10"
dependencies = [
"ragbits-cli",
"ragbits-core[chroma,lab,litellm,local,otel,qdrant]",
"ragbits-core[chroma,lab,local,otel,qdrant]",
"ragbits-document-search[gcs,huggingface,distributed]",
"ragbits-evaluate[relari]",
"ragbits-guardrails[openai]",
16 changes: 7 additions & 9 deletions uv.lock

Some generated files are not rendered by default.