# Commit: Merge branch 'main' into 19-featdocument-search-add-evaluation-pipeline-for-retrieval-accuracy

Showing 13 changed files, with 477 additions and 36 deletions.
**New file (`@@ -0,0 +1,28 @@`): installation guide**
# Installation

## Build from source

To build and run Ragbits from the source code:

1. Requirements: [**uv**](https://docs.astral.sh/uv/getting-started/installation/) & [**python**](https://docs.astral.sh/uv/guides/install-python/) 3.10 or higher
2. Install dependencies and activate the venv in editable mode:

   ```bash
   source ./setup_dev_env.sh
   ```

## Install pre-commit

To ensure code quality, we use a pre-commit hook with several checks. Set it up with:

```
pre-commit install
```

All updated files will be reformatted and linted before the commit.

To reformat and lint all files in the project, use:

`pre-commit run --all-files`

The linters used are configured in `.pre-commit-config.yaml`. You can use `pre-commit autoupdate` to bump tools to the latest versions.
**Changed file (`@@ -1,32 +1,90 @@`): project README**
The previous README content (the internal-experiment note, plus the build-from-source and pre-commit instructions now covered by the installation guide above) was replaced with:

<div align="center">

<h1>Ragbits</h1>

*Building blocks for rapid development of GenAI applications*

[Documentation](https://ragbits.deepsense.ai) | [Contact](https://deepsense.ai/contact/)

[![PyPI - License](https://img.shields.io/pypi/l/ragbits)](https://pypi.org/project/ragbits)
[![PyPI - Version](https://img.shields.io/pypi/v/ragbits)](https://pypi.org/project/ragbits)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ragbits)](https://pypi.org/project/ragbits)

</div>

---

## What's Included?

- [X] **[Core](packages/ragbits-core)** - Fundamental tools for working with prompts and LLMs.
- [X] **[Document Search](packages/ragbits-document-search)** - Handles vector search to retrieve relevant documents.
- [X] **[CLI](packages/ragbits-cli)** - The `ragbits` shell command, enabling tools such as GUI prompt management.
- [ ] **Flow Controls** - Manages multi-stage chat flows for performing advanced actions *(coming soon)*.
- [ ] **Structured Querying** - Queries structured data sources in a predictable manner *(coming soon)*.
- [ ] **Caching** - Adds a caching layer to reduce costs and response times *(coming soon)*.
- [ ] **Observability & Audit** - Tracks user queries and events for easier troubleshooting *(coming soon)*.
- [ ] **Guardrails** - Ensures response safety and relevance *(coming soon)*.

## Installation

To use the complete Ragbits stack, install the `ragbits` package:

```sh
pip install ragbits
```

Alternatively, you can use individual components of the stack by installing their respective packages: `ragbits-core`, `ragbits-document-search`, `ragbits-cli`.

## Quickstart

First, create a prompt and a model for the data used in the prompt:

```python
from pydantic import BaseModel
from ragbits.core.prompt import Prompt


class Dog(BaseModel):
    breed: str
    age: int
    temperament: str


class DogNamePrompt(Prompt[Dog, str]):
    system_prompt = """
    You are a dog name generator. You come up with funny names for dogs given the dog details.
    """

    user_prompt = """
    The dog is a {breed} breed, {age} years old, and has a {temperament} temperament.
    """
```

Next, create an instance of the LLM and the prompt:

```python
from ragbits.core.llms.litellm import LiteLLM

llm = LiteLLM("gpt-4o")
example_dog = Dog(breed="Golden Retriever", age=3, temperament="friendly")
prompt = DogNamePrompt(example_dog)
```

Finally, generate a response from the LLM using the prompt:

```python
response = await llm.generate(prompt)
print(f"Generated dog name: {response}")
```

<!--
TODO:
Add links to quickstart guides for individual packages, demonstrating how to extend this with their functionality.
Add a link to the full tutorial.
-->

## License

Ragbits is licensed under the [MIT License](LICENSE).

## Contributing

We welcome contributions! Please read [CONTRIBUTING.md](CONTRIBUTING.md) for more information.
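Note that the final quickstart snippet uses `await` at the top level, which only works in async-aware environments such as IPython or Jupyter; in a plain script you would wrap the call in a coroutine and drive it with `asyncio.run`. A runnable sketch of that wrapping, with a stand-in coroutine in place of the real `llm.generate` call:

```python
import asyncio


# Stand-in for llm.generate(prompt); the real call contacts the model API.
async def generate(prompt: str) -> str:
    return f"a funny name for {prompt}"


async def main() -> None:
    # In a script, top-level `await` is invalid; await inside a coroutine instead.
    response = await generate("a friendly Golden Retriever")
    print(f"Generated dog name: {response}")


asyncio.run(main())
```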
**New file (`@@ -0,0 +1,60 @@`): `ragbits.core.llms.factory`**
```python
import importlib

from ragbits.core.config import core_config
from ragbits.core.llms.base import LLM
from ragbits.core.llms.litellm import LiteLLM


def get_llm_from_factory(factory_path: str) -> LLM:
    """
    Get an instance of an LLM using a factory function specified by the user.

    Args:
        factory_path (str): The path to the factory function.

    Returns:
        LLM: An instance of the LLM.
    """
    module_name, function_name = factory_path.rsplit(".", 1)
    module = importlib.import_module(module_name)
    function = getattr(module, function_name)
    return function()


def has_default_llm() -> bool:
    """
    Check if the default LLM factory is set in the configuration.

    Returns:
        bool: Whether the default LLM factory is set.
    """
    return core_config.default_llm_factory is not None


def get_default_llm() -> LLM:
    """
    Get an instance of the default LLM using the factory function
    specified in the configuration.

    Returns:
        LLM: An instance of the default LLM.

    Raises:
        ValueError: If the default LLM factory is not set.
    """
    factory = core_config.default_llm_factory
    if factory is None:
        raise ValueError("Default LLM factory is not set")

    return get_llm_from_factory(factory)


def simple_litellm_factory() -> LLM:
    """
    A basic LLM factory that creates a LiteLLM instance with the default model
    and default options, and assumes that the API key is set in the environment.

    Returns:
        LLM: An instance of the LiteLLM.
    """
    return LiteLLM()
```
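`get_llm_from_factory` resolves a dotted path to a callable at runtime via `importlib`. The same pattern can be sketched against a standard-library target so it runs anywhere (`resolve_factory` is an illustrative name, not part of Ragbits):

```python
import importlib


def resolve_factory(path: str):
    """Resolve a dotted path like "pkg.module.attr" to the attribute it names.

    Mirrors the rsplit(".", 1) + import_module + getattr logic used by
    get_llm_from_factory, minus the final call.
    """
    module_name, attr_name = path.rsplit(".", 1)
    module = importlib.import_module(module_name)
    return getattr(module, attr_name)


# Resolve a standard-library function by its dotted path and call it.
join = resolve_factory("os.path.join")
print(join("dogs", "names.txt"))
```

Splitting on the last dot only is what lets the attribute live in an arbitrarily nested package.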
**New file (`@@ -0,0 +1,5 @@`): test helper adding the `llms` directory to `sys.path`**
```python
import sys
from pathlib import Path

# Add "llms" to sys.path
sys.path.append(str(Path(__file__).parent.parent))
```
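This helper makes the `llms` test directory importable, which is what lets the tests below reference the dotted path `factory.test_get_llm_from_factory.mock_llm_factory`. A self-contained sketch of the same `sys.path` trick, using a throwaway module in a temporary directory (all names here are illustrative):

```python
import sys
import tempfile
from pathlib import Path

# Write a throwaway module into a temp directory...
tmp_dir = Path(tempfile.mkdtemp())
(tmp_dir / "throwaway_helper.py").write_text("ANSWER = 42\n")

# ...then append that directory to sys.path, as the helper above does,
# so the module becomes importable by name.
sys.path.append(str(tmp_dir))

import throwaway_helper

print(throwaway_helper.ANSWER)  # prints 42
```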
**New file (`@@ -0,0 +1,15 @@`): `packages/ragbits-core/tests/unit/llms/factory/test_get_default_llm.py`**
```python
from ragbits.core.config import core_config
from ragbits.core.llms.factory import get_default_llm
from ragbits.core.llms.litellm import LiteLLM


def test_get_default_llm(monkeypatch):
    """
    Test the get_default_llm function.
    """
    monkeypatch.setattr(core_config, "default_llm_factory", "factory.test_get_llm_from_factory.mock_llm_factory")

    llm = get_default_llm()
    assert isinstance(llm, LiteLLM)
    assert llm.model_name == "mock_model"
```
**New file (`@@ -0,0 +1,22 @@`): `packages/ragbits-core/tests/unit/llms/factory/test_get_llm_from_factory.py`**
```python
from ragbits.core.llms.factory import get_llm_from_factory
from ragbits.core.llms.litellm import LiteLLM


def mock_llm_factory() -> LiteLLM:
    """
    A mock LLM factory that creates a LiteLLM instance with a mock model name.

    Returns:
        LiteLLM: An instance of the LiteLLM.
    """
    return LiteLLM(model_name="mock_model")


def test_get_llm_from_factory():
    """
    Test the get_llm_from_factory function.
    """
    llm = get_llm_from_factory("factory.test_get_llm_from_factory.mock_llm_factory")

    assert isinstance(llm, LiteLLM)
    assert llm.model_name == "mock_model"
```
**New file (`@@ -0,0 +1,20 @@`): `packages/ragbits-core/tests/unit/llms/factory/test_has_default_llm.py`**
```python
from ragbits.core.config import core_config
from ragbits.core.llms.factory import has_default_llm


def test_has_default_llm_false(monkeypatch):
    """
    Test the has_default_llm function when the default LLM factory is not set.
    """
    monkeypatch.setattr(core_config, "default_llm_factory", None)

    assert has_default_llm() is False


def test_has_default_llm_true(monkeypatch):
    """
    Test the has_default_llm function when the default LLM factory is set.
    """
    monkeypatch.setattr(core_config, "default_llm_factory", "my_project.llms.get_llm")

    assert has_default_llm() is True
```
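These tests rely on pytest's `monkeypatch.setattr`, which swaps an attribute for the duration of a test and restores it afterwards. The same temporary-override behavior can be sketched with the standard library's `unittest.mock.patch.object` (the `config` object below is a stand-in for `core_config`, not the real Ragbits config):

```python
from types import SimpleNamespace
from unittest.mock import patch

# A stand-in for core_config.
config = SimpleNamespace(default_llm_factory=None)


def has_default() -> bool:
    return config.default_llm_factory is not None


# Inside the context manager the attribute is overridden...
with patch.object(config, "default_llm_factory", "my_project.llms.get_llm"):
    assert has_default() is True

# ...and restored automatically on exit, like monkeypatch at test teardown.
assert has_default() is False
```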