chore: switch to uv packaging #10

Merged
merged 11 commits into from Sep 17, 2024
Changes from 8 commits
45 changes: 29 additions & 16 deletions .github/workflows/ci.yml
@@ -16,7 +16,14 @@ jobs:
contents: read
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v4

- name: Install uv
uses: astral-sh/setup-uv@v2
with:
version: "0.4.10"

- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: "3.10"

@@ -26,11 +33,13 @@ jobs:
path: ~/.cache/pre-commit
key: pre-commit-3|${{ env.pythonLocation }}|${{ hashFiles('.pre-commit-config.yaml') }}

- name: Install pre-commit
run: pip3 install pre-commit
- name: Install Dependencies
run: source ./setup_dev_env.sh

- name: Run pre-commit checks
run: pre-commit run --all-files --show-diff-on-failure --color always
run: |
source .venv/bin/activate
pre-commit run --all-files --show-diff-on-failure --color always

- name: Run Trivy vulnerability scanner
uses: aquasecurity/trivy-action@master
@@ -57,17 +66,14 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
comment_tag: trivy

- name: Create venv
run: . ./setup_dev_env.sh

- name: Check licenses
run: |
source venv/bin/activate
source .venv/bin/activate
./check_licenses.sh

- name: Generate pip freeze
run: |
source venv/bin/activate
source .venv/bin/activate
pip freeze > requirements-freeze.txt

- name: Publish Artefacts
@@ -93,9 +99,10 @@ jobs:

- name: Validate package build
run: |
source venv/bin/activate
python -m pip install -U build
for dir in packages/*/; do python -m build "$dir"; done
source .venv/bin/activate
uv build packages/ragbits-core --out-dir dist
uv build packages/ragbits-dev-kit --out-dir dist
uv build packages/ragbits-document-search --out-dir dist

- name: Publish Package
uses: actions/upload-artifact@v3
@@ -121,6 +128,12 @@ jobs:
- python-version: "3.10"
steps:
- uses: actions/checkout@v4

- name: Install uv
uses: astral-sh/setup-uv@v2
with:
version: "0.4.10"

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
@@ -129,18 +142,18 @@
- name: Cache Dependencies
uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements-dev.txt') }}-${{ hashFiles('**/setup.cfg') }}-${{ hashFiles('**/pyproject.toml') }}
path: ~/.cache/uv
key: ${{ runner.os }}-pip-${{ hashFiles('**/pyproject.toml') }}
restore-keys: |
${{ runner.os }}-pip-

- name: Install Dependencies
run: . ./setup_dev_env.sh
run: source ./setup_dev_env.sh

- name: Run Tests With Coverage
run: |
# run with coverage to not execute tests twice
source venv/bin/activate
source .venv/bin/activate
coverage run -m pytest -v -p no:warnings --junitxml=report.xml
coverage report
coverage xml
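Assembled from the hunks above, the lint job's setup sequence now reads roughly as follows (a sketch, not the verbatim file; the pre-commit cache and Trivy steps are omitted for brevity):

```yaml
steps:
  - uses: actions/checkout@v4

  # uv is installed first, pinned to the version used in this PR
  - name: Install uv
    uses: astral-sh/setup-uv@v2
    with:
      version: "0.4.10"

  - name: Set up Python
    uses: actions/setup-python@v4
    with:
      python-version: "3.10"

  # setup_dev_env.sh now creates .venv (not venv) and installs the workspace
  - name: Install Dependencies
    run: source ./setup_dev_env.sh

  - name: Run pre-commit checks
    run: |
      source .venv/bin/activate
      pre-commit run --all-files --show-diff-on-failure --color always
```

Note that every later step (`check_licenses.sh`, `pip freeze`, tests) activates `.venv/bin/activate` instead of the old `venv/bin/activate`.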
7 changes: 3 additions & 4 deletions README.md
@@ -1,4 +1,4 @@
# ragbits
# Ragbits

Repository for an internal experiment with our upcoming LLM framework.

@@ -12,11 +12,10 @@ To start, you need to set up your local machine.
You need to set up a virtual environment; the simplest way is to run this from the project root directory:

```bash
$ . ./setup_dev_env.sh
$ source venv/bin/activate
$ source ./setup_dev_env.sh
```

This will create a new venv and install all packages from this repository in editable mode.
It will also install their dependencies and the dev dependencies from `requirements-dev.txt`.

## Install pre-commit

2 changes: 1 addition & 1 deletion check_licenses.sh
@@ -1,7 +1,7 @@
#!/bin/bash
set -e

. venv/bin/activate
source .venv/bin/activate

pip-licenses --from=mixed --ignore-packages `cat .libraries-whitelist.txt`> licenses.txt
cat licenses.txt
3 changes: 0 additions & 3 deletions packages/ragbits-common/pyproject.toml

This file was deleted.

53 changes: 0 additions & 53 deletions packages/ragbits-common/setup.cfg

This file was deleted.

3 changes: 0 additions & 3 deletions packages/ragbits-common/src/ragbits/common/prompt/__init__.py

This file was deleted.

1 change: 1 addition & 0 deletions packages/ragbits-core/README.md
@@ -0,0 +1 @@
# Ragbits Core
@@ -1,9 +1,15 @@
# /// script
# requires-python = ">=3.10"
# dependencies = [
# "ragbits[litellm]",
# ]
# ///
import asyncio

from pydantic import BaseModel

from ragbits.common.llms.litellm import LiteLLM
from ragbits.common.prompt import Prompt
from ragbits.core.llms.litellm import LiteLLM
from ragbits.core.prompt import Prompt


class LoremPromptInput(BaseModel):
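The `# /// script` header added to both example files is PEP 723 inline script metadata, which `uv run` understands natively: it resolves the listed dependencies into an ephemeral environment before executing the file. A minimal, dependency-free sketch of the format (the file name `demo.py` is illustrative):

```python
# /// script
# requires-python = ">=3.10"
# dependencies = []
# ///
# Running `uv run demo.py` reads the metadata block above, provisions an
# environment matching it, and executes this file -- no manual venv needed.


def greeting() -> str:
    return "hello from a uv-run script"


if __name__ == "__main__":
    print(greeting())
```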
@@ -1,6 +1,12 @@
# /// script
# requires-python = ">=3.10"
# dependencies = [
# "ragbits",
# ]
# ///
from pydantic import BaseModel

from ragbits.common.prompt import Prompt
from ragbits.core.prompt import Prompt


class LoremPromptInput(BaseModel):
69 changes: 69 additions & 0 deletions packages/ragbits-core/pyproject.toml
@@ -0,0 +1,69 @@
[project]
name = "ragbits"
version = "0.1.0"
description = "Building blocks for rapid development of GenAI applications"
readme = "README.md"
requires-python = ">=3.10"
license = "MIT"
authors = [
{ name = "deepsense.ai", email = "[email protected]"}
]
keywords = [
"Retrieval Augmented Generation",
"RAG",
"Large Language Models",
"LLMs",
"Generative AI",
"GenAI",
"Prompt Management"
]
classifiers = [
"Development Status :: 1 - Planning",
"Environment :: Console",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
"Topic :: Software Development :: Libraries :: Python Modules",
"Private :: Do Not Upload"
]
dependencies = [
"jinja2>=3.1.4",
"pydantic>=2.9.1"
]

[project.optional-dependencies]
litellm = [
"litellm~=1.46.0",
]
local = [
"torch~=2.2.1",
"transformers~=4.44.2",
"numpy~=1.24.0"
]

[tool.uv]
dev-dependencies = [
"pre-commit~=3.8.0",
"pytest~=8.3.3",
"pytest-cov~=5.0.0",
"pytest-asyncio~=0.24.0",
"pip-licenses>=4.0.0,<5.0.0"
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.metadata]
allow-direct-references = true

[tool.hatch.build.targets.wheel]
packages = ["src/ragbits"]

[tool.pytest.ini_options]
asyncio_mode = "auto"
@@ -7,8 +7,8 @@
except ImportError:
HAS_LITELLM = False

from ragbits.common.embeddings.base import Embeddings
from ragbits.common.embeddings.exceptions import EmbeddingConnectionError, EmbeddingResponseError, EmbeddingStatusError
from ragbits.core.embeddings.base import Embeddings
from ragbits.core.embeddings.exceptions import EmbeddingConnectionError, EmbeddingResponseError, EmbeddingStatusError


class LiteLLMEmbeddings(Embeddings):
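The `HAS_LITELLM` flag visible in this hunk is the usual optional-extra import guard: try the import once at module load, record the result, and fail loudly only when the feature is actually used. A generic sketch of the idiom (names are illustrative; stdlib `json` stands in for an optional third-party module such as `litellm`):

```python
# Optional-dependency guard: attempt the import once, remember the outcome.
try:
    import json as _extra  # stand-in for an optional extra like litellm
    HAS_EXTRA = True
except ImportError:
    HAS_EXTRA = False


def require_extra() -> None:
    """Raise a clear, actionable error if the optional extra is missing."""
    if not HAS_EXTRA:
        raise ImportError(
            "This feature needs the optional extra; "
            "install with: pip install 'ragbits[litellm]'"
        )
```

This keeps the base package importable without the extra installed, while still pointing users at the exact install command when they hit the gated code path.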
@@ -2,7 +2,7 @@
from functools import cached_property
from typing import Generic, Optional, Type, cast, overload

from ragbits.common.prompt.base import BasePrompt, BasePromptWithParser, OutputT
from ragbits.core.prompt.base import BasePrompt, BasePromptWithParser, OutputT

from .clients.base import LLMClient, LLMClientOptions, LLMOptions

@@ -4,7 +4,7 @@

from pydantic import BaseModel

from ragbits.common.prompt import ChatFormat
from ragbits.core.prompt import ChatFormat

from ..types import NotGiven

@@ -11,7 +11,7 @@
HAS_LITELLM = False


from ragbits.common.prompt import ChatFormat
from ragbits.core.prompt import ChatFormat

from ..types import NOT_GIVEN, NotGiven
from .base import LLMClient, LLMOptions
@@ -11,7 +11,7 @@
except ImportError:
HAS_LOCAL_LLM = False

from ragbits.common.prompt import ChatFormat
from ragbits.core.prompt import ChatFormat

from ..types import NOT_GIVEN, NotGiven
from .base import LLMClient, LLMOptions
@@ -8,7 +8,7 @@
except ImportError:
HAS_LITELLM = False

from ragbits.common.prompt.base import BasePrompt
from ragbits.core.prompt.base import BasePrompt

from .base import LLM
from .clients.litellm import LiteLLMClient, LiteLLMOptions
@@ -8,7 +8,7 @@
except ImportError:
HAS_LOCAL_LLM = False

from ragbits.common.prompt.base import BasePrompt
from ragbits.core.prompt.base import BasePrompt

from .base import LLM
from .clients.local import LocalLLMClient, LocalLLMOptions
3 changes: 3 additions & 0 deletions packages/ragbits-core/src/ragbits/core/prompt/__init__.py
@@ -0,0 +1,3 @@
from ragbits.core.prompt.prompt import ChatFormat, Prompt

__all__ = ["Prompt", "ChatFormat"]
@@ -1,9 +1,9 @@
from pydantic import BaseModel

from ragbits.common.llms.clients.litellm import LiteLLMOptions
from ragbits.common.llms.litellm import LiteLLM
from ragbits.common.prompt import Prompt
from ragbits.common.prompt.base import BasePrompt, BasePromptWithParser, ChatFormat
from ragbits.core.llms.clients.litellm import LiteLLMOptions
from ragbits.core.llms.litellm import LiteLLM
from ragbits.core.prompt import Prompt
from ragbits.core.prompt.base import BasePrompt, BasePromptWithParser, ChatFormat


class MockPrompt(BasePrompt):
@@ -2,8 +2,8 @@

import pytest

from ragbits.common.prompt import Prompt
from ragbits.common.prompt.parsers import ResponseParsingError
from ragbits.core.prompt import Prompt
from ragbits.core.prompt.parsers import ResponseParsingError

from .test_prompt import _PromptOutput
