
Merge pull request #1 from mobiusml/improved_structure
Project Structure Improvement
movchan74 authored Jul 5, 2024
2 parents 67b1f05 + 2572c45 commit 6f36827
Showing 19 changed files with 168 additions and 5 deletions.
1 change: 0 additions & 1 deletion .devcontainer/devcontainer.json
@@ -21,7 +21,6 @@
"securityOpt": [
"seccomp=unconfined"
],
"postCreateCommand": "sh ${containerWorkspaceFolder}/install.sh --with=dev",
"postStartCommand": "git config --global --add safe.directory ${containerWorkspaceFolder}",
"customizations": {
"vscode": {
84 changes: 84 additions & 0 deletions README.md
@@ -10,6 +10,7 @@ This repository contains a template that you can use to start building your own
2. Give your repository a name and click on "Create repository". The name of the repository will also be the name of your application and the Python package.
3. Wait for the first workflow to finish. This will rename the package to match the repository name.
4. Clone the repository to your local machine and start building your application.
5. Change the [LICENSE](/LICENSE) file to match your project's license. The default license is the Apache License 2.0.

## Getting started

@@ -21,3 +22,86 @@ poetry install

See [Tutorial](https://github.com/mobiusml/aana_sdk/blob/main/docs/tutorial.md) for more information on how to build your application.

## Project structure

```
aana_app_project/
├── configs/ | various configs, including settings, deployments, and endpoints
│   ├── endpoints.py | list of endpoints to deploy
│   ├── deployments.py | list of deployments (models) to deploy
│   └── settings.py | app settings
├── core/ | core models and functionality
│ ├── models/ | data models
│ └── prompts/ | prompt templates for LLMs
├── deployments/ | custom deployments
├── endpoints/ | endpoint classes for the app
├── exceptions/ | custom exception classes
├── utils/ | various utility functionality
└── app.py | main application file
```


## Installation

To install the project, follow these steps:

1. Clone the repository.

2. Install additional libraries.

```bash
apt update && apt install -y libgl1
```
> **🗒️ Note**
>
> For optimal performance, you should also install a [PyTorch](https://pytorch.org/get-started/locally/) version >=2.1 appropriate for your system before the next step. You can continue directly to the next step, but it will install a default PyTorch build that may not make full use of your system's resources, such as a GPU or even some SIMD instructions. Therefore, we recommend choosing your PyTorch package carefully and installing it manually.

> **🗒️ Note**
>
> Some models use Flash Attention. Install the Flash Attention library for better performance. See the [flash attention installation instructions](https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features) for more details and a list of supported GPUs.
3. Install the package with poetry.

The project is managed with [Poetry](https://python-poetry.org/docs/). See the [Poetry installation instructions](https://python-poetry.org/docs/#installation) on how to install it on your system.

Poetry will install the package and all of its dependencies into a virtual environment.

```bash
poetry install
```

4. Run the app.

```bash
aana deploy aana_app_project.app:aana_app
```

## Usage

To use the project, follow these steps:

1. Run the app as described in the installation section.

```bash
aana deploy aana_app_project.app:aana_app
```

Once the application is running, you will see the message `Deployed successfully.` in the logs, along with the URL for the API documentation.

> **⚠️ Warning**
>
> If the application uses a GPU, make sure that the GPU is available and the application can access it.
>
> The application will detect the available GPU automatically, but you need to make sure that `CUDA_VISIBLE_DEVICES` is set correctly.
>
> Sometimes `CUDA_VISIBLE_DEVICES` is set to an empty string and the application will not be able to detect the GPU. Use `unset CUDA_VISIBLE_DEVICES` to unset the variable.
>
> You can also set the `CUDA_VISIBLE_DEVICES` environment variable to the GPU index you want to use: `export CUDA_VISIBLE_DEVICES=0`.
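The empty-string pitfall above can also be checked from Python before starting the app. A minimal sketch (the helper name is ours, not part of the template):

```python
import os

def cuda_visible_devices_ok() -> bool:
    """Return False when CUDA_VISIBLE_DEVICES is set but empty,
    which hides every GPU from the application."""
    value = os.environ.get("CUDA_VISIBLE_DEVICES")
    if value is None:
        return True  # unset: all GPUs remain visible
    return value.strip() != ""  # empty string masks all GPUs

# An empty value masks the GPUs even though the variable "exists":
os.environ["CUDA_VISIBLE_DEVICES"] = ""
print(cuda_visible_devices_ok())  # False

# Removing it is the Python counterpart of `unset CUDA_VISIBLE_DEVICES`:
del os.environ["CUDA_VISIBLE_DEVICES"]
print(cuda_visible_devices_ok())  # True
```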
2. Send a POST request to the app.

For example, if your application has a `/summary` endpoint that accepts videos, you can send a POST request like this:

```bash
curl -X POST http://127.0.0.1:8000/summary -Fbody='{"video":{"url":"https://www.youtube.com/watch?v=VhJFyyukAzA"}}'
```
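The same request can be prepared from Python. In the curl command above, the form field `body` carries a JSON document describing the inputs; a sketch of building that payload (the actual POST is shown commented out, since it needs the `requests` package and a running app):

```python
import json

# The /summary endpoint from the example above takes a form field named
# "body" whose value is a JSON document; here it is a video URL.
payload = {"video": {"url": "https://www.youtube.com/watch?v=VhJFyyukAzA"}}
body = json.dumps(payload)
print(body)

# Sending it (requires the `requests` package and the running app):
# import requests
# response = requests.post(
#     "http://127.0.0.1:8000/summary",
#     data={"body": body},  # counterpart of curl's -Fbody='...'
# )
# print(response.json())
```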
4 changes: 2 additions & 2 deletions aana_app_project/app.py
@@ -1,7 +1,7 @@
from aana.sdk import AanaSDK

from aana_app_project.deployments import deployments
from aana_app_project.endpoints import endpoints
from aana_app_project.configs.deployments import deployments
from aana_app_project.configs.endpoints import endpoints

aana_app = AanaSDK(name="aana_app_project")

File renamed without changes.
25 changes: 25 additions & 0 deletions aana_app_project/configs/deployments.py
@@ -0,0 +1,25 @@
deployments: list[dict] = []

# Add deployments for models that you want to deploy here.
#
# For example:
# from aana.deployments.whisper_deployment import (
#     WhisperComputeType,
#     WhisperConfig,
#     WhisperDeployment,
#     WhisperModelSize,
# )
# asr_deployment = WhisperDeployment.options(
#     num_replicas=1,
#     ray_actor_options={"num_gpus": 0.1},
#     user_config=WhisperConfig(
#         model_size=WhisperModelSize.MEDIUM,
#         compute_type=WhisperComputeType.FLOAT16,
#     ).model_dump(mode="json"),
# )
# deployments.append({"name": "asr_deployment", "instance": asr_deployment})
#
# You can use predefined deployments from the Aana SDK or create your own.
# See https://github.com/mobiusml/aana_sdk/blob/main/docs/integrations.md for the list of predefined deployments.
#
# If you want to create your own deployment, put your deployment classes in separate files in the `deployments` directory and import them here.
17 changes: 17 additions & 0 deletions aana_app_project/configs/endpoints.py
@@ -0,0 +1,17 @@
endpoints: list[dict] = []

# Add your endpoints here.
#
# For example:
# endpoints.append(
#     {
#         "name": "predict",
#         "path": "/predict",
#         "summary": "Predict the class of an image.",
#         "endpoint_cls": PredictEndpoint,
#     }
# )
#
# Endpoints can be created by inheriting from the `Endpoint` class.
# Put your endpoint classes in separate files in the `endpoints` directory and import them here.
# See https://github.com/mobiusml/aana_sdk/tree/main?tab=readme-ov-file#endpoints for how to create endpoints.
13 changes: 13 additions & 0 deletions aana_app_project/configs/settings.py
@@ -0,0 +1,13 @@
from aana.configs.settings import Settings as AanaSettings


class Settings(AanaSettings):
    """A pydantic model for App settings."""

    # Add your custom settings here. Then you can access them in your app like this:
    # from aana_app_project.configs.settings import settings
    # settings.custom_property


settings = Settings()
File renamed without changes.
Empty file.
25 changes: 25 additions & 0 deletions aana_app_project/core/prompts/loader.py
@@ -0,0 +1,25 @@
from jinja2 import Environment, PackageLoader, Template


def get_prompt_template(name: str) -> Template:
    """Load a prompt template by name.

    Use this function to load a prompt template for LLMs:

    ```python
    from aana_app_project.core.prompts.loader import get_prompt_template

    template = get_prompt_template("test")
    prompt = template.render(your_variable="your_value")
    ```

    Args:
        name (str): The name of the prompt template.

    Returns:
        Template: The prompt template.
    """
    # PackageLoader takes the package name and a single path within the
    # package; passing "core", "prompts" as separate positional arguments
    # would be interpreted as package_path="core" and encoding="prompts".
    env = Environment(loader=PackageLoader("aana_app_project", "core/prompts"))
    return env.get_template(f"{name}.j2")
1 change: 1 addition & 0 deletions aana_app_project/core/prompts/test.j2
@@ -0,0 +1 @@
Define your prompts for LLMs here. Use jinja2 templating to include variables like {{ your_variable }}.
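As a small illustration of the templating shown above (assuming the `jinja2` package is installed; the template text here is ours, not part of the repository):

```python
from jinja2 import Template

# A template in the style of test.j2: variables are injected with {{ ... }}.
template = Template("Summarize the following transcript: {{ transcript }}")
print(template.render(transcript="Hello world."))
# Summarize the following transcript: Hello world.
```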
1 change: 0 additions & 1 deletion aana_app_project/deployments.py

This file was deleted.

Empty file.
1 change: 0 additions & 1 deletion aana_app_project/endpoints.py

This file was deleted.

Empty file.
1 change: 1 addition & 0 deletions aana_app_project/exceptions/core.py
@@ -0,0 +1 @@
from aana.exceptions.core import BaseException
Empty file.
Empty file.
Empty file added aana_app_project/utils/core.py
Empty file.
