
Dockerize Llamafile

This repository provides a containerized version of LlamaFile, making it easy to deploy and manage. It also includes an example Controller Service built with FastAPI, which demonstrates how to handle incoming requests and forward them to LlamaFile for processing.

Features

  • LlamaFile: The main application, ready for containerized deployment.
  • Controller Service: A FastAPI-based example service that handles incoming requests and integrates seamlessly with LlamaFile.
  • Docker Support: Simplifies deployment using Docker Compose.
  • Environment Configuration: Configurable using .env for easy customization.

Getting Started

Prerequisites

Ensure you have the following installed:

  • Docker
  • Docker Compose
  • Git

Installation

  1. Clone the repository:

    git clone https://github.com/hfahrudin/Dockerize-Llamafile.git
    cd Dockerize-Llamafile
  2. Configure environment variables:

    • Create a .env file in the root directory.
  3. Build and start the services:

    docker-compose up --build
  4. Access the Controller Service at http://localhost:8000 (default).
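
The variables your .env file needs depend on the compose configuration; as a rough illustration only, it might look like the sketch below (the key names here are hypothetical placeholders, not keys this project defines — check docker-compose.yml for the ones actually consumed):

```
# Hypothetical example .env — key names are placeholders,
# not variables defined by this project.
CONTROLLER_PORT=8000            # port the Controller Service listens on
LLAMAFILE_PORT=8080             # port the LlamaFile backend listens on
MODEL_PATH=/models/model.llamafile  # model file mounted into the container
```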

Example Request

Here’s an example request to the Controller Service:

curl -X GET "http://localhost:8000/health" \
     -H "Content-Type: application/json"

Customization

  • LlamaFile Configuration: Update the application in the llamafile/ directory as needed.
  • Controller Logic: Modify the FastAPI code in the controller/ directory to implement custom logic.
  • Docker Compose: Adjust the docker-compose.yml file for your infrastructure requirements.
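
To make the wiring concrete, a docker-compose.yml for this layout might look roughly like the sketch below. This is an illustrative assumption based on the llamafile/ and controller/ directories and the default port 8000 mentioned above, not the repository's actual file; service names, the LlamaFile port, and volume details are guesses:

```yaml
# Illustrative sketch only — consult the repository's docker-compose.yml
# for the real service names, ports, and build contexts.
services:
  llamafile:
    build: ./llamafile        # build context assumed from the llamafile/ directory
    ports:
      - "8080:8080"           # LlamaFile port is an assumption
    env_file: .env

  controller:
    build: ./controller       # build context assumed from the controller/ directory
    ports:
      - "8000:8000"           # Controller Service default noted above
    env_file: .env
    depends_on:
      - llamafile             # start the backend before the controller
```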

Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes.

License

This project is licensed under the MIT License.