From 214274ac3583b5150d6c8e0364f83c679d73e91a Mon Sep 17 00:00:00 2001
From: parkervg
Date: Mon, 13 May 2024 17:31:16 -0400
Subject: [PATCH] documentation

---
 docs/index.md                       | 16 +++++++++-------
 docs/reference/blenders/llamacpp.md | 15 +++++++++++++++
 2 files changed, 24 insertions(+), 7 deletions(-)
 create mode 100644 docs/reference/blenders/llamacpp.md

diff --git a/docs/index.md b/docs/index.md
index 9e06ff4a..a59c283d 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -25,6 +25,15 @@ pip install blendsql
 
 
+### Features
+- Supports many DBMSs 💾
+    - Currently, SQLite and PostgreSQL are functional - more to come!
+- Easily extendable to [multi-modal use cases](reference/examples/vqa-ingredient) 🖼️
+- Smart parsing optimizes what is passed to external functions 🧠
+    - Traverses abstract syntax tree with [sqlglot](https://github.com/tobymao/sqlglot) to minimize LLM function calls 🌳
+- Constrained decoding with [guidance](https://github.com/guidance-ai/guidance) 🚀
+- LLM function caching, built on [diskcache](https://grantjenks.com/docs/diskcache/) 🔑
+
 BlendSQL is a *superset of SQLite* for problem decomposition and hybrid question-answering with LLMs.
 
 As a result, we can *Blend* together...
 
@@ -99,13 +108,6 @@ SELECT date, rival, score, documents.content AS "Team Description" FROM w
 }}
 WHERE rival = 'nsw waratahs'
 ```
-### Features
-- Smart parsing optimizes what is passed to external functions 🧠
-    - Traverses abstract syntax tree with [sqlglot](https://github.com/tobymao/sqlglot) to minimize LLM function calls 🌳
-- LLM function caching, built on [diskcache](https://grantjenks.com/docs/diskcache/) 🔑
-- Constrained decoding with [guidance](https://github.com/guidance-ai/guidance) 🚀
-
-
 For a technical walkthrough of how a BlendSQL query is executed, check out [technical_walkthrough.md](reference/technical_walkthrough.md).
 
 ### Citation
diff --git a/docs/reference/blenders/llamacpp.md b/docs/reference/blenders/llamacpp.md
new file mode 100644
index 00000000..e7966eba
--- /dev/null
+++ b/docs/reference/blenders/llamacpp.md
@@ -0,0 +1,15 @@
+# Llama-Cpp
+
+## LlamaCppLLM
+
+::: blendsql.models.local._llama_cpp.LlamaCppLLM
+    handler: python
+    show_source: false
+
+### Example Usage
+
+```python
+from blendsql.models import LlamaCppLLM
+
+blender = LlamaCppLLM("./tinyllama-1.1b-1t-openorca.Q4_0.gguf")
+```
\ No newline at end of file
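
Note, separate from the patch above: the new llamacpp.md page only shows how to construct a `LlamaCppLLM`, not how the resulting `blender` object is consumed. The sketch below is a minimal, hedged illustration of where it might plug in; the `blend()` keyword names (`query`, `db`, `blender`, `ingredients`), the `SQLite` connector, the `games.db` path, and the query text are assumptions drawn from the surrounding BlendSQL docs, not confirmed by this patch.

```python
# Minimal sketch of how the LlamaCppLLM "blender" from llamacpp.md might be
# used end-to-end. Assumptions (not confirmed by this patch): the blend()
# keyword names, the SQLite connector, the games.db path, and the query text.
from blendsql import blend, LLMMap, LLMQA
from blendsql.db import SQLite
from blendsql.models import LlamaCppLLM

# Local GGUF model, constructed exactly as in the new llamacpp.md example.
blender = LlamaCppLLM("./tinyllama-1.1b-1t-openorca.Q4_0.gguf")

smoothie = blend(
    query="""
        SELECT date, rival, score FROM w
        WHERE {{LLMMap('Is this a rugby team?', 'w::rival')}} = TRUE
    """,
    db=SQLite("games.db"),        # hypothetical local SQLite database
    blender=blender,              # the local model drives all LLM ingredients
    ingredients={LLMMap, LLMQA},  # ingredients referenced in the query
)
print(smoothie.df)                # query results as a DataFrame
```

Per the Features list added in index.md, the AST-based minimization of LLM calls and the diskcache-backed caching would happen inside `blend()` itself; nothing extra is configured in the sketch.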