diff --git a/docs/quickstart/quickstart2_cli.md b/docs/how-to/prompts_lab.md
similarity index 58%
rename from docs/quickstart/quickstart2_cli.md
rename to docs/how-to/prompts_lab.md
index 878becf6..4b6ebc1e 100644
--- a/docs/quickstart/quickstart2_cli.md
+++ b/docs/how-to/prompts_lab.md
@@ -1,18 +1,19 @@
-# Quickstart 2: Working with Prompts from the Command Line
+# How to Manage Prompts with the Prompts Lab GUI
 
-In the [previous chapter](quickstart1_prompts.md), you learned how to define a prompt in Ragbits and how to use it with Large Language Models. In this guide, you will learn how to use the `ragbits` CLI to detect prompts that you have defined in your project and test them with a Large Language Model.
+Prompts Lab is a GUI tool that automatically detects the prompts in your project and lets you interact with them. You can use it to test your prompts with Large Language Models and see how the model responds to different inputs.
 
 !!! note
-    To follow this guide, ensure that you have installed the `ragbits` package and that you are in a directory with Python files that define some ragbits prompts (usually, this would be the root directory of your project) in your command line terminal. You can use code from the [previous chapter](quickstart1_prompts.md).
+    To follow this guide, ensure that you have installed the `ragbits` package and that your terminal is in a directory with Python files that define some ragbits prompts (usually, the root directory of your project). If you haven't defined any prompts yet, you can use the `SongPrompt` example from the [Ragbits Quickstart Guide](../quickstart/quickstart1_prompts.md) and save it in a Python file with a name starting with "prompt_" in your project directory.
 
-## Prompts Lab: GUI for Interacting with Prompts
 
-Prompts Lab is a GUI tool that automatically detects prompts in your project and allows you to interact with them. You can use it to test your prompts with Large Language Models and see how the model responds to different prompts. Start Prompts Lab by running the following command in your terminal:
+## Starting Prompts Lab
+
+Start Prompts Lab by running the following command in your terminal:
 
 ```bash
 ragbits prompts lab
 ```
 
-The tool will open in your default web browser. You will see a list of prompts detected in your project. To view the prompt defined in the previous chapter, select "SongPrompt" from the list.
+The tool will open in your default web browser. You will see a list of prompts detected in your project.
 
 !!! note
     By default, Prompts Lab assumes that prompts are defined in Python files with names starting with "prompt_". If you use a different naming convention, you can specify a different file name pattern using the `--file-pattern` option. For instance, if you want to search for prompts in all Python files in your project, run the following command:
@@ -23,23 +24,21 @@ The tool will open in your default web browser. You will see a list of prompts d
 
     You can also change the default pattern for your entire project by setting the `prompt_path_pattern` configuration option in the `[tool.ragbits]` section of your `pyproject.toml` file.
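+
+    For example, to make Prompts Lab scan every Python file in your project for prompts, the configuration might look like this (a minimal sketch; the glob value is illustrative, so adjust it to your project layout):
+
+    ```toml
+    # pyproject.toml
+    [tool.ragbits]
+    # Scan all Python files in the project for prompt definitions
+    prompt_path_pattern = "**/*.py"
+    ```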
 
-The "Inputs" pane allows you to enter the values for the placeholders in the prompt. For the `SongPrompt` prompt, you can input the subject, age group, and genre of the song:
+## Interacting with Prompts
+
+To work with a specific prompt, select it from the list. The "Inputs" pane allows you to enter the values for the placeholders in the prompt. For the `SongPrompt` example, these would be the subject, age group, and genre of the song:
 
 ![Prompts Lab](./prompts_lab_input.png){style="max-width: 300px; display: block; margin: 0 auto;"}
 
 Then, click "Render prompt" to view the final prompt content, with all placeholders replaced with the values you provided. To check how the Large Language Model responds to the prompt, click "Send to LLM".
 
 !!! note
-    If there is no default LLM configured for your project, Prompts Lab will use OpenAI's gpt-3.5-turbo. Ensure that the OPENAI_API_KEY environment variable is set and contains your OpenAI API key
+    If there is no default LLM configured for your project, Prompts Lab will use OpenAI's gpt-3.5-turbo. Ensure that the `OPENAI_API_KEY` environment variable is set and contains your OpenAI API key.
 
-    Alternatively, you can use your own custom LLM factory (a function that creates an instance of [ragbit's LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option to the `ragbits prompts lab` command.
+    Alternatively, you can use your own custom LLM factory (a function that creates an instance of [Ragbits' LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option with the `ragbits prompts lab` command.
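+
+    For instance, a minimal factory module might look like this (a sketch; the module name, function name, and model choice are all illustrative):
+
+    ```python
+    # llm_factory.py - creates the LLM instance that Prompts Lab should use
+    from ragbits.core.llms.litellm import LiteLLM
+
+    def get_llm() -> LiteLLM:
+        return LiteLLM("gpt-4")
+    ```
+
+    You could then start the tool with `ragbits prompts lab --llm-factory llm_factory:get_llm`. The exact format of the factory path expected by the option may differ from this sketch; check `ragbits prompts lab --help` for details.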
 
-
 ## Conclusion
 
-In this guide, you learned how to use the `ragbits` CLI to interact with prompts that you have defined in your project using the Prompts Lab tool. This tool enables you to test your prompts with Large Language Models and see how the model responds to different prompts.
-## Next Step
-In the next Quickstart guide, you will learn how to use ragbit's Document Search capabilities to retrieve relevant documents for your prompts: [Quickstart 3: Adding RAG Capabilities](quickstart3_rag.md).
\ No newline at end of file
+In this guide, you learned how to use the Prompts Lab tool to interact with the prompts defined in your project: you can render them with your own inputs and test how a Large Language Model responds to them.
\ No newline at end of file
diff --git a/docs/quickstart/prompts_lab_input.png b/docs/how-to/prompts_lab_input.png
similarity index 100%
rename from docs/quickstart/prompts_lab_input.png
rename to docs/how-to/prompts_lab_input.png
diff --git a/docs/quickstart/quickstart1_prompts.md b/docs/quickstart/quickstart1_prompts.md
index b66a13ac..639a09a5 100644
--- a/docs/quickstart/quickstart1_prompts.md
+++ b/docs/quickstart/quickstart1_prompts.md
@@ -14,12 +14,34 @@ class JokePrompt(Prompt):
     """
 ```
 
-In this case, all you had to do was to set the `user_prompt` property to the desired prompt. That's it! This prompt can now be used anytime you want to pass Ragbits a prompt to use.
+In this case, all you had to do was set the `user_prompt` property to the desired prompt. That's it! This prompt can now be used anytime you want to pass a prompt to Ragbits.
 
 Next, we'll learn how to make this prompt more dynamic (e.g., by adding placeholders for user inputs). But first, let's see how to use this prompt with a Large Language Model.
 
-## Passing the Prompt to a Large Language Model
-To use the defined prompt with a Large Language Model, you need to create an instance of the model and pass the prompt to it. For instance:
+## Testing the Prompt from the CLI
+Even at this stage, you can test the prompt using the built-in `ragbits` CLI tool by running the following command in your terminal:
+
+```bash
+uv run ragbits prompts exec path.within.your.project:JokePrompt
+```
+
+Here, `path.within.your.project` is the Python module in which the prompt is defined. In the simplest case, when you are in the same directory as the file, it is the name of the file without the `.py` extension. For example, if the prompt is defined in a file named `joke_prompt.py`, you would run:
+
+```bash
+uv run ragbits prompts exec joke_prompt:JokePrompt
+```
+
+This command sends the prompt to the default Large Language Model (LLM) and displays the generated response in the terminal.
+
+!!! note
+    If there is no default LLM configured for your project, Ragbits will use OpenAI's gpt-3.5-turbo. Ensure that the `OPENAI_API_KEY` environment variable is set and contains your OpenAI API key.
+
+    Alternatively, you can use your own custom LLM factory (a function that creates an instance of [Ragbits' LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option with the `ragbits prompts exec` command.
+
+## Using the Prompt in Python Code
+To use the defined prompt with a Large Language Model in Python, you need to create an instance of the model and pass the prompt to it. For instance:
 
 ```python
 from ragbits.core.llms.litellm import LiteLLM
@@ -29,10 +51,10 @@ response = await llm.generate(prompt)
 print(f"Generated song: {response}")
 ```
 
-In this code snippet, we first created an instance of the `LiteLLM` class and configured it to use the OpenAI's `gpt-4` model. We then generated a response by passing the prompt to the model. As a result, the model will generate a song about Ragbits based on the provided prompt.
+In this code snippet, we first created an instance of the `LiteLLM` class and configured it to use OpenAI's `gpt-4` model. We then generated a response by passing the prompt to the model. As a result, the model will generate a song about Ragbits based on the provided prompt.
 
 ## Making the Prompt Dynamic
-You could make the prompt dynamic by declaring a Pydantic model that serves as the prompt's input schema (i.e., declares the shape of the data that you will be able to use in the prompt). Here's an example:
+You can make the prompt dynamic by declaring a Pydantic model that serves as the prompt's input schema (i.e., declares the shape of the data that you will be able to use in the prompt). Here's an example:
 
 ```python
 from pydantic import BaseModel
@@ -70,10 +92,19 @@ class SongPrompt(Prompt[SongIdea]):
 
 This example illustrates how to set a system prompt and use conditional statements in the prompt.
 
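+## Using the Dynamic Prompt in Python Code
+To use the dynamic prompt with a Large Language Model in Python, instantiate it with an instance of its input model and pass it to the model, just like before. The snippet below is a minimal sketch: it assumes that the prompt class accepts its input model in the constructor and, like the earlier snippet, it is meant to run inside an async context:
+
+```python
+from ragbits.core.llms.litellm import LiteLLM
+
+# Fill the input schema and wrap it in the prompt class defined above
+song_idea = SongIdea(subject="unicorns", age_group=12, genre="pop")
+prompt = SongPrompt(song_idea)
+
+llm = LiteLLM("gpt-4")
+response = await llm.generate(prompt)
+print(f"Generated song: {response}")
+```
+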
+## Testing the Dynamic Prompt from the CLI
+Besides using the dynamic prompt in Python, you can still test it with the `ragbits` CLI tool. The only difference is that you now need to provide the values for the prompt's placeholders in JSON format. Here's an example:
+
+```bash
+uv run ragbits prompts exec joke_prompt:SongPrompt --payload '{"subject": "unicorns", "age_group": 12, "genre": "pop"}'
+```
+
+Remember to change `joke_prompt` to the name of the module where the prompt is defined, and adjust the values of the placeholders to your liking.
+
 ## Conclusion
 
 You now know how to define a prompt in Ragbits and how to use it with Large Language Models. You've also learned to make the prompt dynamic by using Pydantic models and the Jinja2 templating language. To learn more about defining prompts, such as configuring the desired output format, refer to the how-to article [How to define and use Prompts in Ragbits](../how-to/use_prompting.md).
 
 ## Next Step
-In the next Quickstart guide, you will learn how to use the `ragbits` CLI to manage the prompts that you've defined in your project: [Quickstart 2: Working with prompts from the command line](quickstart2_cli.md).
\ No newline at end of file
+In the next Quickstart guide, you will learn how to use Ragbits' Document Search capabilities to retrieve relevant documents for your prompts: [Quickstart 2: Adding RAG Capabilities](quickstart2_rag.md).
\ No newline at end of file
diff --git a/docs/quickstart/quickstart3_rag.md b/docs/quickstart/quickstart2_rag.md
similarity index 99%
rename from docs/quickstart/quickstart3_rag.md
rename to docs/quickstart/quickstart2_rag.md
index dde137af..abab147c 100644
--- a/docs/quickstart/quickstart3_rag.md
+++ b/docs/quickstart/quickstart2_rag.md
@@ -1,4 +1,4 @@
-# Quickstart 3: Adding RAG Capabilities
+# Quickstart 2: Adding RAG Capabilities
 
-In this chapter, we will look at how to use Ragbit's Document Search capabilities to retrieve relevant documents for your prompts. This technique is based on the Retrieve and Generate (RAG) architecture, which allows the LLM to generate responses informed by relevant information from your documents.
+In this chapter, we will look at how to use Ragbits' Document Search capabilities to retrieve relevant documents for your prompts. This technique is based on the Retrieval-Augmented Generation (RAG) architecture, which allows the LLM to generate responses informed by relevant information from your documents.
 
diff --git a/mkdocs.yml b/mkdocs.yml
index b3a8e0b7..29fa2b50 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -8,13 +8,13 @@ nav:
-  - rabgbits: index.md
+  - ragbits: index.md
   - Quick Start:
     - quickstart/quickstart1_prompts.md
-    - quickstart/quickstart2_cli.md
-    - quickstart/quickstart3_rag.md
+    - quickstart/quickstart2_rag.md
   - How-to Guides:
+    - how-to/use_prompting.md
+    - how-to/prompts_lab.md
     - how-to/optimize.md
     - how-to/use_guardrails.md
     - how-to/integrations/promptfoo.md
-    - how-to/use_prompting.md
     - how-to/generate_dataset.md
   - Document Search:
     - how-to/document_search/async_processing.md