Replace chapter 2 with text on RAG
ludwiktrammer committed Dec 3, 2024
1 parent f8a5f9d commit 64b6b26
Showing 5 changed files with 54 additions and 24 deletions.
27 changes: 13 additions & 14 deletions docs/quickstart/quickstart2_cli.md → docs/how-to/prompts_lab.md
@@ -1,18 +1,19 @@
# Quickstart 2: Working with Prompts from the Command Line
# How to Manage Prompts using GUI with Prompts Lab

In the [previous chapter](quickstart1_prompts.md), you learned how to define a prompt in Ragbits and how to use it with Large Language Models. In this guide, you will learn how to use the `ragbits` CLI to detect prompts that you have defined in your project and test them with a Large Language Model.
Prompts Lab is a GUI tool that automatically detects prompts in your project and allows you to interact with them. You can use it to test your prompts with Large Language Models and see how the model responds to different prompts.

!!! note
To follow this guide, ensure that you have installed the `ragbits` package and that you are in a directory with Python files that define some ragbits prompts (usually, this would be the root directory of your project) in your command line terminal. You can use code from the [previous chapter](quickstart1_prompts.md).
To follow this guide, ensure that you have installed the `ragbits` package and that your terminal is in a directory with Python files that define some ragbits prompts (usually, the root directory of your project). If you haven't defined any prompts yet, you can use the `SongPrompt` example from the [Ragbits Quickstart Guide](../quickstart/quickstart1_prompts.md) and save it in a Python file with a name starting with "prompt_" in your project directory.

## Prompts Lab: GUI for Interacting with Prompts
Prompts Lab is a GUI tool that automatically detects prompts in your project and allows you to interact with them. You can use it to test your prompts with Large Language Models and see how the model responds to different prompts. Start Prompts Lab by running the following command in your terminal:
## Starting Prompts Lab

Start Prompts Lab by running the following command in your terminal:

```bash
ragbits prompts lab
```

The tool will open in your default web browser. You will see a list of prompts detected in your project. To view the prompt defined in the previous chapter, select "SongPrompt" from the list.
The tool will open in your default web browser. You will see a list of prompts detected in your project.

!!! note
By default, Prompts Lab assumes that prompts are defined in Python files with names starting with "prompt_". If you use a different naming convention, you can specify a different file name pattern using the `--file-pattern` option. For instance, if you want to search for prompts in all Python files in your project, run the following command:
@@ -23,23 +24,21 @@

You can also change the default pattern for your entire project by setting the `prompt_path_pattern` configuration option in the `[tool.ragbits]` section of your `pyproject.toml` file.
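As an illustration, a `pyproject.toml` entry that makes Ragbits scan every Python file in the project might look like this (the pattern value is an assumed example, not a documented default — adjust it to your layout):

```toml
[tool.ragbits]
prompt_path_pattern = "**/*.py"
```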

The "Inputs" pane allows you to enter the values for the placeholders in the prompt. For the `SongPrompt` prompt, you can input the subject, age group, and genre of the song:
## Interacting with Prompts

To work with a specific prompt, select it from the list. The "Inputs" pane allows you to enter the values for the placeholders in the prompt. For the `SongPrompt` prompt example, this would be the subject, age group, and genre of the song:

![Prompts Lab](./prompts_lab_input.png){style="max-width: 300px; display: block; margin: 0 auto;"}

Then, click "Render prompt" to view the final prompt content, with all placeholders replaced with the values you provided. To check how the Large Language Model responds to the prompt, click "Send to LLM".

!!! note
If there is no default LLM configured for your project, Prompts Lab will use OpenAI's gpt-3.5-turbo. Ensure that the OPENAI_API_KEY environment variable is set and contains your OpenAI API key
If there is no default LLM configured for your project, Prompts Lab will use OpenAI's gpt-3.5-turbo. Ensure that the OPENAI_API_KEY environment variable is set and contains your OpenAI API key.
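If you need to set the key for the current shell session, a minimal sketch (the value is a placeholder — substitute your real key):

```bash
# Placeholder value — replace with your actual OpenAI API key
export OPENAI_API_KEY="sk-your-key-here"
```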

Alternatively, you can use your own custom LLM factory (a function that creates an instance of [ragbit's LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option to the `ragbits prompts lab` command.
Alternatively, you can use your own custom LLM factory (a function that creates an instance of [Ragbits' LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option with the `ragbits prompts lab` command.
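A factory is simply a zero-argument callable that returns a configured LLM instance. The sketch below uses a stand-in class purely for illustration — in a real project you would return an instance of Ragbits' actual LLM class (such as `LiteLLM`), and the `module:function` path format is an assumption modeled on how prompts themselves are addressed:

```python
class StandInLLM:
    """Stand-in for an LLM class — illustration only, not part of Ragbits."""

    def __init__(self, model_name: str):
        self.model_name = model_name


def my_llm_factory() -> StandInLLM:
    # Zero-argument factory: builds and returns a configured LLM instance.
    # Hypothetical invocation: ragbits prompts lab --llm-factory my_module:my_llm_factory
    return StandInLLM("gpt-3.5-turbo")
```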

<!-- TODO: link to the how-to on configuring default LLMs in pyproject.toml -->


## Conclusion
<!-- TODO: Add a link to the how-to article on using `ragbits prompts exec` -->
In this guide, you learned how to use the `ragbits` CLI to interact with prompts that you have defined in your project using the Prompts Lab tool. This tool enables you to test your prompts with Large Language Models and see how the model responds to different prompts.

## Next Step
In the next Quickstart guide, you will learn how to use ragbit's Document Search capabilities to retrieve relevant documents for your prompts: [Quickstart 3: Adding RAG Capabilities](quickstart3_rag.md).
File renamed without changes
43 changes: 37 additions & 6 deletions docs/quickstart/quickstart1_prompts.md
@@ -14,12 +14,34 @@
```python
class JokePrompt(Prompt):
    user_prompt = """
    …
    """
```

In this case, all you had to do was to set the `user_prompt` property to the desired prompt. That's it! This prompt can now be used anytime you want to pass Ragbits a prompt to use.
In this case, all you had to do was set the `user_prompt` property to the desired prompt. That's it! This prompt can now be used anytime you want to pass a prompt to Ragbits.

Next, we'll learn how to make this prompt more dynamic (e.g., by adding placeholders for user inputs). But first, let's see how to use this prompt with a Large Language Model.

## Passing the Prompt to a Large Language Model
To use the defined prompt with a Large Language Model, you need to create an instance of the model and pass the prompt to it. For instance:
## Testing the Prompt from the CLI
Even at this stage, you can test the prompt using the built-in `ragbits` CLI tool. To do this, you need to run the following command in your terminal:

```bash
uv run ragbits prompts exec path.within.your.project:JokePrompt
```

Here, `path.within.your.project` is the dotted path to the Python module where the prompt is defined. In the simplest case, when you run the command from the same directory as the file, this is just the file name without the `.py` extension. For example, if the prompt is defined in a file named `joke_prompt.py`, you would run:

```bash
uv run ragbits prompts exec joke_prompt:JokePrompt
```

This command will send the prompt to the default Large Language Model (LLM) and display the generated response in the terminal.

!!! note
If there is no default LLM configured for your project, Ragbits will use OpenAI's gpt-3.5-turbo. Ensure that the `OPENAI_API_KEY` environment variable is set and contains your OpenAI API key.

Alternatively, you can use your custom LLM factory (a function that creates an instance of [Ragbits' LLM class][ragbits.core.llms.LLM]) by specifying the path to the factory function using the `--llm-factory` option with the `ragbits prompts exec` command.

<!-- TODO: link to the how-to on configuring default LLMs in pyproject.toml -->

## Using the Prompt in Python Code
To use the defined prompt with a Large Language Model in Python, you need to create an instance of the model and pass the prompt to it. For instance:

```python
from ragbits.core.llms.litellm import LiteLLM
# …
response = await llm.generate(prompt)
print(f"Generated song: {response}")
```

In this code snippet, we first created an instance of the `LiteLLM` class and configured it to use the OpenAI's `gpt-4` model. We then generated a response by passing the prompt to the model. As a result, the model will generate a song about Ragbits based on the provided prompt.
In this code snippet, we first created an instance of the `LiteLLM` class and configured it to use OpenAI's `gpt-4` model. We then generated a response by passing the prompt to the model. As a result, the model will generate a song about Ragbits based on the provided prompt.

## Making the Prompt Dynamic
You could make the prompt dynamic by declaring a Pydantic model that serves as the prompt's input schema (i.e., declares the shape of the data that you will be able to use in the prompt). Here's an example:
You can make the prompt dynamic by declaring a Pydantic model that serves as the prompt's input schema (i.e., declares the shape of the data that you will be able to use in the prompt). Here's an example:

```python
from pydantic import BaseModel
# …

class SongPrompt(Prompt[SongIdea]):
    # …
```

This example illustrates how to set a system prompt and use conditional statements in the prompt.
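Since Ragbits prompts use the Jinja2 templating language, the conditional logic can be illustrated in isolation with the `jinja2` package directly (a standalone sketch, not Ragbits' rendering pipeline; the template text here is invented for illustration):

```python
from jinja2 import Template

# A conditional prompt fragment in Jinja2 syntax, rendered standalone.
template = Template(
    "Write a {{ genre }} song about {{ subject }}."
    "{% if age_group < 18 %} Keep the lyrics kid-friendly.{% endif %}"
)

print(template.render(subject="unicorns", genre="pop", age_group=12))
```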

## Testing the Dynamic Prompt in CLI
Besides using the dynamic prompt in Python, you can still test it using the `ragbits` CLI tool. The only difference is that now you need to provide the values for the placeholders in the prompt in JSON format. Here's an example:

```bash
uv run ragbits prompts exec joke_prompt:SongPrompt --payload '{"subject": "unicorns", "age_group": 12, "genre": "pop"}'
```

Remember to change `joke_prompt` to the name of the module where the prompt is defined and adjust the values of the placeholders to your liking.
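Hand-writing JSON inside shell quotes is error-prone; one way to generate the `--payload` string safely is with Python's standard `json` module (the field names must match your prompt's input model — `SongIdea` in this guide):

```python
import json

# Values must match the prompt's input model fields (subject, age_group, genre).
payload = {"subject": "unicorns", "age_group": 12, "genre": "pop"}
payload_json = json.dumps(payload)
print(payload_json)  # paste this after --payload, wrapped in single quotes
```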

## Conclusion
You now know how to define a prompt in Ragbits and how to use it with Large Language Models. You've also learned to make the prompt dynamic by using Pydantic models and the Jinja2 templating language. To learn more about defining prompts, such as configuring the desired output format, refer to the how-to article [How to define and use Prompts in Ragbits](../how-to/use_prompting.md).

<!-- TODO: Add a link to the how-to articles on using images in prompts and on defining custom prompt sources -->

## Next Step
In the next Quickstart guide, you will learn how to use the `ragbits` CLI to manage the prompts that you've defined in your project: [Quickstart 2: Working with prompts from the command line](quickstart2_cli.md).
In the next Quickstart guide, you will learn how to use Ragbits' Document Search capabilities to retrieve relevant documents for your prompts: [Quickstart 2: Adding RAG Capabilities](quickstart2_rag.md).
@@ -1,4 +1,4 @@
# Quickstart 3: Adding RAG Capabilities
# Quickstart 2: Adding RAG Capabilities

In this chapter, we will look at how to use Ragbits' Document Search capabilities to retrieve relevant documents for your prompts. This technique is based on the Retrieval-Augmented Generation (RAG) architecture, which allows the LLM to generate responses informed by relevant information from your documents.

Expand Down
6 changes: 3 additions & 3 deletions mkdocs.yml
@@ -8,13 +8,13 @@ nav:
- ragbits: index.md
- Quick Start:
- quickstart/quickstart1_prompts.md
- quickstart/quickstart2_cli.md
- quickstart/quickstart3_rag.md
- quickstart/quickstart2_rag.md
- How-to Guides:
- how-to/use_prompting.md
- how-to/prompts_lab.md
- how-to/optimize.md
- how-to/use_guardrails.md
- how-to/integrations/promptfoo.md
- how-to/use_prompting.md
- how-to/generate_dataset.md
- Document Search:
- how-to/document_search/async_processing.md