docs(quickstart): fix errors when running Quickstart 1
ludwiktrammer committed Dec 4, 2024
1 parent 02218f0 · commit d5718c6
Showing 2 changed files with 14 additions and 4 deletions.
docs/quickstart/quickstart1_prompts.md: 13 additions & 3 deletions
@@ -2,6 +2,16 @@

In this Quickstart guide, you will learn how to define a dynamic prompt in Ragbits and how to use such a prompt with Large Language Models.

+## Installing Ragbits
+
+To install Ragbits, run the following command in your terminal:
+
+```bash
+pip install ragbits[litellm]
+```
+
+This command will install all the popular Ragbits packages, along with [LiteLLM](https://docs.litellm.ai/docs/), which we will use in this guide for communicating with LLM APIs.
+
## Defining a Static Prompt
The most standard way to define a prompt in Ragbits is to create a class that inherits from the `Prompt` class and configure it by setting values for appropriate properties. Here is an example of a simple prompt that asks the model to write a song about Ragbits:
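
The example itself is collapsed in this diff. A minimal sketch of what such a class might look like follows; the `ragbits.core.prompt` import path and the bare `Prompt` base class are assumptions based on the package names above, not lines from this commit:

```python
# Hedged sketch of a static Ragbits prompt; verify the import path
# against the quickstart itself.
from ragbits.core.prompt import Prompt


class SongPrompt(Prompt):
    # A static prompt: a fixed instruction with no placeholders.
    user_prompt = "Write a short song about a Python library called Ragbits."
```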

@@ -22,13 +32,13 @@ Next, we'll learn how to make this prompt more dynamic (e.g., by adding placehol
Even at this stage, you can test the prompt using the built-in `ragbits` CLI tool. To do this, you need to run the following command in your terminal:

```bash
-uv run ragbits prompts exec path.within.your.project:SongPrompt
+ragbits prompts exec path.within.your.project:SongPrompt
```

Where `path.within.your.project` is the path to the Python module where the prompt is defined. In the simplest case, when you are in the same directory as the file, it will be the name of the file without the `.py` extension. For example, if the prompt is defined in a file named `song_prompt.py`, you would run:

```bash
-uv run ragbits prompts exec song_prompt:SongPrompt
+ragbits prompts exec song_prompt:SongPrompt
```

This command will send the prompt to the default Large Language Model and display the generated response in the terminal.
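
The `--payload` example further down implies an input model with `subject`, `age_group`, and `genre` fields. The collapsed dynamic-prompt example might look roughly like the sketch below; the Pydantic input model, the `Prompt[SongIdea]` generic, and the Jinja-style templates are assumptions, not lines from this commit:

```python
# Hedged sketch of the dynamic prompt that the CLI payload below refers to.
from pydantic import BaseModel

from ragbits.core.prompt import Prompt


class SongIdea(BaseModel):
    subject: str
    age_group: int
    genre: str


class SongPrompt(Prompt[SongIdea]):
    # System prompt with a conditional, matching the hunk context below
    # ("set a system prompt and use conditional statements").
    system_prompt = """
    You are a professional songwriter.
    {% if age_group < 18 %}Keep the lyrics appropriate for children.{% endif %}
    """

    user_prompt = "Write a {{ genre }} song about {{ subject }} for listeners around {{ age_group }} years old."
```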
@@ -96,7 +106,7 @@ This example illustrates how to set a system prompt and use conditional statemen
Besides using the dynamic prompt in Python, you can still test it using the `ragbits` CLI tool. The only difference is that now you need to provide the values for the placeholders in the prompt in JSON format. Here's an example:

```bash
-uv run ragbits prompts exec song_prompt:SongPrompt --payload '{"subject": "unicorns", "age_group": 12, "genre": "pop"}'
+ragbits prompts exec song_prompt:SongPrompt --payload '{"subject": "unicorns", "age_group": 12, "genre": "pop"}'
```

Remember to change `song_prompt` to the name of the module where the prompt is defined and adjust the values of the placeholders to your liking.
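
The part of the quickstart that calls the prompt from Python is also collapsed here. A hedged sketch of that flow, assuming Ragbits exposes a `LiteLLM` client class with an async `generate` method (the class name, module path, and model name are assumptions based on the `ragbits[litellm]` extra installed above):

```python
# Hedged sketch: sending the dynamic prompt to an LLM through Ragbits'
# LiteLLM integration. Import path and model name are illustrative only.
import asyncio

from ragbits.core.llms.litellm import LiteLLM


async def main() -> None:
    llm = LiteLLM("gpt-4o-mini")
    prompt = SongPrompt(SongIdea(subject="unicorns", age_group=12, genre="pop"))
    print(await llm.generate(prompt))


asyncio.run(main())
```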
Second changed file: 1 addition & 1 deletion
@@ -181,7 +181,7 @@ async def _get_litellm_response(
    options: LiteLLMOptions,
    response_format: type[BaseModel] | dict | None,
    stream: bool = False,
-) -> ModelResponse | CustomStreamWrapper:
+) -> "ModelResponse" | "CustomStreamWrapper":
    try:
        response = await litellm.acompletion(
            messages=conversation,
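
The one-line change above quotes the litellm types in the return annotation, turning them into forward references. That is the usual pattern when a dependency's types are imported only for static analysis, so the module still imports when the names are unavailable at runtime. A generic sketch of the pattern (assumed context, not the actual ragbits module; here the whole union is quoted so the annotation is never evaluated, which works even without `from __future__ import annotations`):

```python
# Hedged sketch of the forward-reference pattern behind the fix.
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only for type checkers; at runtime these names may not exist.
    from litellm import CustomStreamWrapper, ModelResponse


async def get_completion(stream: bool = False) -> "ModelResponse | CustomStreamWrapper":
    # The quoted annotation stays a plain string at runtime, so Python never
    # looks up ModelResponse or CustomStreamWrapper when the module loads.
    ...
```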
