
Restructure/reword docs #42

Merged
merged 4 commits on Dec 18, 2023
2 changes: 1 addition & 1 deletion .github/scripts/report_nightly_build_failure.py
Original file line number Diff line number Diff line change
@@ -12,7 +12,7 @@
response = requests.post(
os.environ["SLACK_WEBHOOK_URL"],
json={
"text": "A Nightly build failed. See https://github.com/tomusher/wagtail-ai/actions/runs/"
"text": "A Nightly build failed. See https://github.com/wagtail/wagtail-ai/actions/runs/"
+ os.environ["GITHUB_RUN_ID"],
},
)
16 changes: 16 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,16 @@
version: 2
build:
os: ubuntu-22.04
tools:
python: '3.11'

mkdocs:
configuration: mkdocs.yml

# Dependencies required to build your docs
python:
install:
- method: pip
path: .
extra_requirements:
- docs
91 changes: 7 additions & 84 deletions README.md
@@ -6,7 +6,7 @@ Get help with your content using AI superpowers.

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![PyPI version](https://badge.fury.io/py/wagtail-ai.svg)](https://badge.fury.io/py/wagtail-ai)
[![ai CI](https://github.com/tomusher/wagtail-ai/actions/workflows/test.yml/badge.svg)](https://github.com/tomusher/wagtail-ai/actions/workflows/test.yml)
[![ai CI](https://github.com/wagtail/wagtail-ai/actions/workflows/test.yml/badge.svg)](https://github.com/wagtail/wagtail-ai/actions/workflows/test.yml)

Wagtail AI integrates Wagtail with AI's APIs (think ChatGPT) to help you write and correct your content.

@@ -26,91 +26,14 @@ You'll need a paid OpenAI or Anthropic account and an API key. There'll also be
* \+ (1,000 * 1.3) tokens received from the API
* = 2,645 tokens = $0.0053
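
The arithmetic above can be sketched as follows. The 1,345 prompt-token figure is derived by subtracting the response tokens from the 2,645 total, and the $0.002 per 1K tokens price is an assumption based on gpt-3.5-turbo pricing at the time; treat both as illustrative, not authoritative:

```python
# Rough per-request cost estimate (illustrative; prices change over time).
PRICE_PER_1K_TOKENS = 0.002  # assumed gpt-3.5-turbo rate in USD

prompt_tokens = 1345               # derived: 2,645 total minus the response tokens
response_tokens = int(1000 * 1.3)  # tokens received from the API, per the estimate above

total_tokens = prompt_tokens + response_tokens
cost_usd = total_tokens / 1000 * PRICE_PER_1K_TOKENS
```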

## The Future

Wagtail AI is very new. Here's some things we'd like to do:

* [ ] Streaming support - the API supports server-sent events, we could do the same
* [ ] A nice UI - it's a bit rough right now
* [ ] Reduce bundle size
* [ ] Internationalisation on text and support for different language prompts
* [ ] Find a better way to hook in to Draftail to do things like show progress bars/spinners.
* [ ] Add more AI behaviours and features - content recommendations, content based Q&A tools, better ways to direct the prompt.
* [ ] Tests!

If you're interested in working on these things, please do!

## Links

- [Documentation](https://github.com/tomusher/wagtail-ai/blob/main/README.md)
- [Changelog](https://github.com/tomusher/wagtail-ai/blob/main/CHANGELOG.md)
- [Contributing](https://github.com/tomusher/wagtail-ai/blob/main/CHANGELOG.md)
- [Discussions](https://github.com/tomusher/wagtail-ai/discussions)
- [Security](https://github.com/tomusher/wagtail-ai/security)
- [Documentation](https://github.com/wagtail/wagtail-ai/blob/main/docs/index.md)
- [Changelog](https://github.com/wagtail/wagtail-ai/blob/main/CHANGELOG.md)
- [Contributing](https://github.com/wagtail/wagtail-ai/blob/main/docs/contributing.md)
- [Discussions](https://github.com/wagtail/wagtail-ai/discussions)
- [Security](https://github.com/wagtail/wagtail-ai/security)

## Supported Versions

* Wagtail 4.0, 4.1, 4.2, 5.0, 5.2

## Contributing

### Install

To make changes to this project, first clone this repository:

```sh
git clone https://github.com/tomusher/wagtail-ai.git
cd wagtail-ai
```

With your preferred virtualenv activated, install testing dependencies:

#### Compile front-end assets

```sh
nvm use
npm install
npm run build
```

#### Using pip

```sh
python -m pip install --upgrade pip>=21.3
python -m pip install -e .[testing] -U
```

#### Using flit

```sh
python -m pip install flit
flit install
```

### pre-commit

Note that this project uses [pre-commit](https://github.com/pre-commit/pre-commit).
It is included in the project testing requirements. To set up locally:

```shell
# go to the project directory
$ cd wagtail-ai
# initialize pre-commit
$ pre-commit install

# Optional, run all checks once for this, then the checks will run only on the changed files
$ git ls-files --others --cached --exclude-standard | xargs pre-commit run --files
```

### How to run tests

Now you can run tests as shown below:

```sh
tox
```

or, you can run them for a specific environment `tox -e python3.11-django4.2-wagtail5.2` or specific test
`tox -e python3.11-django4.2-wagtail5.2-sqlite wagtail-ai.tests.test_file.TestClass.test_method`

To run the test app interactively, use `tox -e interactive`, visit `http://127.0.0.1:8020/admin/` and log in with `admin`/`changeme`.
* Wagtail 5.2
3 changes: 1 addition & 2 deletions docs/.pages
@@ -1,6 +1,5 @@
nav:
- installation.md
- editor-integration.md
- "Backends":
- llm-backend.md
- ai-backends.md
- text-splitting.md
109 changes: 109 additions & 0 deletions docs/ai-backends.md
@@ -0,0 +1,109 @@
# AI Backends

Wagtail AI can be configured to use different backends to support different AI services.

Currently the only (and default) backend available in Wagtail AI is the [LLM Backend](#llm-backend).

## LLM Backend

This backend uses the [llm library](https://llm.datasette.io/en/stable/) which offers support for many AI services through plugins.

By default, it is configured to use OpenAI's `gpt-3.5-turbo` model.

### Using other models

You can use the command line interface to see the llm models installed in your environment:

```sh
llm models
```

Then you can swap `MODEL_ID` in the configuration to use a different model. For example, to use GPT-4:

```python
WAGTAIL_AI = {
"BACKENDS": {
"default": {
"CLASS": "wagtail_ai.ai.llm.LLMBackend",
"CONFIG": {
"MODEL_ID": "gpt-4",
},
}
}
}
```

!!! info

The `llm` package comes with OpenAI models installed by default.

You can install other models using [`llm`'s plugin functionality](https://llm.datasette.io/en/stable/plugins/index.html).

### Customisations

There are two settings that you can use with the LLM backend:

- `INIT_KWARGS`
- `PROMPT_KWARGS`

#### `INIT_KWARGS`

These are passed to `llm` as ["Model Options"](https://llm.datasette.io/en/stable/python-api.html#model-options). You can use them to customise the model's initialisation.

For example, for OpenAI models you can set a custom API key. By default the `openai` library will use the value of the `OPENAI_API_KEY` environment variable.

```python
WAGTAIL_AI = {
"BACKENDS": {
"default": {
"CLASS": "wagtail_ai.ai.llm.LLMBackend",
"CONFIG": {
"MODEL_ID": "gpt-3.5-turbo", # Model ID recognizable by the llm package.
"INIT_KWARGS": {"key": "your-custom-api-key"},
},
}
}
}
```

#### `PROMPT_KWARGS`

Using `PROMPT_KWARGS` you can pass arguments to [`llm`'s `prompt` method](https://llm.datasette.io/en/stable/python-api.html#system-prompts), e.g. a system prompt that is passed with every request.

```python
WAGTAIL_AI = {
"BACKENDS": {
"default": {
"CLASS": "wagtail_ai.ai.llm.LLMBackend",
"CONFIG": {
"MODEL_ID": "gpt-3.5-turbo", # Model ID recognizable by the llm package.
"PROMPT_KWARGS": {"system": "A custom, global system prompt."},
},
}
}
}
```

#### Specify the token limit for a model

!!! info

The token limit is also referred to as the "context window": the maximum number of tokens a specific chat model supports in a single context.

While Wagtail AI knows the token limit of some models (see [`tokens.py`](https://github.com/wagtail/wagtail-ai/blob/main/src/wagtail_ai/tokens.py)), you might choose to use a model that isn't in this mapping, or you might want to set a lower token limit for an existing model.

You can do this by setting `TOKEN_LIMIT`.

```python
WAGTAIL_AI = {
"BACKENDS": {
"default": {
"CLASS": "wagtail_ai.ai.llm.LLMBackend",
"CONFIG": {
"MODEL_ID": "gpt-3.5-turbo",
"TOKEN_LIMIT": 4096,
},
}
}
}
```
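
To illustrate why the limit matters, here is a naive word-count heuristic for checking whether text fits in a model's context window. This is a rough sketch and assumption (roughly 1.3 tokens per English word), not wagtail-ai's actual tokeniser:

```python
def fits_in_context(text: str, token_limit: int) -> bool:
    # Very rough heuristic: ~1.3 tokens per word for English text.
    # Real token counts depend on the model's tokeniser.
    estimated_tokens = int(len(text.split()) * 1.3)
    return estimated_tokens <= token_limit
```

Text that exceeds the limit needs to be split before it is sent to the model; see the text-splitting documentation for how Wagtail AI handles this.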
13 changes: 8 additions & 5 deletions docs/editor-integration.md
@@ -1,14 +1,17 @@
# Editor Integration

Wagtail AI integrates with Wagtail's Draftail rich text editor to provide tools to help write content.
Wagtail AI integrates with Wagtail's Draftail rich text editor to provide tools to help write content. To use it, highlight some text and click the 'magic wand' icon in the toolbar.

By default, it includes tools to:
By default, it includes prompts that:

* Run AI assisted spelling/grammar checks on your content
* Generate additional content based on what you're writing

You can also define your own prompts:

### Adding Your Own Prompts

Explore the `AI Prompts` settings, accessible via the Wagtail settings menu. Here you'll be able to view, edit and add new prompts.
You can add your own prompts and customise existing prompts from the Wagtail admin under Settings -> Prompts.

When creating prompts, you can provide a label and description to help describe the prompt to your editors, specify the full prompt that will be passed along with your text to the AI, and choose a 'method', which can be one of:

- 'Append after existing content' - keep your existing content intact and add the response from the AI to the end (useful for completions/suggestions).
- 'Replace content' - replace the content in the editor with the response from the AI (useful for corrections, rewrites and translations).
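
The two methods above behave roughly like this hypothetical helper (an illustrative sketch, not wagtail-ai's actual implementation):

```python
def apply_method(existing: str, response: str, method: str) -> str:
    """Illustrate the two built-in prompt methods (hypothetical helper)."""
    if method == "append":
        # Keep the existing content intact and add the AI response to the end.
        return existing + response
    if method == "replace":
        # Discard the existing content in favour of the AI response.
        return response
    raise ValueError(f"unknown method: {method}")
```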
59 changes: 12 additions & 47 deletions docs/installation.md
@@ -1,13 +1,17 @@
# Installation

At this moment in time the only backend that ships by default with wagtail-ai is [llm](https://llm.datasette.io/en/stable/)
that lets you use a number of different chat models, including OpenAI's.

1. Install the package along with the relevant client libraries for the AI Backend you want to use:
- For [llm](https://llm.datasette.io/en/stable/) which includes OpenAI chat models,
`python -m pip install wagtail-ai[llm]`
1. Install the package along with the relevant client libraries for the default [AI Backend](ai-backends.md):
```bash
python -m pip install wagtail-ai[llm]
```
2. Add `wagtail_ai` to your `INSTALLED_APPS`
3. Add an AI chat model and backend configuration (any model supported by [llm](https://llm.datasette.io/en/stable/)).
```python
INSTALLED_APPS = [
"wagtail_ai",
# ...
]
```
3. Add an AI chat model and backend configuration (by default, `MODEL_ID` can be any model supported by [llm](https://llm.datasette.io/en/stable/)).
```python
WAGTAIL_AI = {
"BACKENDS": {
@@ -20,43 +20,4 @@
}
}
```

The openai package can be provided with the API key via the `OPENAI_API_KEY`
environment variable. If you want to provide a custom API key for
each backend please read the llm backend's documentation page.

Read more about the [llm backend here](llm-backend.md).


## Specify the token limit for a backend

!!! info

Token limit is referred to as "context window" which is the maximum amount
of tokens in a single context that a specific chat model supports.

If you want to use a chat model that does not have a default token limit configured
or want to change the default token limit, you can do so by adding the `TOKEN_LIMIT`
setting.

```python
WAGTAIL_AI = {
"BACKENDS": {
"default": {
"CLASS": "wagtail_ai.ai.llm.LLMBackend",
"CONFIG": {
"MODEL_ID": "gpt-3.5-turbo",
"TOKEN_LIMIT": 4096,
},
}
}
}
```

This `TOKEN_LIMIT` value depend on the chat model you select as each of them support
a different token limit, e.g. `gpt-3.5-turbo` supports up to 4096 tokens,
`gpt-3.5-turbo-16k` supports up to 16384 tokens.

!!! info "Text splitting"

[Read more about text splitting and Wagtail AI customization options here](text-splitting.md).
4. If you're using an OpenAI model, specify an API key using the `OPENAI_API_KEY` environment variable, or by setting it as a key in [`INIT_KWARGS`](ai-backends.md#init-kwargs).