forefront possibility of local model in docs (closes #63)
simonpcouch committed Nov 21, 2024
1 parent ae27a02 commit 83bcccb
Showing 3 changed files with 11 additions and 2 deletions.
2 changes: 1 addition & 1 deletion README.Rmd
@@ -31,7 +31,7 @@ You can install pal like so:
pak::pak("simonpcouch/pal")
```

- Then, ensure that you have an [`ANTHROPIC_API_KEY`](https://console.anthropic.com/) environment variable set, and you're ready to go. If you'd like to use an LLM other than Anthropic's Claude 3.5 Sonnet—like OpenAI's ChatGPT—to power the pal, see the [Getting started with pal](https://simonpcouch.github.io/pal/articles/pal.html) vignette.
+ Then, ensure that you have an [`ANTHROPIC_API_KEY`](https://console.anthropic.com/) environment variable set, and you're ready to go. If you'd like to use an LLM other than Anthropic's Claude 3.5 Sonnet—like OpenAI's ChatGPT or a local ollama model—to power the pal, see the [Getting started with pal](https://simonpcouch.github.io/pal/articles/pal.html) vignette.
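For reference, a minimal sketch of setting that environment variable from R (the key value below is a placeholder, and using usethis here is one convenient option rather than something pal requires):

```r
# Store the key in your user-level .Renviron so it is available in every
# session; usethis opens that file for editing:
usethis::edit_r_environ()

# Then add a line like the following (placeholder value) and restart R:
# ANTHROPIC_API_KEY=your-key-here

# Or set it for the current session only:
Sys.setenv(ANTHROPIC_API_KEY = "your-key-here")
```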

Pals are interfaced with via the pal addin. For easiest access, we recommend registering the pal addin to a keyboard shortcut.

2 changes: 1 addition & 1 deletion README.md
@@ -29,7 +29,7 @@ pak::pak("simonpcouch/pal")
Then, ensure that you have an
[`ANTHROPIC_API_KEY`](https://console.anthropic.com/) environment
variable set, and you’re ready to go. If you’d like to use an LLM other
- than Anthropic’s Claude 3.5 Sonnet—like OpenAI’s ChatGPT—to power the
+ than Anthropic’s Claude 3.5 Sonnet—like OpenAI’s ChatGPT or a local ollama model—to power the
pal, see the [Getting started with
pal](https://simonpcouch.github.io/pal/articles/pal.html) vignette.

9 changes: 9 additions & 0 deletions vignettes/pal.Rmd
@@ -47,6 +47,15 @@ options(
)
```

To use a local model with ollama, you might write:

```r
options(
  .pal_fn = "chat_ollama",
  .pal_args = list(model = "qwen2.5-coder:14b")
)
```
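Note that this assumes ollama is installed and serving locally and that the named model has already been downloaded, for example with `ollama pull qwen2.5-coder:14b` at the command line.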

You'll probably want pal to always use whichever model you're configuring with this option. To make this selection persist across sessions, add that `options()` code to your `.Rprofile`. You might use `usethis::edit_r_profile()` to open the file. After making those changes and restarting R, your pal will use the new model.
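As a sketch of what that persistence step might look like (reusing the ollama configuration above; the same pattern applies to any provider):

```r
# Open your user-level .Rprofile for editing:
usethis::edit_r_profile()

# Then paste in the options() call you want every session to pick up, e.g.:
options(
  .pal_fn = "chat_ollama",
  .pal_args = list(model = "qwen2.5-coder:14b")
)
```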

## The pal addin
