
Feat/llm responses #376 (Merged)

Merged 234 commits into dev on Feb 18, 2025
Conversation

@NotBioWaste905 (Collaborator) commented Jul 24, 2024

Description

Added functionality for calling LLMs via the langchain API, for use in responses and conditions.

Checklist

  • I have performed a self-review of the changes


To Consider

  • Add tests
  • Update API reference / tutorials / guides
  • Update CONTRIBUTING.md

@NotBioWaste905 NotBioWaste905 requested a review from RLKRo July 24, 2024 12:22
@github-actions bot left a comment

It appears this PR is a release PR (change its base from master if that is not the case).

Here's a release checklist:

  • Update package version
  • Update poetry.lock
  • Change PR merge option
  • Update template repo
  • Search for objects to be deprecated

@NotBioWaste905 NotBioWaste905 changed the base branch from master to dev July 24, 2024 12:22
@RLKRo (Member) commented Aug 8, 2024

I got an idea for more complex prompts: we can allow passing responses as prompts instead of just strings.

And then it'd be possible to incorporate slots into a prompt:

model = LLM_API(
    prompt=rsp.slots.FilledTemplate(
        "You are an experienced barista in a local coffee shop. "
        "Answer your customers' questions about coffee and barista work.\n"
        "Customer data:\nAge: {person.age}\nGender: {person.gender}\nFavorite drink: {person.habits.drink}"
    )
)

@RLKRo RLKRo merged commit c194747 into dev Feb 18, 2025
17 checks passed
@RLKRo RLKRo mentioned this pull request Feb 18, 2025
RLKRo added a commit that referenced this pull request Feb 18, 2025
# Changelog

## Breaking Changes

- Dropped support for Python 3.8; added support for Python 3.12 (#400);
- Reworked the DB architecture to support partial turn reads/writes (#93).
  Old context storages are incompatible with the new ones.
  See tutorial Context Storages: 8 for more info;
- `Context.labels`, `Context.requests`, `Context.responses` are now only loaded lazily (#93).
  Items from older turns can be loaded on demand.
  Their `__getitem__` and `get` methods are now async.
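Since `__getitem__` and `get` become async, call sites have to await them. A minimal sketch of what this access pattern could look like, using a toy `LazyTurns` class (all names here are illustrative stand-ins, not the actual chatsky API):

```python
# Hypothetical sketch: turn data is fetched on demand, so item access
# is awaited. LazyTurns is a toy stand-in for the real lazy container.
import asyncio


class LazyTurns:
    """Minimal mock of a lazily loaded turn-id -> value mapping."""

    def __init__(self, storage):
        self._storage = storage  # a DB-backed store in the real library

    async def __getitem__(self, turn_id):
        await asyncio.sleep(0)  # simulate an asynchronous storage read
        return self._storage[turn_id]

    async def get(self, turn_id, default=None):
        await asyncio.sleep(0)
        return self._storage.get(turn_id, default)


async def main():
    requests = LazyTurns({1: "hello", 2: "what coffee do you have?"})
    # What used to be a plain requests[2] lookup is now awaited:
    latest = await requests[2]
    missing = await requests.get(99, "<no such turn>")
    return latest, missing


latest, missing = asyncio.run(main())
print(latest)   # what coffee do you have?
print(missing)  # <no such turn>
```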


## Features 

- Added `LLMResponse` and `LLMCondition` classes that allow using LLMs (#376).
  See the new LLM Integration tutorials and the LLM user guide for more info;
- Added an option to extract group slots partially (#394).
  See tutorial Slots: 2 for more information;
- `Message.original_message` is replaced with `Message.origin`, which stores both the original message and the interface from which the message originated (#398);
- Added the `Context.current_turn_id` field, which stores the number of the current turn (#93);
- Added `Context.created_at` and `Context.updated_at` timestamp fields (#93);
- Added the `Context.turns` property, which allows iterating over requests/labels/responses by their turn ids (#93);
- `Context.labels`, `Context.requests`, `Context.responses` now support slicing (#93).
  The `__getitem__`, `__setitem__` and `__delitem__` methods can now accept slices of turn ids in addition to a single turn id.
  The `get` method can now accept an iterable of turn ids in addition to a single turn id.
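The slicing behavior described above can be sketched with a toy container keyed by turn ids. This is an illustrative mock, not the actual chatsky implementation; the `Turns` class and its semantics are assumptions for demonstration only:

```python
# Toy container keyed by turn ids, accepting a single id, a slice of
# ids, or (for get) an iterable of ids. Explicit slice start/stop are
# assumed in this sketch.
class Turns:
    def __init__(self, items):
        self._items = dict(items)  # turn_id -> value

    def __getitem__(self, key):
        if isinstance(key, slice):
            step = key.step or 1
            return [self._items[i]
                    for i in range(key.start, key.stop, step)
                    if i in self._items]
        return self._items[key]

    def get(self, key, default=None):
        # Accept either a single turn id or an iterable of ids.
        if hasattr(key, "__iter__"):
            return [self._items.get(i, default) for i in key]
        return self._items.get(key, default)


labels = Turns({1: "greet", 2: "ask_drink", 3: "confirm"})
print(labels[2])            # ask_drink
print(labels[1:3])          # ['greet', 'ask_drink']
print(labels.get([2, 5]))   # ['ask_drink', None]
```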


## Documentation

- Documentation is now versioned (#346, #409).
  You can select the preferred version via the drop-down menu in the top-right corner.


## Developer changes

- Context now has a field `origin_interface` that stores the name of the interface that created it (#398);
- Added the `docs_no_docker` script for building documentation without Docker (ef11ff9);
- Added an in-RAM context storage as the default instead of a plain dict (#93);
- Removed the methods `Context.add_request`, `Context.add_label` and `Context.add_response` (#93).
  Use setters with `Context.current_turn_id` instead.
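The replacement pattern for the removed `add_*` helpers can be sketched as follows. The `Context` class below is a toy stand-in with only the fields mentioned in this changelog; the `new_turn` helper is a hypothetical name for whatever advances the turn counter:

```python
# Toy Context illustrating "set by current_turn_id" instead of the
# removed add_request/add_label/add_response helpers.
class Context:
    def __init__(self):
        self.current_turn_id = 0
        self.requests = {}   # turn_id -> request
        self.responses = {}  # turn_id -> response

    def new_turn(self):
        # Hypothetical helper: advance to the next turn.
        self.current_turn_id += 1


ctx = Context()
ctx.new_turn()
# Old style (removed): ctx.add_request("hi")
# New style: assign via the current turn id.
ctx.requests[ctx.current_turn_id] = "hi"
ctx.responses[ctx.current_turn_id] = "hello, what can I get you?"
print(ctx.requests[1])  # hi
```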
4 participants