Merge pull request #153 from pipecat-ai/expose-llm-messages
aggregators: expose LLM messages
aconchillo authored May 20, 2024
2 parents 1c8b9d8 + e4c990c commit 077bb9f
Showing 2 changed files with 15 additions and 0 deletions.
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,13 @@ All notable changes to **pipecat** will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

### Changed

- `LLMUserResponseAggregator` and `LLMAssistantResponseAggregator` internal
messages are now exposed through the `messages` property.

## [0.0.18] - 2024-05-20

### Fixed
8 changes: 8 additions & 0 deletions src/pipecat/processors/aggregators/llm_response.py
@@ -45,6 +45,14 @@ def __init__(
# Reset our accumulator state.
self._reset()

@property
def messages(self):
return self._messages

@property
def role(self):
return self._role

#
# Frame processor
#
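The diff above adds read-only `messages` and `role` properties to pipecat's response aggregators. The following is a minimal, self-contained stand-in that sketches the same pattern; everything except the `messages` and `role` properties themselves (the class name, `append_text`, `flush`, and `_aggregation`) is a hypothetical illustration, not pipecat's actual API.

```python
# Minimal stand-in for an LLM response aggregator: it accumulates text for
# the current turn and maintains a list of OpenAI-style message dicts.
# Only the `messages` and `role` properties mirror the commit; the rest
# of this class is assumed for illustration.

class ResponseAggregator:
    def __init__(self, messages=None, role="user"):
        self._messages = messages if messages is not None else []
        self._role = role
        self._aggregation = ""  # text accumulated for the in-progress turn

    @property
    def messages(self):
        # As of this commit, callers can inspect the conversation
        # history the aggregator has built up so far.
        return self._messages

    @property
    def role(self):
        return self._role

    def append_text(self, text):
        self._aggregation += text

    def flush(self):
        # Commit the accumulated text as one message dict and reset.
        if self._aggregation:
            self._messages.append(
                {"role": self._role, "content": self._aggregation}
            )
            self._aggregation = ""


agg = ResponseAggregator(role="user")
agg.append_text("Hello, ")
agg.append_text("world!")
agg.flush()
print(agg.messages)  # [{'role': 'user', 'content': 'Hello, world!'}]
print(agg.role)      # user
```

Exposing the internal list via a property (rather than a copy) means callers can pass the same list into a downstream LLM service frame, which appears to be the motivation for the change.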
