
Bug: BedrockLLMAgent callbacks on_llm_start and on_llm_end do not work #239

Open
rjesh-git opened this issue Feb 17, 2025 · 1 comment
Labels: bug (Something isn't working), triage

Comments

@rjesh-git

Expected Behaviour

All three callback events, on_llm_start, on_llm_end and on_llm_new_token, fire for BedrockLLMAgent while streaming a response.

Current Behaviour

Only on_llm_new_token fires for BedrockLLMAgent while streaming a response.

Code snippet

Refer to https://github.com/awslabs/multi-agent-orchestrator/blob/main/examples/fast-api-streaming/main.py#L41
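
For context, a minimal sketch of the pattern in the linked example: a callbacks handler registered on the agent with all three methods defined, of which only on_llm_new_token is ever invoked. Class and option names follow that example; the agent name and description are placeholders, and exact option fields may differ between versions.

```python
from multi_agent_orchestrator.agents import (
    AgentCallbacks,
    BedrockLLMAgent,
    BedrockLLMAgentOptions,
)

class MyHandler(AgentCallbacks):
    def on_llm_start(self, *args, **kwargs) -> None:
        # Expected to fire once before the model call; never invoked today.
        print("on_llm_start")

    def on_llm_new_token(self, token: str) -> None:
        # The only callback BedrockLLMAgent actually invokes while streaming.
        print(token, end="", flush=True)

    def on_llm_end(self, *args, **kwargs) -> None:
        # Expected to fire once after the model call; never invoked today.
        print("on_llm_end")

agent = BedrockLLMAgent(BedrockLLMAgentOptions(
    name="tech-agent",                           # placeholder
    description="Answers technical questions",   # placeholder
    streaming=True,
    callbacks=MyHandler(),
))
```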

Possible Solution

No response

Steps to Reproduce

https://github.com/awslabs/multi-agent-orchestrator/blob/main/examples/fast-api-streaming/main.py#L41

rjesh-git added the bug (Something isn't working) label on Feb 17, 2025
@brnaba-aws
Contributor

Hi @rjesh-git
Thanks for reaching out.
This example is wrong. We don't actually have any callback defined apart from on_llm_new_token.

Here is where the callbacks are defined:

We had planned to add more callbacks.
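
For reference, the callback interface described above amounts to roughly the following (a sketch based on this comment, not the actual source):

```python
class AgentCallbacks:
    def on_llm_new_token(self, token: str) -> None:
        # Default: do nothing; subclasses override this to consume streamed tokens.
        # No on_llm_start / on_llm_end hooks are defined or invoked yet.
        pass
```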
