[Security Assistant] Abort signal fix #203041
Conversation
Pinging @elastic/security-solution (Team: SecuritySolution)
```diff
@@ -160,10 +160,7 @@ export const streamGraph = async ({
        finalMessage += msg.content;
      }
    } else if (event.event === 'on_llm_end' && !didEnd) {
-     const generations = event.data.output?.generations[0];
-     if (generations && generations[0]?.generationInfo.finish_reason === 'stop') {
```
This condition prevented aborted generations from being persisted in the conversation.
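A minimal sketch of the change (names here are illustrative, not the actual Kibana code): the pre-fix handler only persisted generations whose `finish_reason` was `'stop'`, so a stream the user aborted was dropped; removing that check lets the partial message be saved.

```typescript
// Hypothetical sketch; `GenerationInfo` and the predicate names are
// illustrative stand-ins, not the actual Kibana implementation.
interface GenerationInfo {
  finish_reason?: string; // 'stop' on normal completion; may differ or be absent on abort
}

// Before the fix: only generations that finished with 'stop' were persisted,
// so a partial message from an aborted stream was silently dropped.
const shouldPersistBefore = (info?: GenerationInfo): boolean =>
  info?.finish_reason === 'stop';

// After the fix: persist whenever 'on_llm_end' fires, regardless of finish_reason.
const shouldPersistAfter = (_info?: GenerationInfo): boolean => true;

console.log(shouldPersistBefore({ finish_reason: 'abort' })); // false: partial message lost
console.log(shouldPersistAfter({ finish_reason: 'abort' }));  // true: partial message kept
```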
Looks good to me (aside from the failing CI). I tested locally, but another review from someone with more context would be good.
Starting backport for target branches: 8.15, 8.16, 8.17, 8.x
💚 Build Succeeded
(cherry picked from commit b3b2c17)
💔 Some backports could not be created
Note: Successful backport PRs will be merged automatically after passing CI.
Manual backport: to create the backport manually run:
Questions? Please refer to the Backport tool documentation
Starting backport for target branches: 8.x
(cherry picked from commit b3b2c17)
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation
# Backport

This will backport the following commits from `main` to `8.x`:
- [[Security Assistant] Abort signal fix (#203041)](#203041)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Steph Milovic <[email protected]>
Summary
I noticed that `abortSignal` was not being properly piped through for any of the connector types. I suspect this happened when we made the switch to be fully LangChain. It is most evident with OpenAI, as the stream never finishes (`handleStreamEnd` is not called), so the message does not get persisted, resulting in the following misbehavior:

bad1.mov
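The missing piping can be sketched with plain Web APIs. This is a hedged stand-in: `fakeModelStream` and `streamWithAbort` are hypothetical mocks of a streaming chat model and its end handler, not the real connector code, which hands the signal to the connector (and, for Bedrock, the invoke config).

```typescript
// Illustrative stand-in for a streaming chat model that honors an AbortSignal.
async function* fakeModelStream(signal: AbortSignal): AsyncGenerator<string> {
  for (const chunk of ['Hello', ', ', 'world', '!']) {
    if (signal.aborted) return; // generation stops once the signal fires
    yield chunk;
  }
}

// handleStreamEnd-style accumulation: the partial message survives an abort.
async function streamWithAbort(): Promise<string> {
  const controller = new AbortController();
  let finalMessage = '';
  let chunks = 0;
  for await (const chunk of fakeModelStream(controller.signal)) {
    finalMessage += chunk;
    if (++chunks === 2) controller.abort(); // user presses "Stop generating"
  }
  return finalMessage; // persisted even though the stream was aborted mid-way
}

streamWithAbort().then((msg) => console.log(msg)); // prints "Hello, "
```

Without the signal reaching the model, `fakeModelStream` would never see `signal.aborted` and the stream would run to completion on the server, which is the behavior the fix addresses.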
In Bedrock and Gemini we had the `handleStreamEnd` behavior correctly handling the abort signal. However, we did not pass the signal through in the chat model. On top of that, for some reason Bedrock gets the abort signal off of the `invoke` config, and we were not providing `signal` there. I made changes to pass the signal to the connector, and to pass the signal through the `invoke` config for Bedrock.

Impact
OpenAI
OpenAI had the biggest impact. Below is a screen capture of the problematic steps above, now behaving as expected. The stream runs slightly longer on the server, so when the UI updates it has a little more text than when the stream originally finished.
good1a.mov
Bedrock
Bedrock behaves differently in that when you end the stream, it still gives you the complete output. So although the signal is now included in the connector request, there is no UI implication. It behaves the same way as before: the stream stops in the UI, but when you reload the conversation the full response is there.
Gemini
The stream is so fast in Gemini that I can't actually press "Stop generating" in time, so there seems to be no user impact. However, I ensured that the signal is now included in the connector request.