Threads endpoint remapping #32
Conversation
Unified `Annotations` into `MessageAnnotation`, added CRUD operations for messages, and improved data consistency with the OpenAI request/response standards. Adjusted the thread demos to support the new message functionality.
…r extracting info from a file
/// <summary>
/// The content of the message.
/// </summary>
[JsonProperty("content")]
public string Content { get; }
// public IReadOnlyList<MessageContent> Content { get; }
public required string Content { get; set; } // TODO: OpenAI also supports an array of MessageContent objects, but the text object differs between the create request and the response object
Will this be resolved in a follow-up PR? See `ChatRequest` for a reference: we give users both fields to use, and if the one implementing `IEnumerable<>` is filled, it takes precedence when serializing the request.
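For context, a minimal sketch of the dual-field pattern being described, assuming Newtonsoft.Json and using illustrative type and property names rather than the actual `ChatRequest` members:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

/// <summary>
/// Hypothetical request type illustrating the dual-field pattern: a plain
/// string and a structured list are both exposed, and the list takes
/// precedence when it is non-empty.
/// </summary>
public class ExampleMessageRequest
{
    /// <summary>Plain text content, used only when no parts are supplied.</summary>
    [JsonIgnore]
    public string? Text { get; set; }

    /// <summary>Structured content parts; take precedence when filled.</summary>
    [JsonIgnore]
    public IReadOnlyList<string>? Parts { get; set; }

    /// <summary>The value actually written to the wire.</summary>
    [JsonProperty("content")]
    public object? SerializedContent => Parts is { Count: > 0 } ? Parts : Text;
}

// JsonConvert.SerializeObject(new ExampleMessageRequest { Text = "hi" })
// produces {"content":"hi"}; setting Parts instead emits the array.
```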
Since this is a request and won't be deserialized, I kept only the array (since it supports text content as well), and if a message with a single string content needs to be created, for simplicity I overloaded the constructor so it automatically builds the array with a text content part. What do you think?
added a few ctor overloads in 3c79c73, which should solve this
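Roughly, the constructor-overload approach described above looks like the sketch below; the type and member names are hypothetical stand-ins, not the actual request classes:

```csharp
using System.Collections.Generic;

/// <summary>Hypothetical content part, standing in for MessageContent.</summary>
public sealed record ExampleContentPart(string Text);

/// <summary>
/// Hypothetical create-message request: only the array is stored, and the
/// string overload wraps the text into a single-element array.
/// </summary>
public class ExampleCreateMessageRequest
{
    public IReadOnlyList<ExampleContentPart> Content { get; }

    // Full form: the caller supplies the content parts directly.
    public ExampleCreateMessageRequest(IReadOnlyList<ExampleContentPart> content)
    {
        Content = content;
    }

    // Convenience overload: a single string becomes one text part.
    public ExampleCreateMessageRequest(string text)
        : this(new[] { new ExampleContentPart(text) })
    {
    }
}
```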
LlmTornado/Threads/ToolCall.cs (outdated)
/// <summary>
/// For now, this is always going to be an empty object.
/// For now, this is always going to be an empty object. TODO: when OpenAI finishes the implementation, map it here
A reminder to check this out.
Tried running all the threads tests:
Some nits:
Overall, this looks really good and consistent! Looking forward to more tests, streaming, and wrapping this up. Let me know when this is ready.
18aef4e to 4b7bd81
- adding submit tool output
- adding Function assistant test
Pushed a few changes and optimized Assistants streaming. To wrap this up:
Thanks for working on this!
Implemented and mapped most of the endpoints for Threads (threads, messages, runs, run steps):
https://platform.openai.com/docs/api-reference/threads
https://platform.openai.com/docs/api-reference/messages
https://platform.openai.com/docs/api-reference/runs
https://platform.openai.com/docs/api-reference/run-steps
Still missing: submitting tool outputs to a run, and more complex demo cases (currently only file info extraction is showcased).
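For reference, the missing submit-tool-outputs call reduces to a single POST per the linked Runs documentation. This is a raw `HttpClient` sketch rather than the library API; the class and method names are made up for illustration:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

// Not the LlmTornado API: a minimal sketch of what submitting tool outputs
// to a run looks like at the HTTP level, per the linked Runs docs.
public static class SubmitToolOutputsSketch
{
    public static async Task<string> SubmitAsync(
        HttpClient http, string apiKey, string threadId, string runId,
        string toolCallId, string output)
    {
        string url = $"https://api.openai.com/v1/threads/{threadId}/runs/{runId}/submit_tool_outputs";

        // Body shape: { "tool_outputs": [ { "tool_call_id": "...", "output": "..." } ] }
        string payload = JsonConvert.SerializeObject(new
        {
            tool_outputs = new[] { new { tool_call_id = toolCallId, output } }
        });

        using var request = new HttpRequestMessage(HttpMethod.Post, url)
        {
            Content = new StringContent(payload, Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
        // The Assistants API sits behind a beta header; the exact value depends on the API version in use.
        request.Headers.Add("OpenAI-Beta", "assistants=v1");

        using HttpResponseMessage response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```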