✨ Make Kai RPC Server fail initialize call when model provider cannot be queried #614
Conversation
visack w/ a few questions.
kai/rpc_server/server.py (Outdated)

    except Exception as e:
        app.log.error("unable to get model provider:", e)
        raise

    server.shutdown_flag = True
Does the shutdown flag mean we are shutting down the server?
Yes, though I'm open to having it stay running
I would think the end goal would be (roughly sketched in code after this list):
- User tries to start the server with bad arguments via the IDE
- Initialize fails, returns an error, but kai keeps running.
- IDE tells the user what they did wrong.
- User fixes.
- User tries to start the server again. IDE recognizes kai is already started. IDE just sends 'initialize' request with updated params.
- Success.
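Roughly what that flow could look like in the initialize handler. This is only a minimal sketch under the assumptions above, not kai's actual code; ErrorResponse, load_model_provider, and the response shape are invented placeholders:

    from dataclasses import dataclass
    import logging

    log = logging.getLogger("kai.rpc")

    @dataclass
    class ErrorResponse:
        """Illustrative stand-in for the RPC error type (not kai's real class)."""
        code: int
        message: str

    def initialize(server, params, load_model_provider):
        """Hypothetical handler: fail the 'initialize' call, not the whole server."""
        try:
            server.model_provider = load_model_provider(params)
        except Exception as e:
            log.error("unable to get model provider: %s", e)
            # Report the failure to the IDE but keep the process alive, so the
            # user can fix their config and send 'initialize' again.
            return ErrorResponse(code=-32603, message=f"model provider unavailable: {e}")
        return {"capabilities": {}}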
    - if self.cache:
    + if self.cache and cache_path_resolver:
@JonahSussman you cannot call cache_path() twice. This is a shortcoming of the task-based approach, because one task makes more than one request to the LLM. We keep a count of the requests, and it's assumed that each call to cache_path() is requesting a new cache, so you get a new cache entry every time.
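To make the pitfall concrete, here is a minimal sketch of a counter-based cache. The TaskCache class and file layout are invented for illustration and are not kai's actual implementation; only the cache_path() behaviour described above is being modeled:

    from pathlib import Path

    class TaskCache:
        """Hypothetical counter-based cache illustrating the behaviour described above."""

        def __init__(self, root: Path) -> None:
            self.root = root
            self.request_count = 0

        def cache_path(self) -> Path:
            # Every call is treated as a new LLM request: the counter advances
            # and a fresh cache entry path comes back each time.
            self.request_count += 1
            return self.root / f"request_{self.request_count}.json"

    cache = TaskCache(Path("/tmp/kai-cache"))
    first = cache.cache_path()   # .../request_1.json
    second = cache.cache_path()  # .../request_2.json, a different entry, so calling
                                 # cache_path() twice for the same request breaks caching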
I should have added a comment cautioning about that
All good, I'll revert the changes and add the comment
Signed-off-by: JonahSussman <[email protected]>
force-pushed from ad53692 to ede4f9e
Implements #609
As a follow-up, perhaps the IDE should show a reason why the initialize call failed? Right now you have to look directly at the logs.