The Problem
LLMs have a limited context window, and the file miniogre generates to prompt the LLM must fit within it.
There is currently no automation for this, so README generation breaks whenever the generated context is larger than the LLM's context window.
Proposed solution
We already have a system that counts (approximately) the tokens in the file sent to the LLM pipeline. We should use it to truncate the file so that only as many tokens as the target LLM supports are sent.
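A minimal sketch of what that truncation step could look like, assuming the approximate token counter is tiktoken-based; the `truncate_to_context_window` name and the default limit are hypothetical, not part of the current codebase:

```python
# Sketch: cut the prompt file down to the target model's context window.
# Assumes tiktoken is used for (approximate) token counting.
import tiktoken


def truncate_to_context_window(text: str, model: str = "gpt-4", max_tokens: int = 8000) -> str:
    """Return the longest prefix of `text` that fits within `max_tokens` tokens."""
    encoding = tiktoken.encoding_for_model(model)
    tokens = encoding.encode(text)
    if len(tokens) <= max_tokens:
        return text
    # Drop everything past the limit and decode back to a string.
    return encoding.decode(tokens[:max_tokens])
```

The per-model limit could come from a lookup table so each supported LLM gets its own maximum, and `max_tokens` should leave headroom for the system prompt and the model's response.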