Is streaming output not supported? #6
Comments
In demo.py there is `use_vllm = True  # set True to use vllm for inference`
Thanks!
It doesn't work.
Since we need to post-process the LLM output to match citation numbers such as [6-8] with the context sentences, streaming output is currently not supported.
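To illustrate why this blocks streaming, here is a minimal sketch of such post-processing (the function name `expand_citations` and the exact range format are assumptions, not from the repo): expanding a range citation like [6-8] into individual markers requires seeing the complete bracketed span, so it cannot reliably run on a token-by-token stream.

```python
import re

def expand_citations(text: str) -> str:
    """Rewrite range citations like [6-8] as [6][7][8].

    Hypothetical sketch: needs the full generated text, since a partial
    stream may cut a citation in half (e.g. "[6-" without the "8]").
    """
    def repl(match: re.Match) -> str:
        start, end = int(match.group(1)), int(match.group(2))
        return "".join(f"[{i}]" for i in range(start, end + 1))

    return re.sub(r"\[(\d+)-(\d+)\]", repl, text)

print(expand_citations("As shown in prior work [6-8], ..."))
# As shown in prior work [6][7][8], ...
```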
@zRzRzRzRzRzRzR said it is not supported.