Flashinfer adoption by vLLM #883
prasad-nair-amd started this conversation in General
Replies: 0 comments
The FlashInfer GitHub page says that FlashInfer is adopted by vLLM. Does this mean that if we run inference using vLLM, FlashInfer is being used?
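For context on the question: vLLM supports multiple attention backends and selects one automatically at startup, so "adopted by" does not mean FlashInfer is always active. A minimal sketch of forcing the FlashInfer backend via the `VLLM_ATTENTION_BACKEND` environment variable (this assumes a vLLM installation with FlashInfer available; the variable must be set before vLLM starts):

```shell
# Force vLLM's attention backend to FlashInfer instead of letting vLLM
# auto-select one. Only takes effect if flashinfer is installed.
export VLLM_ATTENTION_BACKEND=FLASHINFER

# Verify the variable is set in the environment vLLM will inherit.
echo "$VLLM_ATTENTION_BACKEND"

# Then launch vLLM as usual, e.g. (requires GPU + vLLM install, so not run here):
#   vllm serve <model-name>
```

Checking vLLM's startup logs is another way to see which attention backend was actually selected for a given model and GPU.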