hotfix - Revert vllm/attention/layer.py changes from 0f8cafe - fix torch.compile recompilations #1696

Annotations

1 warning

mypy (3.9): succeeded Jan 23, 2025 in 31s