
hotfix - Revert vllm/attention/layer.py changes from 0f8cafe - fix torch.compile recompilations #1692

Annotations

1 warning

ruff (3.12): succeeded Jan 23, 2025 in 11s
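For context on the problem the PR title names: torch.compile specializes a compiled graph to the properties of the inputs it has seen (shapes, dtypes, etc.), and when a guard on those properties fails, Dynamo recompiles. The sketch below is not from this PR; it is a minimal, hedged illustration of how a shape change triggers a recompilation and how to surface recompile events with PyTorch's logging controls (assumes PyTorch >= 2.0).

```python
# Minimal sketch (not from this PR): showing a torch.compile
# recompilation and how to make recompile events visible.
import torch

# Report recompile events via the torch logging system.
# Equivalent to running with the env var TORCH_LOGS="recompiles".
torch._logging.set_logs(recompiles=True)

@torch.compile
def f(x: torch.Tensor) -> torch.Tensor:
    return x * 2 + 1

# First call compiles a graph specialized to this input shape.
f(torch.randn(4, 8))

# A different shape fails the shape guards and triggers a
# recompilation, which is logged thanks to the setting above.
# Guard churn like this is the kind of overhead a "fix
# torch.compile recompilations" change would aim to remove.
f(torch.randn(16, 8))
```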