hotfix - Revert vllm/attention/layer.py changes from 0f8cafe - fix torch.compile recompilations #518
Annotations: 5 warnings

CodeQL Scan: succeeded Jan 24, 2025 in 3m 2s