Flash-Attention 3 Wheels
Built against CUDA 12.8 and PyTorch 2.11.0
Generated on: 2026-04-03 02:03:57 UTC
flash_attn_3-3.0.0+20260330.cu128torch2110cxx11abitrue.98024f-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260330.cu128torch2110cxx11abitrue.98024f-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260330.cu128torch2110cxx11abitrue.98024f-cp39-abi3-win_amd64.whl