Flash-Attention 3 Wheels
CUDA 12.9, PyTorch 2.9.0
Generated on: 2026-02-12 10:45:58 UTC
flash_attn_3-3.0.0+20260212.cu129torch290cxx11abitrue.3daa3f-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260112.cu129torch290cxx11abitrue.c3da81-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260119.cu129torch290cxx11abitrue.100fa1-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch290cxx11abitrue.40bb43-cp39-abi3-win_amd64.whl
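Each filename above follows the standard wheel naming scheme (`distribution-version-pythontag-abitag-platformtag.whl`), with the build metadata packed into the local version segment after the `+`: a build date, the CUDA and PyTorch versions, the C++11 ABI flag, and what appears to be a short commit hash. A minimal sketch of decoding that convention, assuming the pattern visible in the filenames above holds (the field names and the commit-hash interpretation are assumptions, not documented by this index):

```python
import re

# Standard wheel filename: dist-version-pythontag-abitag-platformtag.whl
WHEEL_RE = re.compile(
    r"^(?P<dist>[^-]+)-(?P<version>[^-]+)"
    r"-(?P<py>[^-]+)-(?P<abi>[^-]+)-(?P<plat>[^-]+)\.whl$"
)

# Local version segment as seen in this index, e.g.
# "20260112.cu129torch290cxx11abitrue.c3da81" -- the trailing hex
# string is presumably a short commit hash (assumption).
LOCAL_RE = re.compile(
    r"^(?P<date>\d{8})\.cu(?P<cuda>\d+)torch(?P<torch>\d+)"
    r"cxx11abi(?P<cxx11abi>true|false)\.(?P<commit>[0-9a-f]+)$"
)

def parse_wheel(filename):
    """Split a wheel filename from this index into its metadata fields."""
    m = WHEEL_RE.match(filename)
    if not m:
        raise ValueError(f"not a wheel filename: {filename}")
    info = m.groupdict()
    # Separate the public version (e.g. "3.0.0b1") from the local
    # build segment after "+".
    public, _, local = info["version"].partition("+")
    info["public_version"] = public
    lm = LOCAL_RE.match(local)
    if lm:
        info.update(lm.groupdict())
    return info
```

For example, parsing the 2026-01-12 beta wheel yields `public_version == "3.0.0b1"`, `cuda == "129"`, `torch == "290"`, and `plat == "win_amd64"`, matching the CUDA 12.9 / PyTorch 2.9.0 header. A downloaded wheel installs directly with `pip install <filename>`.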