Flash-Attention 3 Wheels
CUDA 12.9, PyTorch 2.8.0
Generated on: 2026-02-12 10:45:58 UTC

flash_attn_3-3.0.0+20260212.cu129torch280cxx11abitrue.3daa3f-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0+20260212.cu129torch280cxx11abitrue.c4d8b0-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260212.cu129torch280cxx11abitrue.c4d8b0-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250911.cu129torch280cxx11abitrue.dfb664-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250914.cu129torch280cxx11abitrue.2cc6fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250922.cu129torch280cxx11abitrue.5c1627-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250928.cu129torch280cxx11abitrue.5059fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251014.cu129torch280cxx11abitrue.54d8aa-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251021.cu129torch280cxx11abitrue.f29df7-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260112.cu129torch280cxx11abitrue.ea8f73-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0b1+20260112.cu129torch280cxx11abitrue.ea8f73-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260119.cu129torch280cxx11abitrue.a0f9f4-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0b1+20260119.cu129torch280cxx11abitrue.a0f9f4-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch280cxx11abitrue.40bb43-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch280cxx11abitrue.438325-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch280cxx11abitrue.438325-cp39-abi3-linux_x86_64.whl
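
Each filename above follows the same convention: package version, then a local-version segment carrying the build date, the CUDA and PyTorch tags, the C++11 ABI flag, and a short source commit, followed by the standard wheel tags (cp39-abi3, i.e. the stable ABI for CPython 3.9 and newer, so one build covers all later interpreters) and the target platform. Below is a minimal sketch of splitting a name into those fields with Python's standard re module; the pattern and field names are illustrative, inferred from the listing above rather than taken from any official tooling.

    import re

    # Illustrative pattern for the wheel names in this index; it assumes the
    # local-version segment is always
    # "<date>.cu<cuda>torch<torch>cxx11abi<bool>.<commit>".
    WHEEL_RE = re.compile(
        r"flash_attn_3-(?P<version>[\w.]+?)"
        r"\+(?P<date>\d{8})"
        r"\.cu(?P<cuda>\d+)torch(?P<torch>\d+)cxx11abi(?P<cxx11abi>true|false)"
        r"\.(?P<commit>[0-9a-f]+)"
        r"-(?P<python>cp\d+)-(?P<abi>abi\d+)-(?P<platform>\w+)\.whl"
    )

    def parse_wheel(filename: str) -> dict:
        """Split a wheel filename from this index into its tagged fields."""
        match = WHEEL_RE.fullmatch(filename)
        if match is None:
            raise ValueError(f"unrecognized wheel name: {filename}")
        return match.groupdict()

    print(parse_wheel(
        "flash_attn_3-3.0.0+20260212.cu129torch280cxx11abitrue"
        ".c4d8b0-cp39-abi3-linux_x86_64.whl"
    ))
    # -> {'version': '3.0.0', 'date': '20260212', 'cuda': '129',
    #     'torch': '280', 'cxx11abi': 'true', 'commit': 'c4d8b0',
    #     'python': 'cp39', 'abi': 'abi3', 'platform': 'linux_x86_64'}

Any of the listed files should install directly with pip given its full URL or a local path; pick the entry whose CUDA, PyTorch, and platform tags match your environment.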