Flash-Attention 3 Wheels
Built against CUDA 12.8 and PyTorch 2.9.1
Generated on: 2026-02-12 10:45:58 UTC
flash_attn_3-3.0.0b1+20251204.cu128torch291cxx11abitrue.26ba55-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251204.cu128torch291cxx11abitrue.e878b6-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251208.cu128torch291cxx11abitrue.1f2d0c-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251208.cu128torch291cxx11abitrue.632843-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251215.cu128torch291cxx11abitrue.bc0e4a-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251215.cu128torch291cxx11abitrue.e73940-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251222.cu128torch291cxx11abitrue.bba578-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251222.cu128torch291cxx11abitrue.f474f8-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251229.cu128torch291cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251229.cu128torch291cxx11abitrue.df278f-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260105.cu128torch291cxx11abitrue.394c4d-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260105.cu128torch291cxx11abitrue.9b6dba-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260112.cu128torch291cxx11abitrue.ea8f73-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260119.cu128torch291cxx11abitrue.a0f9f4-cp39-abi3-linux_x86_64.whl
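Each filename above encodes, in order: the package version (`3.0.0b1`), a nightly build date (`YYYYMMDD`), the CUDA/PyTorch/C++ ABI tags (`cu128torch291cxx11abitrue`), a short source commit hash, the stable-ABI CPython tag (`cp39-abi3`, usable on CPython 3.9+), and the platform tag (`linux_x86_64` or `win_amd64`). A minimal sketch of picking the newest wheel for a platform from this listing — the regex field names are assumptions read off the filename layout, not an official flash-attn naming spec:

```python
import re

# Assumed filename layout (inferred from the listing above, not a spec):
# flash_attn_3-{version}+{date}.{cuda/torch/abi tags}.{commit}-cp39-abi3-{platform}.whl
WHEEL_RE = re.compile(
    r"flash_attn_3-(?P<version>[^+]+)"
    r"\+(?P<date>\d{8})"                      # nightly build date, YYYYMMDD
    r"\.(?P<tags>cu\d+torch\d+cxx11abi\w+)"   # CUDA / PyTorch / C++11 ABI tags
    r"\.(?P<commit>[0-9a-f]+)"                # short source commit hash
    r"-(?P<python>cp\d+)-(?P<abi>abi3)"       # stable-ABI CPython tag
    r"-(?P<platform>\w+)\.whl"
)

def newest_wheel(filenames, platform):
    """Return the most recent wheel filename matching `platform`, or None."""
    candidates = [
        (m["date"], name)
        for name in filenames
        if (m := WHEEL_RE.fullmatch(name)) and m["platform"] == platform
    ]
    # YYYYMMDD strings sort chronologically, so plain max() works.
    return max(candidates)[1] if candidates else None

wheels = [
    "flash_attn_3-3.0.0b1+20260112.cu128torch291cxx11abitrue.ea8f73-cp39-abi3-linux_x86_64.whl",
    "flash_attn_3-3.0.0b1+20260105.cu128torch291cxx11abitrue.394c4d-cp39-abi3-win_amd64.whl",
    "flash_attn_3-3.0.0b1+20260119.cu128torch291cxx11abitrue.a0f9f4-cp39-abi3-linux_x86_64.whl",
]
print(newest_wheel(wheels, "linux_x86_64"))
# -> flash_attn_3-3.0.0b1+20260119.cu128torch291cxx11abitrue.a0f9f4-cp39-abi3-linux_x86_64.whl
```

Once downloaded, a wheel installs directly with `pip install <filename>.whl`; pip will refuse it if the platform or Python ABI tag does not match the running interpreter.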