Flash-Attention 3 Wheels
CUDA 12.8, PyTorch 2.10.0
Generated on: 2026-04-03 02:03:57 UTC
flash_attn_3-3.0.0+20260216.cu128torch2100cxx11abitrue.fec3a6-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260302.cu128torch2100cxx11abitrue.9083bc-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0+20260302.cu128torch2100cxx11abitrue.ceb109-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260316.cu128torch2100cxx11abitrue.71bf77-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260316.cu128torch2100cxx11abitrue.71bf77-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0+20260319.cu128torch2100cxx11abitrue.939781-cp39-abi3-linux_aarch64.whl
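Each filename encodes, left to right: the package version (3.0.0), a build date, the target CUDA and PyTorch versions (cu128 = CUDA 12.8, torch2100 = PyTorch 2.10.0), the CXX11 ABI setting, what appears to be a short build hash, and standard wheel tags. The cp39-abi3 tag means the wheels use the CPython stable ABI and should install on any CPython 3.9 or newer; the final tag names the OS and architecture.

As a minimal sketch of how to pick the right file, the hypothetical helper below (not part of any published tooling; the WHEELS subset and the pick-newest rule are assumptions) matches the running interpreter against the platform tags used above:

```python
import platform
import sys

# Subset of the wheel filenames listed above; extend as needed.
WHEELS = [
    "flash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-linux_x86_64.whl",
    "flash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl",
    "flash_attn_3-3.0.0+20260319.cu128torch2100cxx11abitrue.939781-cp39-abi3-linux_aarch64.whl",
]

def platform_tag() -> str:
    """Map this interpreter's OS/arch to the wheel platform tags used above."""
    machine = platform.machine().lower()
    if sys.platform.startswith("linux"):
        tag = {"x86_64": "linux_x86_64", "aarch64": "linux_aarch64"}.get(machine)
    elif sys.platform == "win32":
        tag = "win_amd64" if machine in ("amd64", "x86_64") else None
    else:
        tag = None
    if tag is None:
        raise RuntimeError(f"no wheel built for {sys.platform}/{machine}")
    return tag

def pick_wheel() -> str:
    """Return the newest listed wheel that this interpreter can install."""
    # cp39-abi3 wheels use the CPython stable ABI: any CPython >= 3.9 works.
    if sys.implementation.name != "cpython" or sys.version_info < (3, 9):
        raise RuntimeError("these wheels require CPython 3.9 or newer")
    tag = platform_tag()
    matches = sorted(w for w in WHEELS if w.endswith(f"-{tag}.whl"))
    if not matches:
        raise RuntimeError(f"no wheel in the list for platform tag {tag}")
    # The build date in the local version segment makes newer builds sort last.
    return matches[-1]

if __name__ == "__main__":
    print(pick_wheel())
```

Once downloaded, the selected file installs with plain pip, e.g. `pip install ./flash_attn_3-3.0.0+20260318.cu128torch2100cxx11abitrue.8afc61-cp39-abi3-linux_x86_64.whl`.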