Flash-Attention 3 Wheels
CUDA 13.0, PyTorch 2.10.0
Generated on: 2026-04-03 02:03:57 UTC
flash_attn_3-3.0.0+20260212.cu130torch2100cxx11abitrue.c4d8b0-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260212.cu130torch2100cxx11abitrue.c4d8b0-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260216.cu130torch2100cxx11abitrue.fec3a6-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260217.cu130torch2100cxx11abitrue.fec3a6-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260302.cu130torch2100cxx11abitrue.ceb109-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260302.cu130torch2100cxx11abitrue.ceb109-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260316.cu130torch2100cxx11abitrue.71bf77-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260316.cu130torch2100cxx11abitrue.71bf77-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20260126.cu130torch2100cxx11abitrue.438325-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0b1+20260126.cu130torch2100cxx11abitrue.438325-cp39-abi3-linux_x86_64.whl
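Each filename above follows the standard wheel naming convention (PEP 427): distribution, version (here with a PEP 440 local-version segment encoding the build date, CUDA/PyTorch/ABI configuration, and a commit hash), then python, ABI, and platform tags. As a minimal sketch, one of the listed filenames can be split into those fields; the parsing itself is just the generic wheel convention, not anything specific to these builds:

```python
# Parse a wheel filename from the list above into its PEP 427 components.
# Format: {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl
fname = "flash_attn_3-3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61-cp39-abi3-linux_x86_64.whl"

dist, version, py_tag, abi_tag, plat_tag = fname[: -len(".whl")].split("-")

print(dist)      # flash_attn_3
print(version)   # 3.0.0+20260318.cu130torch2100cxx11abitrue.8afc61
print(py_tag)    # cp39
print(abi_tag)   # abi3
print(plat_tag)  # linux_x86_64
```

Because the wheels carry the `cp39-abi3` tag pair, they target the CPython stable ABI and a single wheel is installable on CPython 3.9 and newer, which is why no per-minor-version builds appear in the list.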