Flash-Attention 3 Wheels
CUDA 12.9, PyTorch 2.10.0
Generated on: 2026-02-12 10:45:58 UTC
flash_attn_3-3.0.0+20260212.cu129torch2100cxx11abitrue.c4d8b0-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0+20260212.cu129torch2100cxx11abitrue.c4d8b0-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch2100cxx11abitrue.438325-cp39-abi3-linux_aarch64.whl
flash_attn_3-3.0.0b1+20260126.cu129torch2100cxx11abitrue.438325-cp39-abi3-linux_x86_64.whl
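Each wheel filename above packs the build metadata into the version's local segment. A minimal sketch of decoding that convention, assuming the layout is name-version+date.buildtags.commit-pytag-abitag-platform (the field names below are illustrative, not part of the wheel spec):

```python
# Decode one of the listed wheel filenames into its components.
fname = "flash_attn_3-3.0.0+20260212.cu129torch2100cxx11abitrue.c4d8b0-cp39-abi3-linux_x86_64.whl"

stem = fname[: -len(".whl")]
# Standard wheel fields, separated by "-"
name, version, py_tag, abi_tag, platform = stem.split("-")
# PEP 440 local version after "+": build date, CUDA/torch/ABI tags, commit (assumed meaning)
release, local = version.split("+")
date, build_tags, commit = local.split(".")

print(name)        # flash_attn_3
print(release)     # 3.0.0
print(date)        # 20260212
print(build_tags)  # cu129torch2100cxx11abitrue
print(commit)      # c4d8b0
print(py_tag, abi_tag, platform)  # cp39 abi3 linux_x86_64
```

The cp39-abi3 tag pair means a single wheel covers CPython 3.9 and newer via the stable ABI, so only the platform (x86_64 vs aarch64) and the CUDA/PyTorch build tags distinguish the files.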