Flash-Attention 3 Wheels
Built for CUDA 12.9 (cu129) and PyTorch 2.8.0 (torch280)
Generated on: 2025-11-03 04:17:27 UTC
flash_attn_3-3.0.0b1+20250911.cu129torch280cxx11abitrue.dfb664-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250914.cu129torch280cxx11abitrue.2cc6fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250922.cu129torch280cxx11abitrue.5c1627-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250928.cu129torch280cxx11abitrue.5059fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251014.cu129torch280cxx11abitrue.54d8aa-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251021.cu129torch280cxx11abitrue.f29df7-cp39-abi3-linux_x86_64.whl
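Each filename above encodes its build metadata: package version, build date, CUDA tag, PyTorch tag, C++11 ABI flag, short commit hash, and the Python/platform tags. As a quick reference, the fields can be split out like this (a minimal sketch; the regex and field names are illustrative assumptions, not part of the build tooling):

```python
import re

# One of the wheel filenames listed above.
WHEEL = ("flash_attn_3-3.0.0b1+20251021.cu129torch280cxx11abitrue."
         "f29df7-cp39-abi3-linux_x86_64.whl")

# Hypothetical pattern matching the naming scheme used in this index:
# <name>-<version>+<date>.cu<cuda>torch<torch>cxx11abi<bool>.<commit>-<py>-<abi>-<platform>.whl
pattern = re.compile(
    r"(?P<name>\w+)-(?P<version>[\d.]+b\d+)"
    r"\+(?P<date>\d{8})\.cu(?P<cuda>\d+)torch(?P<torch>\d+)"
    r"cxx11abi(?P<cxx11abi>true|false)\.(?P<commit>[0-9a-f]+)"
    r"-(?P<python>cp\d+)-(?P<abi_tag>\w+)-(?P<platform>\w+)\.whl"
)

fields = pattern.match(WHEEL).groupdict()
print(fields["date"], fields["cuda"], fields["torch"], fields["commit"])
```

This can be handy for picking the newest build (sort on the `date` field) or for checking that a wheel's CUDA and PyTorch tags match the local environment before installing.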