Flash-Attention 3 Wheels
Built for CUDA 12.8 (cu128) and PyTorch 2.8.0 (torch280)
Generated on: 2025-11-03 04:17:27 UTC
flash_attn_3-3.0.0b1+20250911.cu128torch280cxx11abitrue.dfb664-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250914.cu128torch280cxx11abitrue.2cc6fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250922.cu128torch280cxx11abitrue.5c1627-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20250928.cu128torch280cxx11abitrue.5059fd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251014.cu128torch280cxx11abitrue.54d8aa-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251021.cu128torch280cxx11abitrue.22f7da-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251028.cu128torch280cxx11abitrue.b3f1b6-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251103.cu128torch280cxx11abitrue.25611-cp39-abi3-linux_x86_64.whl
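These are date-stamped nightly builds: the local version segment after "+" encodes the build date (YYYYMMDD), the CUDA/PyTorch/C++11-ABI configuration, and a short source hash, while "cp39-abi3-linux_x86_64" marks a CPython 3.9+ stable-ABI wheel for x86-64 Linux. Below is a minimal Python sketch that splits one of these filenames into its fields; the field layout is inferred from the listing above rather than from an official naming spec, so treat the regex as illustrative.

import re

# Illustrative parser for the wheel filenames listed above (a sketch; the
# local-version layout is inferred from the filenames themselves).
WHEEL_RE = re.compile(
    r"(?P<dist>flash_attn_3)-(?P<version>[\d.]+b\d+)"  # distribution + PEP 440 version
    r"\+(?P<date>\d{8})"                               # nightly build date (YYYYMMDD)
    r"\.(?P<cuda>cu\d+)(?P<torch>torch\d+)"            # CUDA and PyTorch build tags
    r"(?P<abi>cxx11abi(?:true|false))"                 # C++11 ABI flag
    r"\.(?P<commit>[0-9a-f]+)"                         # short source hash
    r"-(?P<python>cp\d+)-(?P<abi_tag>abi\d+)"          # Python tag + stable-ABI tag
    r"-(?P<platform>\w+)\.whl"                         # platform tag
)

name = ("flash_attn_3-3.0.0b1+20251028.cu128torch280cxx11abitrue"
        ".b3f1b6-cp39-abi3-linux_x86_64.whl")
match = WHEEL_RE.match(name)
if match:
    for field, value in match.groupdict().items():
        print(f"{field:>9}: {value}")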