Flash-Attention 3 Wheels
Built for CUDA 13.0 (cu130) and PyTorch 2.9.1 (torch291)
Generated on: 2025-12-29 05:20:10 UTC
flash_attn_3-3.0.0b1+20251205.cu130torch291cxx11abitrue.59df2f-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251208.cu130torch291cxx11abitrue.632843-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251215.cu130torch291cxx11abitrue.bc0e4a-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251222.cu130torch291cxx11abitrue.bba578-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251229.cu130torch291cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl
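These wheels are tagged for torch 2.9.1, CUDA 13.0, and the CPython 3.9+ stable ABI (cp39-abi3) on linux_x86_64. The sketch below is a minimal check, not part of the published index: the expected version strings are assumptions read off the wheel filename tags, and it only compares them against the local interpreter and torch build before installing one of the files above.

```python
# Minimal sketch (assumes PyTorch is already installed): compare the local
# environment against the tags in the wheel filenames (torch291 / cu130).
import sys

import torch

EXPECTED_TORCH = "2.9.1"  # assumption: read from the "torch291" tag
EXPECTED_CUDA = "13.0"    # assumption: read from the "cu130" tag

print(f"python : {sys.version_info.major}.{sys.version_info.minor}")
print(f"torch  : {torch.__version__}")
print(f"cuda   : {torch.version.cuda}")

if not torch.__version__.startswith(EXPECTED_TORCH):
    print(f"warning: these wheels target torch {EXPECTED_TORCH}")
if torch.version.cuda != EXPECTED_CUDA:
    print(f"warning: these wheels target CUDA {EXPECTED_CUDA}")
```

If the check passes, a wheel from the list can be installed directly once downloaded, e.g. pip install flash_attn_3-3.0.0b1+20251229.cu130torch291cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl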