Flash-Attention 3 Wheels
CUDA 12.6 (cu126), PyTorch 2.9.0 (torch290)
Generated on: 2025-11-03 04:17:27 UTC
flash_attn_3-3.0.0b1+20251028.cu126torch290cxx11abitrue.b3f1b6-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251103.cu126torch290cxx11abitrue.25611-cp39-abi3-linux_x86_64.whl
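The wheel tags encode the build environment: cu126 is CUDA 12.6, torch290 is PyTorch 2.9.0, cxx11abitrue is the C++11 ABI, and cp39-abi3 is the CPython 3.9+ stable ABI. Below is a minimal sketch (not part of the generated index) that checks the local environment against those tags before installing one of the wheels, using only standard torch and stdlib introspection; the helper name is illustrative.

```python
# Sketch: verify the local environment matches this wheel's build tags
# before installing it with pip. Assumes only torch is already installed.
import sys

import torch


def matches_wheel_tags() -> bool:
    """Return True if the environment matches torch290 / cu126 / cxx11abitrue / cp39-abi3."""
    ok = True
    if not torch.__version__.startswith("2.9."):
        print(f"PyTorch {torch.__version__} found; wheel is built for PyTorch 2.9 (torch290)")
        ok = False
    if torch.version.cuda is None or not torch.version.cuda.startswith("12.6"):
        print(f"CUDA {torch.version.cuda} found; wheel is built against CUDA 12.6 (cu126)")
        ok = False
    if not torch.compiled_with_cxx11_abi():
        print("PyTorch built without the C++11 ABI; wheel expects cxx11abitrue")
        ok = False
    if sys.version_info < (3, 9):
        print("CPython >= 3.9 required by the cp39-abi3 tag")
        ok = False
    return ok


if __name__ == "__main__":
    print("environment matches" if matches_wheel_tags() else "environment mismatch")
```

If the checks pass, the wheel file itself can be installed directly with pip (e.g. `pip install <wheel filename>`).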