Flash-Attention 3 Wheels
CUDA 12.6, PyTorch 2.10.0
Generated on: 2026-04-03 02:03:57 UTC
flash_attn_3-3.0.0+20260216.cu126torch2100cxx11abitrue.b62d93-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0+20260302.cu126torch2100cxx11abitrue.ceb109-cp39-abi3-linux_x86_64.whl
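
The tags in the filenames above encode the build targets (cu126, torch2100, cp39-abi3, linux_x86_64). As a minimal sketch, assuming PyTorch is already installed locally, a quick check like the following can confirm the environment matches those tags before installing one of the wheels; the expected values are taken from the filename tags and the check itself is only illustrative.

```python
# Minimal sketch: compare the local environment against the build tags in the
# wheel filenames above (cu126, torch2100, cp39-abi3, linux_x86_64).
# Assumes PyTorch is installed; adjust the expected values if the tags change.
import platform
import sys

import torch

expected_cuda = "12.6"      # from the "cu126" tag
expected_torch = "2.10.0"   # from the "torch2100" tag

print(f"Python      : {sys.version.split()[0]} (wheels are cp39, abi3)")
print(f"Platform    : {platform.system()} {platform.machine()} (wheels are linux_x86_64)")
print(f"PyTorch     : {torch.__version__} (expected {expected_torch})")
print(f"CUDA (torch): {torch.version.cuda} (expected {expected_cuda})")

ok = (
    torch.__version__.startswith(expected_torch)
    and (torch.version.cuda or "").startswith(expected_cuda)
)
print("Environment matches wheel tags" if ok else "Environment does NOT match wheel tags")
```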