Flash-Attention 3 Wheels
Build configuration: CUDA 12.8 (cu128), PyTorch 2.9.0 (torch290), C++11 ABI enabled (cxx11abitrue)
Generated on: 2025-12-29 05:20:10 UTC
flash_attn_3-3.0.0b1+20251028.cu128torch290cxx11abitrue.b3f1b6-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251103.cu128torch290cxx11abitrue.25611-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251110.cu128torch290cxx11abitrue.c8abdd-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251117.cu128torch290cxx11abitrue.5d2cd3-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251124.cu128torch290cxx11abitrue.d063b3-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251201.cu128torch290cxx11abitrue.672381-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251204.cu128torch290cxx11abitrue.e878b6-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251205.cu128torch290cxx11abitrue.59df2f-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251208.cu128torch290cxx11abitrue.1f2d0c-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251208.cu128torch290cxx11abitrue.632843-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251215.cu128torch290cxx11abitrue.bc0e4a-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251215.cu128torch290cxx11abitrue.e73940-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251222.cu128torch290cxx11abitrue.bba578-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251222.cu128torch290cxx11abitrue.f474f8-cp39-abi3-win_amd64.whl
flash_attn_3-3.0.0b1+20251229.cu128torch290cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl
flash_attn_3-3.0.0b1+20251229.cu128torch290cxx11abitrue.df278f-cp39-abi3-win_amd64.whl
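
Each filename encodes the base version (3.0.0b1), a build date (YYYYMMDD), the CUDA/PyTorch/ABI configuration, a short commit hash, and the platform tag (linux_x86_64 or win_amd64); the cp39-abi3 tag indicates stable-ABI wheels usable on CPython 3.9 and newer. The sketch below is an illustrative helper only (its names and selection logic are assumptions, not part of this index or of flash-attn); it shows how such filenames could be parsed to pick the newest wheel for a given platform.

    # Illustrative helper (assumption, not an official tool): parse the wheel
    # filenames listed above and return the most recent one for a platform.
    import re

    WHEEL_RE = re.compile(
        r"flash_attn_3-(?P<version>[\w.]+)"      # base version, e.g. 3.0.0b1
        r"\+(?P<date>\d{8})"                     # build date, YYYYMMDD
        r"\.cu(?P<cuda>\d+)torch(?P<torch>\d+)"  # CUDA / PyTorch tags
        r"cxx11abi(?P<abi>true|false)"           # C++11 ABI flag
        r"\.(?P<commit>[0-9a-f]+)"               # short commit hash
        r"-(?P<pytag>\w+)-(?P<abitag>\w+)-(?P<platform>[\w.]+)\.whl"
    )

    def newest_wheel(filenames, platform="linux_x86_64"):
        """Return the newest matching wheel filename for the given platform tag."""
        dated = [
            (m.group("date"), name)
            for name in filenames
            if (m := WHEEL_RE.fullmatch(name)) and m.group("platform") == platform
        ]
        # Build dates are YYYYMMDD, so lexicographic order equals chronological order.
        return max(dated)[1] if dated else None

    # Example filenames copied from the list above.
    wheels = [
        "flash_attn_3-3.0.0b1+20251222.cu128torch290cxx11abitrue.bba578-cp39-abi3-linux_x86_64.whl",
        "flash_attn_3-3.0.0b1+20251229.cu128torch290cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl",
    ]
    print(newest_wheel(wheels))  # prints the 20251229 Linux wheel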