🔥 Flash-Attention 3 Wheels Repository

Pre-built wheels for Flash-Attention 3, updated every two weeks

Generated on: 2026-04-03 02:03:57 UTC

📊 Download Statistics

Total downloads: 1,062,699
New downloads today: +13,132
Active wheels: 43

Installation Instructions

Point pip at the appropriate wheel index with the --find-links option:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/PATH/TO/INDEX
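The index paths listed below follow a cuXXX_torchYYY pattern (CUDA and PyTorch versions with the dots removed). As an unofficial sketch, a small helper can derive the right path from version strings; the function name is illustrative, not part of this repository:

```python
def index_path(cuda_version: str, torch_version: str) -> str:
    """Build this repository's wheel-index path from version strings,
    e.g. ("12.8", "2.8.0") -> "cu128_torch280"."""
    return (f"cu{cuda_version.replace('.', '')}"
            f"_torch{torch_version.replace('.', '')}")

# In practice the versions would come from torch itself:
#   import torch
#   cuda_version  = torch.version.cuda               # None on CPU-only builds
#   torch_version = torch.__version__.split("+")[0]  # strip any local tag
base = "https://windreamer.github.io/flash-attention3-wheels"
print(f"pip install flash_attn_3 --find-links {base}/{index_path('12.8', '2.8.0')}")
```

Pick the index whose CUDA version matches (or is not newer than) your driver's supported CUDA runtime, and whose PyTorch version matches your installed torch exactly.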

Available Wheel Indexes

Windows and Arm64 support, where available, is noted in parentheses.

CUDA 13.0, PyTorch 2.11.0 (Windows, Arm64)
3 wheels available • Last updated: 2026-03-30
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch2110

CUDA 13.0, PyTorch 2.10.0 (Windows, Arm64)
13 wheels available • Last updated: 2026-03-19
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch2100

CUDA 13.0, PyTorch 2.9.1 (Arm64)
16 wheels available • Last updated: 2026-03-03
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch291

CUDA 13.0, PyTorch 2.9.0
13 wheels available • Last updated: 2026-02-17
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch290

CUDA 12.9, PyTorch 2.10.0 (Arm64)
9 wheels available • Last updated: 2026-03-16
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch2100

CUDA 12.9, PyTorch 2.9.1 (Arm64)
9 wheels available • Last updated: 2026-03-03
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch291

CUDA 12.9, PyTorch 2.9.0 (Windows)
5 wheels available • Last updated: 2026-02-17
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch290

CUDA 12.9, PyTorch 2.8.0 (Windows, Arm64)
22 wheels available • Last updated: 2026-03-16
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch280

CUDA 12.8, PyTorch 2.11.0 (Windows, Arm64)
3 wheels available • Last updated: 2026-03-30
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch2110

CUDA 12.8, PyTorch 2.10.0 (Windows, Arm64)
8 wheels available • Last updated: 2026-03-19
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch2100

CUDA 12.8, PyTorch 2.9.1 (Windows)
18 wheels available • Last updated: 2026-03-16
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch291

CUDA 12.8, PyTorch 2.9.0 (Windows)
22 wheels available • Last updated: 2026-02-17
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch290

CUDA 12.8, PyTorch 2.8.0 (Windows)
36 wheels available • Last updated: 2026-03-30
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

CUDA 12.6, PyTorch 2.10.0
2 wheels available • Last updated: 2026-03-03
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch2100

CUDA 12.6, PyTorch 2.9.1
8 wheels available • Last updated: 2026-03-03
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch291

CUDA 12.6, PyTorch 2.9.0
13 wheels available • Last updated: 2026-02-17
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch290

CUDA 12.6, PyTorch 2.8.0
20 wheels available • Last updated: 2026-03-03
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch280

Quick Reference

Usage Examples

# Install for CUDA 12.8, PyTorch 2.8.0
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Install specific version
pip install flash_attn_3==3.0.0 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Upgrade existing installation
pip install --upgrade flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

🔥 Top Daily New Downloads

Wheel                                                                                      Daily new    Total
flash_attn_3-3.0.0b1+20251110.cu128torch280cxx11abitrue.c8abdd-cp39-abi3-linux_x86_64.whl     +7,956  895,109
flash_attn_3-3.0.0+20260302.cu128torch280cxx11abitrue.ceb109-cp39-abi3-linux_x86_64.whl       +1,356   30,831
flash_attn_3-3.0.0b1+20260112.cu129torch291cxx11abitrue.ea8f73-cp39-abi3-linux_x86_64.whl     +1,141   26,715
flash_attn_3-3.0.0+20260302.cu130torch2100cxx11abitrue.ceb109-cp39-abi3-linux_x86_64.whl        +884    4,833
flash_attn_3-3.0.0b1+20251229.cu128torch291cxx11abitrue.58fe37-cp39-abi3-linux_x86_64.whl       +584    2,702