🔥 Flash-Attention 3 Wheels Repository

Pre-built wheels for Flash-Attention 3, updated biweekly

Generated on: 2026-02-12 10:45:58 UTC

🚀 Windows Wheels Now Available!

We've successfully built Flash-Attention 3 wheels for Windows (CUDA 12 only for now) and for Arm CUDA SBSA platforms such as GH200.
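The install command is the same on every platform: pip selects the correct wheel from an index by platform tag (win_amd64, manylinux aarch64, and so on). For example, on a GH200 the command below resolves to the aarch64 wheel, assuming the index you point at lists one; the entries below flag Windows and Arm64 support per index, and cu129_torch280 used here carries both:

# Same command on x86-64 Linux, Windows, or Arm SBSA; pip picks the wheel by platform tag
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch280 --extra-index-url https://download.pytorch.org/whl/cu129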

Installation Instructions

Point pip at the appropriate wheel index with the --find-links option:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/PATH/TO/INDEX
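Not sure which index to use? Assuming PyTorch is already installed, you can print its version and the CUDA version it was built against, then pick the matching cuXXX_torchYYY index from the list below (the per-index commands also pass --extra-index-url so pip can fetch the matching PyTorch CUDA build):

# Print the installed PyTorch version and its CUDA build
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# e.g. "2.8.0+cu128 12.8" -> use the cu128_torch280 index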

Available Wheel Indexes

CUDA 13.0, PyTorch 2.10.0

Arm64 Support

4 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch2100 --extra-index-url https://download.pytorch.org/whl/cu130

CUDA 13.0, PyTorch 2.9.1

Arm64 Support

12 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch291 --extra-index-url https://download.pytorch.org/whl/cu130

CUDA 13.0, PyTorch 2.9.0

12 wheels available • Last updated: 2026-01-05

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch290 --extra-index-url https://download.pytorch.org/whl/cu130

CUDA 12.9, PyTorch 2.10.0

Arm64 Support

4 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch2100 --extra-index-url https://download.pytorch.org/whl/cu129

CUDA 12.9, PyTorch 2.9.1

Arm64 Support

6 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch291 --extra-index-url https://download.pytorch.org/whl/cu129

CUDA 12.9, PyTorch 2.9.0

Windows Support

4 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch290 --extra-index-url https://download.pytorch.org/whl/cu129

CUDA 12.9, PyTorch 2.8.0

Windows Support • Arm64 Support

16 wheels available • Last updated: 2026-02-12

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch280 --extra-index-url https://download.pytorch.org/whl/cu129

CUDA 12.8, PyTorch 2.9.1

Windows Support

14 wheels available • Last updated: 2026-01-19

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch291 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.8, PyTorch 2.9.0

Windows Support

20 wheels available • Last updated: 2026-01-19

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch290 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.8, PyTorch 2.8.0

Windows Support

26 wheels available • Last updated: 2026-01-19

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.6, PyTorch 2.9.1

6 wheels available • Last updated: 2026-01-05

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch291 --extra-index-url https://download.pytorch.org/whl/cu126

CUDA 12.6, PyTorch 2.9.0

12 wheels available • Last updated: 2026-01-05

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch290 --extra-index-url https://download.pytorch.org/whl/cu126

CUDA 12.6, PyTorch 2.8.0

18 wheels available • Last updated: 2026-01-05

Direct pip command:
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch280 --extra-index-url https://download.pytorch.org/whl/cu126

Quick Reference

Usage Examples

# Install for CUDA 12.8, PyTorch 2.8.0
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Install specific version
pip install flash_attn_3==3.0.0 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Upgrade existing installation
pip install --upgrade flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280
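
To sanity-check an install, pip show confirms the package metadata. The import line below assumes the upstream Flash-Attention 3 convention of exposing its Python API as flash_attn_interface; if that name differs for your wheel, list the installed files with pip show -f flash_attn_3:

# Confirm the package is installed and inspect its metadata
pip show flash_attn_3

# Smoke-test the import (module name assumed from upstream FA3)
python -c "import flash_attn_interface; print('flash_attn_3 import OK')"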