🔥 Flash-Attention 3 Wheels Repository

Pre-built wheels for Flash-Attention 3, updated weekly

Generated on: 2025-12-29 05:20:10 UTC

Update

🚀 Windows Wheels Now Available!

We've successfully built Flash-Attention 3 wheels for Windows (CUDA 12.8 only for now).

Installation Instructions

Point pip at the wheel index that matches your CUDA and PyTorch versions via --find-links:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/PATH/TO/INDEX
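
For example, to use the CUDA 12.8 + PyTorch 2.8.0 index listed below (a sketch of the general pattern: substitute the index path for your CUDA/PyTorch combination, and add the matching PyTorch extra index so dependencies resolve against the same CUDA build):

# Example: CUDA 12.8, PyTorch 2.8.0
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280 --extra-index-url https://download.pytorch.org/whl/cu128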

Available Wheel Indexes

CUDA 13.0, PyTorch 2.9.1

5 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch291 --extra-index-url https://download.pytorch.org/whl/cu130

CUDA 13.0, PyTorch 2.9.0

11 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu130_torch290 --extra-index-url https://download.pytorch.org/whl/cu130

CUDA 12.9, PyTorch 2.8.0

6 wheels available • Last updated: 2025-10-21

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu129_torch280 --extra-index-url https://download.pytorch.org/whl/cu129

CUDA 12.8, PyTorch 2.9.1

Windows Support

10 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch291 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.8, PyTorch 2.9.0

Windows Support

16 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch290 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.8, PyTorch 2.8.0

Windows Support

22 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280 --extra-index-url https://download.pytorch.org/whl/cu128

CUDA 12.6, PyTorch 2.9.1

5 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch291 --extra-index-url https://download.pytorch.org/whl/cu126

CUDA 12.6, PyTorch 2.9.0

11 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch290 --extra-index-url https://download.pytorch.org/whl/cu126

CUDA 12.6, PyTorch 2.8.0

17 wheels available • Last updated: 2025-12-29

Direct pip command:

pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu126_torch280 --extra-index-url https://download.pytorch.org/whl/cu126

Quick Reference

Usage Examples

# Install for CUDA 12.8, PyTorch 2.8.0
pip install flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Install specific version
pip install flash_attn_3==3.0.0 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280

# Upgrade existing installation
pip install --upgrade flash_attn_3 --find-links https://windreamer.github.io/flash-attention3-wheels/cu128_torch280
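
After installing, you can confirm that the wheel and your environment line up. This is a minimal sketch: pip show and the torch attributes below are standard, but it does not assume anything about the internal module layout of the flash_attn_3 package.

# Check the installed wheel and the CUDA/PyTorch pairing it was built against
pip show flash_attn_3
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"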