Releases: mjun0812/flash-attention-prebuild-wheels

v0.0.2

16 Nov 17:41
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 2.4.3, 2.5.6, 2.6.3, 2.7.0.post2 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.1 | 11.8.0, 12.1.1, 12.4.1 |
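A wheel built for one row of this matrix can be installed straight from the release assets. A minimal sketch, assuming the wheel filenames follow a `flash_attn-{version}+cu{cuda}torch{torch}-{abi}-{abi}-linux_x86_64.whl` pattern (check the actual asset names on the release page before installing):

```shell
# Pick one combination from the v0.0.2 compatibility matrix above.
FLASH_ATTN=2.6.3   # Flash-Attention version
CUDA=124           # CUDA 12.4 (major + minor, no dot)
TORCH=2.4          # PyTorch series
PY=cp312           # CPython 3.12 ABI tag

# Assumed naming scheme; verify against the release assets.
WHEEL="flash_attn-${FLASH_ATTN}+cu${CUDA}torch${TORCH}-${PY}-${PY}-linux_x86_64.whl"
echo "$WHEEL"

# Install directly from the GitHub release (uncomment to run):
# pip install "https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.2/${WHEEL}"
```

The key point is matching all three axes at once: the wheel only works when the installed PyTorch and CUDA versions agree with the tags baked into the filename.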

v0.0.1

01 Nov 22:26
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 1.0.9, 2.4.3, 2.5.6, 2.5.9, 2.6.3 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.0 | 11.8.0, 12.1.1, 12.4.1 |

v0.0.0

27 Oct 10:54
  • Flash-Attention
    • 2.4.3
    • 2.5.6
    • 2.5.9
    • 2.6.3
  • Python
    • 3.11
    • 3.12
  • PyTorch
    • 2.0.1
    • 2.1.2
    • 2.2.2
    • 2.3.1
    • 2.4.1
    • 2.5.0
  • CUDA
    • 11.8.0
    • 12.1.1
    • 12.4.1