Pull requests: Dao-AILab/flash-attention
Fix isinstance checks when ColumnParallelLinear is None (#2050, opened Dec 5, 2025 by ailuntz)
Fix Windows linking for FlashAttention 3 using Ninja response files (#2047, opened Dec 4, 2025 by windreamer)
[Cute,Fwd] Extend score_mod to variable sequence length (#2043, opened Dec 3, 2025 by reubenconducts)
[Cute,Fwd] Add scheduler metadata kernel for varlen and dynamic splits into sm100 fwd (#2027, opened Nov 22, 2025 by jayhshah)
[Cute,Fwd/Bwd,Sm12x] [WIP/DRAFT/HELP] cute FA for sm12x (#2017, opened Nov 16, 2025 by johnnynunez, Draft)
Add flash_attn_varlen_qkvpacked_func to hopper (flash_attn_3) (#1902, opened Sep 22, 2025 by foreverYoungGitHub)
Fix race condition bug in cute _flash_attn_fwd in multi-GPU environments (#1793, opened Aug 1, 2025 by beiw-nv)