
Commit c87575d

Commit message: update

1 parent 3cb66e8 · commit c87575d

File tree: 1 file changed (+0, -1 lines)


src/diffusers/models/transformers/transformer_flux.py

Lines changed: 0 additions & 1 deletion
@@ -360,7 +360,6 @@ def __init__(
         self.norm1 = AdaLayerNormZero(dim)
         self.norm1_context = AdaLayerNormZero(dim)
 
-        # Use specialized FluxAttention instead of generic Attention
         self.attn = FluxAttention(
             query_dim=dim,
             cross_attention_dim=None,
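For context, the hunk above sits inside FluxTransformerBlock.__init__ in diffusers' transformer_flux.py, where the block builds its AdaLayerNormZero norms and a specialized FluxAttention layer (query_dim=dim, cross_attention_dim=None) rather than the generic Attention class. The sketch below is not part of this commit; it only illustrates how one might instantiate the touched block and inspect its attention submodule. It assumes a recent diffusers build where transformer_flux.py exposes FluxTransformerBlock with (dim, num_attention_heads, attention_head_dim) as its leading constructor arguments; the concrete sizes are hypothetical values chosen to mirror typical Flux.1 settings (24 heads x 128 head dim = 3072), not values taken from this diff.

# Minimal sketch, assuming a recent diffusers release that defines both
# FluxTransformerBlock and FluxAttention in transformer_flux.py.
from diffusers.models.transformers.transformer_flux import FluxTransformerBlock

block = FluxTransformerBlock(
    dim=3072,                # hidden size (hypothetical: 24 * 128)
    num_attention_heads=24,  # hypothetical head count
    attention_head_dim=128,  # hypothetical per-head dimension
)

# The attn submodule built in __init__ (shown in the hunk above) should be
# the specialized FluxAttention, configured with query_dim=dim and
# cross_attention_dim=None.
print(type(block.attn).__name__)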

0 commit comments
