1 parent 3e2d292 commit 803d0d1
1 file changed
src/diffusers/models/transformers/transformer_photon.py
@@ -427,7 +427,7 @@ def forward(
         temb: torch.Tensor,
         image_rotary_emb: torch.Tensor,
         attention_mask: Optional[torch.Tensor] = None,
-        **kwargs: dict[str, Any],
+        **kwargs: Dict[str, Any],
     ) -> torch.Tensor:
         r"""
         Runs modulation-gated cross-attention and MLP, with residual connections.
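For context, here is a minimal, self-contained sketch of what the one-line change amounts to. The function name `example_forward` and its placeholder body are hypothetical and not taken from diffusers; only the parameter annotations mirror the signature shown in the diff. The commit does not state its motivation, but a common reason for this swap is that on Python versions before 3.9, subscripting the builtin `dict` (as in `dict[str, Any]`) in an annotation raises a TypeError at function-definition time unless `from __future__ import annotations` is in effect, whereas `typing.Dict` works on all supported versions and matches the `typing` aliases commonly used elsewhere in the module.

    # Illustrative sketch only; not the diffusers implementation.
    from typing import Any, Dict, Optional

    import torch


    def example_forward(
        temb: torch.Tensor,
        image_rotary_emb: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        **kwargs: Dict[str, Any],  # annotation as introduced by this commit
    ) -> torch.Tensor:
        # Placeholder body; the real block applies modulation-gated
        # cross-attention and an MLP with residual connections.
        return temb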