Commit da8e1fe (parent: 852c3d8)

support sage attention

File tree: 1 file changed (+7, -0)

examples/wanvideo/README.md

Lines changed: 7 additions & 0 deletions
````diff
@@ -10,6 +10,13 @@ cd DiffSynth-Studio
 pip install -e .
 ```
 
+Wan-Video supports multiple attention implementations. If more than one of the following is installed, the highest-priority one will be used:
+
+* [Flash Attention 3](https://github.com/Dao-AILab/flash-attention)
+* [Flash Attention 2](https://github.com/Dao-AILab/flash-attention)
+* [Sage Attention](https://github.com/thu-ml/SageAttention)
+* [torch SDPA](https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html) (default; `torch>=2.5.0` is recommended)
+
 ## Inference
 
 ### Wan-Video-1.3B-T2V
````
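The priority-based selection described in the added README text can be sketched as a simple availability probe. This is a hypothetical illustration, not the commit's actual implementation: the backend names and the assumption that Flash Attention 3 ships as `flash_attn_interface`, Flash Attention 2 as `flash_attn`, and Sage Attention as `sageattention` are the author's guesses here, with torch SDPA as the always-available fallback.

```python
# Hypothetical sketch of priority-ordered attention backend selection.
# Package names below are assumptions, not taken from the commit itself.
import importlib.util


def pick_attention_backend():
    """Return the name of the highest-priority attention backend installed."""
    # Ordered highest priority first; torch SDPA is the default fallback.
    candidates = [
        ("flash_attn_3", "flash_attn_interface"),  # Flash Attention 3
        ("flash_attn_2", "flash_attn"),            # Flash Attention 2
        ("sage_attn", "sageattention"),            # Sage Attention
    ]
    for name, module in candidates:
        # find_spec checks importability without actually importing the package
        if importlib.util.find_spec(module) is not None:
            return name
    return "torch_sdpa"  # always available with a modern torch install
```

Probing with `importlib.util.find_spec` rather than a bare `import` avoids paying the import cost for backends that end up unused.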
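For reference, the default torch SDPA path linked in the list is a single function call; the tensor shapes below are an arbitrary example, not values used by Wan-Video.

```python
import torch
import torch.nn.functional as F

# Example shapes: (batch, num_heads, sequence_length, head_dim)
q = torch.randn(1, 8, 16, 64)
k = torch.randn(1, 8, 16, 64)
v = torch.randn(1, 8, 16, 64)

# Fused scaled dot-product attention; dispatches to the fastest
# kernel available on the current device (flash, mem-efficient, or math).
out = F.scaled_dot_product_attention(q, k, v)
```

The output has the same shape as the query tensor.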
