Update README.md
Demo example (only the first 32 frames are original; the rest are generated)

This is a simple LoRA for the Wan2.2 TI2V transformer.
First test: rank = 64, alpha = 128.
It was trained on around 10k videos, with input video frames 16-64 and output video frames 41-81.
Mostly the attention processor has been changed for this approach.
See the <a href="https://github.com/TheDenk/wan2.2-video-continuation">Github code</a>.

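For orientation, here is a minimal sketch of how this LoRA could be attached to a diffusers Wan2.2 pipeline. The base checkpoint id, the LoRA path, and all generation settings below are assumptions for illustration only; the actual video-continuation inference (conditioning on the first 16-64 input frames via the modified attention processor) uses the custom code from the Github repository linked above.

```python
# Hypothetical sketch: load a base Wan2.2 pipeline and attach this LoRA via
# diffusers. Plain text-to-video only; real video continuation (conditioning
# on the input frames) requires the custom attention processor / pipeline
# from the linked repository.
import torch
from diffusers import WanPipeline
from diffusers.utils import export_to_video

pipe = WanPipeline.from_pretrained(
    "Wan-AI/Wan2.2-TI2V-5B-Diffusers",  # assumed base checkpoint id
    torch_dtype=torch.bfloat16,
)
pipe.load_lora_weights("path/to/this/lora")  # placeholder path to the LoRA weights
pipe.to("cuda")

# Simple generation call just to verify the LoRA loads and runs;
# frame count chosen to match the 41-81 output-frame range above.
video = pipe(
    prompt="a person walking along a beach at sunset",
    num_frames=81,
    height=480,
    width=832,
).frames[0]
export_to_video(video, "output.mp4", fps=16)
```
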
### Models

| Model | Best input frames count | Best output frames count | Resolution | Huggingface Link |