FP8 Pruned Model (E5M2)

Converted from: https://huggingface.co/spacepxl/Wan2.1-VAE-upscale2x
File: Wan2.1_VAE_upscale2x_imageonly_real_v1.safetensors → Wan2.1_VAE_upscale2x_imageonly_real_v1-fp8-e5m2.safetensors

Quantization: FP8 (E5M2)
Converted on: 2025-12-01 06:49:45
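E5M2 is an 8-bit floating-point format with 5 exponent bits and 2 mantissa bits, trading precision for the wider dynamic range of the two FP8 variants. The sketch below shows one way such a conversion could be done with PyTorch and the safetensors library; it is illustrative only and is not the exact script used to produce this file. It assumes PyTorch 2.1+ (which exposes `torch.float8_e5m2`), and the file names are taken from this card.

```python
# Minimal sketch of an FP8 (E5M2) weight conversion.
# Assumes: PyTorch >= 2.1 (torch.float8_e5m2) and the safetensors package.
import torch
from safetensors.torch import load_file, save_file

SRC = "Wan2.1_VAE_upscale2x_imageonly_real_v1.safetensors"
DST = "Wan2.1_VAE_upscale2x_imageonly_real_v1-fp8-e5m2.safetensors"

state_dict = load_file(SRC)

converted = {}
for name, tensor in state_dict.items():
    # Cast only floating-point tensors; leave integer/bool buffers untouched.
    if tensor.is_floating_point():
        converted[name] = tensor.to(torch.float8_e5m2)
    else:
        converted[name] = tensor

save_file(converted, DST)
```

At load time, inference code typically upcasts these FP8 weights back to fp16 or bf16 before computation, so the main benefit of the conversion is a roughly halved file size and memory footprint rather than faster kernels.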
