An fp8 (e4m3fn) version of Wan2.1 VACE 14B, converted from the repackaged release by Comfy-Org

UPDATE: Added an fp8 e5m2 version, converted from the original 58.9 GB model

Credits:

Original VACE model from Wan-AI:

https://huggingface.co/Wan-AI/Wan2.1-VACE-14B

Repackaged fp16 from Comfy-Org:

https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged

ComfyUI node used for the fp8 conversion:

https://github.com/Shiba-2-shiba/ComfyUI_DiffusionModel_fp8_converter

e5m2 conversion scripts from:

https://huggingface.co/phazei
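The two fp8 variants offered here trade precision for range differently: e4m3fn spends bits on mantissa (finer precision, max finite value 448), while e5m2 spends them on exponent (coarser precision, range up to 57344). A minimal sketch of how those limits fall out of the bit layouts; the helper `fp8_max` is hypothetical, written only to illustrate the arithmetic:

```python
def fp8_max(exp_bits: int, man_bits: int, finite_only: bool) -> float:
    """Largest finite value of an fp8 format (1 sign + exp_bits + man_bits)."""
    bias = 2 ** (exp_bits - 1) - 1
    if finite_only:
        # e4m3fn-style: the all-ones exponent still encodes finite values;
        # only the all-ones mantissa pattern is NaN, so the top mantissa
        # step is sacrificed
        exp = (2 ** exp_bits - 1) - bias
        frac = 1 + (2 ** man_bits - 2) / 2 ** man_bits
    else:
        # IEEE-style (e5m2): the all-ones exponent is reserved for inf/NaN
        exp = (2 ** exp_bits - 2) - bias
        frac = 1 + (2 ** man_bits - 1) / 2 ** man_bits
    return frac * 2 ** exp

print(fp8_max(4, 3, True))   # e4m3fn -> 448.0
print(fp8_max(5, 2, False))  # e5m2   -> 57344.0
```

In practice this is why e5m2 is often preferred when weights or activations have outliers that would clip at e4m3fn's 448 ceiling, at the cost of one fewer mantissa bit.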
