Safetensors version?

#2
by Kamikaze-88 - opened

Hi, how was this achieved? Did you merge the VACE module into the base SkyReels model somehow?

I'd like an fp8 safetensors version that I can use with block swapping; I prefer fp8 safetensors over GGUFs for the small speed boost.
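
(For anyone curious about the mechanics: conceptually, a merge like this just loads both checkpoints and writes the VACE module's tensors into the same file as the base weights. A minimal sketch, assuming the VACE module ships as a standalone safetensors file and its keys don't collide with the base model's; all file names below are hypothetical, not the actual repo files.)

```python
# Minimal merge sketch (hypothetical file names; not the repo's actual script).
from safetensors.torch import load_file, save_file

base_path = "skyreels_v2_dit_14b_720p.safetensors"   # hypothetical base SkyReels checkpoint
vace_path = "wan2.1_vace_module_14b.safetensors"     # hypothetical standalone VACE module
out_path  = "skyreels_v2_14b_720p_vace_merged.safetensors"

base = load_file(base_path)   # dict[str, torch.Tensor]
vace = load_file(vace_path)

merged = dict(base)
for name, tensor in vace.items():
    # The VACE module contributes additional blocks/keys; keep the base
    # model's weights wherever a key already exists.
    merged.setdefault(name, tensor)

save_file(merged, out_path)
print(f"wrote {len(merged)} tensors to {out_path}")
```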

QuantStack org

In your case it's probably easier to just use the SkyReels fp8 model with Kijai's fp8 VACE and wrapper, since that also allows block swapping. Though in theory I can create it for you, yes.

That would be great. I'd like to keep things on my SSD when working with AI stuff and switch between native and wrapper workflows, but having so many models eats up a lot of storage, so I'd prefer a single combined file that works in both native and wrapper.

QuantStack org

Though, does the original VACE without the addon even work in the wrapper? If it does, I'll do it for you (:

QuantStack org

I already patched it, but I can only upload it later on; it will be available here: https://huggingface.co/wsbagnsv1/test
The correct model is Wan2_1-SkyReels-V2-T2V-14B-720P-VACE_fp8_e4m3fn.safetensors.
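
(Side note on how an fp8_e4m3fn file like that can be produced: it is essentially a dtype cast of the merged weights before saving. A rough sketch, assuming a PyTorch build with float8 support; the source file name is hypothetical, and exactly which tensors to keep in higher precision is a judgment call.)

```python
# Rough fp8 conversion sketch (source path is hypothetical).
import torch
from safetensors.torch import load_file, save_file

src = "Wan2_1-SkyReels-V2-T2V-14B-720P-VACE_bf16.safetensors"   # hypothetical merged source
dst = "Wan2_1-SkyReels-V2-T2V-14B-720P-VACE_fp8_e4m3fn.safetensors"

state = load_file(src)
out = {}
for name, t in state.items():
    # Cast large floating-point weight tensors to fp8; keep small 1-D tensors
    # (biases, norm scales) in their original dtype.
    if t.is_floating_point() and t.ndim >= 2:
        out[name] = t.to(torch.float8_e4m3fn)
    else:
        out[name] = t

save_file(out, dst)
```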

Yes it does, at least for me :) Also, when you hover over the VACE model slot it gives a hint.

What a legend! Thank you :)


QuantStack org

Good news, it's online (;

Thank you so much! It's working :)

Kamikaze-88 changed discussion status to closed
