workflow error?
Requested to load WAN21
loaded completely 12696.800530926514 10424.687622070312 True
0%| | 0/2 [00:00<?, ?it/s]
!!! Exception during processing !!! Given groups=1, weight of size [5120, 36, 1, 2, 2], expected input[1, 32, 21, 96, 96] to have 36 channels, but got 32 channels instead
Traceback (most recent call last):
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\execution.py", line 427, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\execution.py", line 270, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\execution.py", line 244, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\execution.py", line 232, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1550, in sample
return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1483, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-impact-pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
raise e
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-impact-pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 45, in sample
samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1143, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1033, in sample
return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1018, in sample
output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 111, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 986, in outer_sample
output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 969, in inner_sample
samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 111, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 748, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 985, in sample_lcm
denoised = model(x, sigmas[i] * s_in, **extra_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 400, in call
out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 949, in call
return self.predict_noise(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 952, in predict_noise
return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 380, in sampling_function
out = calc_cond_batch(model, conds, x, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 206, in calc_cond_batch
return executor.execute(model, conds, x_in, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 111, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 325, in calc_cond_batch
output = model.apply_model(input_x, timestep, **c).chunk(batch_chunks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 152, in apply_model
return comfy.patcher_extension.WrapperExecutor.new_class_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 111, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 190, in _apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\ldm\wan\model.py", line 563, in forward
return self.forward_orig(x, timestep, context, clip_fea=clip_fea, freqs=freqs, transformer_options=transformer_options, **kwargs)[:, :, :t, :h, :w]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\ldm\wan\model.py", line 503, in forward_orig
x = self.patch_embedding(x.float()).to(x.dtype)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\ops.py", line 126, in forward
return self.forward_comfy_cast_weights(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\ComfyUI\comfy\ops.py", line 122, in forward_comfy_cast_weights
return self._conv_forward(input, weight, bias)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "A:\COMFYUI_3\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 720, in _conv_forward
return F.conv3d(
^^^^^^^^^
RuntimeError: Given groups=1, weight of size [5120, 36, 1, 2, 2], expected input[1, 32, 21, 96, 96] to have 36 channels, but got 32 channels instead
Prompt executed in 29.24 seconds
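For reference, the failure is a plain channel mismatch at the Conv3d patch embedding: the loaded weights expect 36 input channels, but the latent being fed in only has 32. A minimal sketch that reproduces the same RuntimeError (layer sizes taken from the log above; this does not reproduce the actual Wan internals):
```python
import torch
import torch.nn as nn

# Patch embedding as reported in the traceback: in_channels=36, weight [5120, 36, 1, 2, 2].
patch_embedding = nn.Conv3d(36, 5120, kernel_size=(1, 2, 2), stride=(1, 2, 2))

# Latent actually produced by the workflow: only 32 channels.
latent = torch.randn(1, 32, 21, 96, 96)

try:
    patch_embedding(latent)
except RuntimeError as e:
    # "... expected input[1, 32, 21, 96, 96] to have 36 channels, but got 32 channels instead"
    print(e)
```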
Did you update your ComfyUI?
Please update everything to the latest version.
Update done... same error.
Did you install comfy-gguf?
sure...
This is probably because ComfyUI Desktop is still on ComfyUI v0.3.45, which is behind https://github.com/comfyanonymous/ComfyUI/releases/tag/v0.3.46 with Wan 2.2 support.
ComfyUI Version: v0.3.46 | Released on '2025-07-28'
Having the same issue. All updated and getting the same errors.
It's the model; it's missing the I2V layers.
Which Q* quant is giving errors? It's working fine with the provided workflow on both a 3090 and a 5090.
So far I've tried Q4_k_m and Q4_k_s of the I2V models. I got the same error with both.
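If anyone wants to verify whether a given GGUF quant actually contains the I2V patch embedding (the 36-channel conv from the error above), a rough sketch using the gguf Python package; the file name is hypothetical and the exact tensor key may carry a prefix depending on how the quant was converted:
```python
from gguf import GGUFReader  # pip install gguf

reader = GGUFReader("wan2.2_i2v_14b_Q4_K_M.gguf")  # hypothetical file name
for tensor in reader.tensors:
    if "patch_embedding" in tensor.name:
        # An I2V checkpoint should have 36 input channels here; a T2V-only one has 16.
        # Note: GGUF may list the dims in reverse order relative to PyTorch.
        print(tensor.name, tensor.shape)
```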
Are you using the 2.1 VAE? This uses the 2.1 VAE, not the 2.2.
I can't get it to work at all... Some say it's due to Sage Attention, but I don't really believe that.
I switched over to the QuantStack version and it worked like a charm.
Yes, working fine!
The QuantStack version didn't actually work for me, but removing the custom node flow2-wan-video seemed to do the trick.