vram

#2
by cromolog - opened

How much VRAM does it need to run? I installed it on an RTX 4080 with 32 GB of RAM and 12 GB of VRAM, and it always gives me an out-of-memory error. I have run Wan 2.1 locally on my PC without any problem.

All experiments except for training the LoRA were performed on a 4090 with 24 GiB of VRAM. You may need further quantization on your end, as ModelScope T2V has fewer parameters than Wan.
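As a rough sanity check, you can estimate the VRAM needed just for the model weights from the parameter count and the bytes per parameter at each precision. This is only a back-of-the-envelope sketch (the 5B parameter count below is hypothetical, and it ignores activations, the VAE, and the text encoder, which add several more GiB):

```python
def weight_vram_gib(n_params: float, bytes_per_param: float) -> float:
    """GiB needed to hold the model weights alone at a given precision."""
    return n_params * bytes_per_param / 2**30

# Illustrative only: substitute your model's actual parameter count.
for dtype, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("nf4", 0.5)]:
    gib = weight_vram_gib(5e9, nbytes)
    print(f"{dtype}: {gib:.1f} GiB for a hypothetical 5B-param model")
```

For a hypothetical 5B-parameter model this gives roughly 9.3 GiB in fp16, which already leaves little headroom on a 12 GB card once activations are counted, so dropping to 8-bit or 4-bit quantization (or offloading parts of the pipeline to CPU) is the usual workaround.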
