Model size and multiple GPUs?
#10
by just-james
I set up the environment using the conda instructions in the model card and ran the command below. The model card says this should allocate 45 GB of memory, but it is not recognizing one of my video cards. Is 45 GB the actual VRAM needed to run the example at the smaller resolution?
```bash
python3 sample_video.py \
    --video-size 544 960 \
    --video-length 129 \
    --infer-steps 30 \
    --prompt "a tiger is running, realistic." \
    --flow-reverse \
    --seed 0 \
    --use-cpu-offload \
    --save-path ./results
```
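(A general PyTorch sanity check, not something from the model card: listing the devices the process can see will tell you whether a driver issue or a `CUDA_VISIBLE_DEVICES` setting is hiding the second card, rather than the sampling script itself.)

```python
# General PyTorch check (not specific to HunyuanVideo): list the GPUs
# this process can see. A card missing here points to a driver problem
# or a CUDA_VISIBLE_DEVICES setting, not to sample_video.py.
import torch

print(f"CUDA available: {torch.cuda.is_available()}")
print(f"Visible GPUs:   {torch.cuda.device_count()}")
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"  [{i}] {props.name}, {props.total_memory / 1024**3:.1f} GB")
```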
It supports one GPU at the moment.
Can you enable support for multiple GPUs? There are enthusiasts with multiple 24 GB GPUs that could run the smaller version locally, but not many people own a single GPU with 48 GB of VRAM like an RTX 6000.
Not that I am aware of.
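For anyone who wants to experiment anyway: `sample_video.py` does not expose multi-GPU inference, but the generic Hugging Face `accelerate` pattern for sharding one large module layer-wise across cards looks roughly like the sketch below. This is an assumption-laden illustration, not HunyuanVideo's API; the toy `nn.Sequential` stands in for the real transformer, and the memory caps are guesses.

```python
# Hedged sketch only: HunyuanVideo does not officially support this.
# It shows the generic `accelerate` pattern for splitting a large
# PyTorch module across two GPUs; names and values are assumptions.
import torch.nn as nn
from accelerate import infer_auto_device_map, dispatch_model

# Toy stand-in for a large diffusion transformer (hypothetical).
model = nn.Sequential(*[nn.Linear(4096, 4096) for _ in range(8)])

# Cap each 24 GB card below its full size to leave room for activations
# (caps are guesses; this requires two visible CUDA devices).
device_map = infer_auto_device_map(
    model, max_memory={0: "22GiB", 1: "22GiB", "cpu": "64GiB"}
)
model = dispatch_model(model, device_map=device_map)
print(device_map)  # shows which layers landed on which device
```

Whether the HunyuanVideo pipeline would tolerate this kind of layer-wise split is untested here; custom attention kernels and the CPU-offload path often assume a single device.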