GPU memory usage is too large
#4 by songispm
I am using the code from https://huggingface.co/spaces/THUDM-HF-SPACE/CogView4/tree/main
It cannot load on a 4090.
Error:
CUDA out of memory. Tried to allocate 32.00 MiB. GPU 0 has a total capacity of 23.63 GiB of which 41.12 MiB is free. Including non-PyTorch memory, this process has 23.26 GiB memory in use. Of the allocated memory 22.87 GiB is allocated by PyTorch, and 15.43 MiB is reserved by PyTorch but unallocated.
How can I fix it?
Well, the model files are over 30 GB. Waiting for the quantized version.
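In the meantime, one possible workaround is CPU offloading, which moves each submodule to the GPU only while it is running instead of keeping the whole pipeline resident in VRAM. Below is a minimal sketch, assuming CogView4 is exposed in a recent diffusers release as `CogView4Pipeline` and published as `THUDM/CogView4-6B` (both are assumptions based on the space name; adjust to whatever the space code actually loads):

```python
# Hedged sketch of an OOM workaround; CogView4Pipeline and the
# "THUDM/CogView4-6B" model id are assumptions, not taken from the space code.
import torch
from diffusers import CogView4Pipeline

pipe = CogView4Pipeline.from_pretrained(
    "THUDM/CogView4-6B",
    torch_dtype=torch.bfloat16,  # half-precision weights roughly halve VRAM use
)

# Offload submodules to CPU and move them to the GPU only while they execute.
# This trades inference speed for a much lower peak VRAM footprint.
pipe.enable_model_cpu_offload()

image = pipe(
    prompt="a photo of a cat wearing sunglasses",
    num_inference_steps=50,
).images[0]
image.save("output.png")
```

If that is still not enough, `pipe.enable_sequential_cpu_offload()` offloads at a finer granularity and uses even less VRAM, at a further speed cost.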