Out of Mem for 4090-24G
#1 by GUESSGUO - opened
For the current inference code, how much GPU memory is required? 24 GB seems to be insufficient.
Thanks for your comment : ) The code is currently optimized only for 80 GB GPUs. We'll be releasing the full inference code later today — please stay tuned!
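For a rough sense of why 24 GB may not be enough, a back-of-envelope estimate of the weight footprint alone can help. The sketch below assumes a hypothetical parameter count (the thread does not state the model's actual size) and ignores activations and KV cache, which add further memory on top:

```python
def weights_gib(n_params: float, bytes_per_param: int) -> float:
    """Approximate GPU memory (GiB) for model weights alone.

    Ignores activations, KV cache, and framework overhead, so the
    real requirement during inference is strictly higher.
    """
    return n_params * bytes_per_param / 1024**3

# Hypothetical 30B-parameter model (actual size is an assumption, not from the thread):
fp16_gib = weights_gib(30e9, 2)   # fp16/bf16: 2 bytes per parameter
print(f"fp16 weights: ~{fp16_gib:.1f} GiB")  # well above a 24 GB card
```

Under this assumption, the weights alone already exceed a 24 GB card, which is consistent with the maintainers targeting 80 GB GPUs.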
When will the full inference code be released?