
Out of memory on a 4090 (24 GB)

#1
by GUESSGUO - opened

For the current inference code, how much GPU memory is required? 24 GB doesn't seem to be enough.

Beijing Academy of Artificial Intelligence org

Thanks for your comment :) The code is currently optimized only for 80 GB GPUs. We'll be releasing the full inference code later today, so please stay tuned!
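Until the optimized code lands, a rough back-of-envelope estimate shows why a 24 GB card can run out of memory: the weights alone can exceed it, before activations and KV-cache buffers are counted. A minimal sketch, assuming a hypothetical 14B-parameter model (the real parameter count of this model may differ):

```python
# Back-of-envelope GPU memory estimate for the model weights alone.
# The 14B parameter count below is a hypothetical example, not the
# actual size of this model.

def weight_memory_gib(n_params: float, bytes_per_param: int) -> float:
    """Memory needed just to hold the weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

n_params = 14e9  # hypothetical 14B-parameter model

for dtype, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    print(f"{dtype}: {weight_memory_gib(n_params, nbytes):.1f} GiB")
```

Loading in fp16 roughly halves the weight footprint relative to fp32, and 8-bit quantization halves it again, but inference still needs several extra GiB for activations and the KV cache on top of the weights.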

When will the full inference code be released?
