Spaces: Running on Zero
What specifications does this model need? #2
by hieuxinhe - opened
Hi, great work!
Could you please provide the basic specifications needed to run inference with this model?
(I've had some problems running this model on a T4 with 18GB.)
Hello, you need more than roughly 18GB of VRAM for single-image inference. Are you getting an OOM error?
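In case it helps, here is a minimal sketch for checking how much VRAM is actually free before inference and for enabling the usual diffusers memory-saving options. The pipeline class and checkpoint ID below are placeholders, not this model's actual loading code; check the repository for the real setup.

```python
# Minimal sketch: check free VRAM and apply common memory-saving options.
# The checkpoint ID below is a placeholder, not this model's actual repo.
import torch
from diffusers import DiffusionPipeline

free_bytes, total_bytes = torch.cuda.mem_get_info()
print(f"GPU VRAM: {free_bytes / 1e9:.1f} GB free / {total_bytes / 1e9:.1f} GB total")

pipe = DiffusionPipeline.from_pretrained(
    "some-user/some-checkpoint",      # placeholder checkpoint ID
    torch_dtype=torch.float16,        # half precision roughly halves VRAM use
)
pipe.enable_model_cpu_offload()       # keep idle sub-models on CPU, trading speed for VRAM
pipe.enable_attention_slicing()       # compute attention in slices to cut peak memory
```

With CPU offload and attention slicing enabled, peak VRAM usage can drop well below the full-precision requirement, at the cost of slower inference.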
@yisol Could you please specify the required memory and GPU? Can we clone this model to run it locally? I'm using 16GB of RAM and a 24GB GPU and still can't launch the app.
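For running locally, a rough sketch of downloading a Space's files with huggingface_hub is below. The repo ID is an assumption (placeholder); substitute the actual Space ID, then install its requirements.txt and run its app entry point (typically app.py).

```python
# Rough sketch: download a Space's files locally with huggingface_hub.
# The repo_id below is a placeholder; use the actual Space's ID.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="user/space-name",  # placeholder Space ID
    repo_type="space",
)
print("Space files downloaded to:", local_dir)
```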