Not able to load in oobabooga webui
Hello,
I have downloaded this model, but when I try to load it in the webui my PC hangs: no response, no error message. I just have to hard-reboot the PC. Could you please tell me how to load it?
How many GPUs do you have, and how much VRAM on each? It is very likely that loading the model into VRAM kicked your PC's display program out of VRAM.
You would need to limit the VRAM for each GPU in the Model section of textgen by dragging the VRAM slider.
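If you prefer launching from the command line, the slider corresponds to text-generation-webui's `--gpu-memory` flag. A possible invocation for a machine with two 24 GB GPUs might look like this (the per-GPU limits and model name are illustrative, leaving headroom for the display server):

```shell
# Cap VRAM usage per GPU below the 24 GiB maximum so the desktop
# compositor keeps enough memory (20 GiB per GPU is an example value,
# and "your-model-name" is a placeholder).
python server.py --gpu-memory 20 20 --model your-model-name
```

On a single-GPU system you would pass just one value, e.g. `--gpu-memory 20`.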
Could you please mention the hardware requirements for it to work smoothly?
I am working with a 2x3090 (24 GB VRAM each) Ubuntu PC with 256 GB RAM.
You would need at least 24 GB of VRAM (a single 3090 or 4090) to run 30B models.
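As a rough sanity check on why 24 GB is enough for a 30B model, here is a back-of-envelope estimate. The byte-per-parameter figures and the 20% overhead for activations, KV cache, and CUDA context are illustrative assumptions, not exact measurements:

```python
# Back-of-envelope VRAM estimate for loading a model for inference.
# All constants here are rough assumptions, not measured values.

def vram_gb(params_billion: float, bytes_per_param: float,
            overhead: float = 1.2) -> float:
    """Weights (params * bytes/param) plus ~20% assumed overhead for
    activations, KV cache, and the CUDA context."""
    return params_billion * bytes_per_param * overhead

# 30B model quantized to 4-bit (~0.5 bytes per parameter):
print(round(vram_gb(30, 0.5), 1))  # ~18.0 GB -> fits on one 24 GB 3090
# The same model in fp16 (2 bytes per parameter):
print(round(vram_gb(30, 2.0), 1))  # ~72.0 GB -> needs multiple GPUs or offloading
```

This is why quantized (e.g. GPTQ 4-bit) versions of 30B models run on a single 3090, while fp16 weights alone would not fit.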
OMG! Two Nvidia 3090s cost around 4,22,000 INR, plus 256 GB RAM!
It seems like such a PC would cost around 20,00,000 INR. For that much RAM and VRAM I would also need a compatible motherboard and SSD. It's just far beyond my paying capacity.
Thanks for your kind replies.
You only need one 3090 to run it. I got my 3090 used on eBay for $700, that's about 58,000 INR. I have 32 GB RAM.
@GirishSharma My setup is for inferencing 70B and larger models, and for running fine-tuning pipelines for smaller models locally.