How much VRAM do you need (you = Anthracite)

#3
by AIGUYCONTENT - opened

I noticed in your description that it said you did not continue training for an additional epoch due to the cost.

Out of sheer curiosity and nothing else: what if you only had 168GB of VRAM at your disposal (all 3090s, from different manufacturers)? Would training a model like Magnum 123B be impractical due to the length of time it would take? Would that be enough for you to train a 70B model? And if so, do you guys take requests?

Anthracite org

Hi there, thanks for voicing your interest in providing compute! However, the model didn't fit into 8x H100 (640GB of VRAM) even with Liger kernels; we had to use MI300X nodes (1.5TB of VRAM) the last few times for anything 70B/123B. So it's not the time that would prevent us from taking this offer for these models, but the OOMs.
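For context on why these sizes OOM, here is a rough back-of-envelope sketch (my own assumption, not Anthracite's actual training setup): full fine-tuning with Adam in mixed precision typically costs on the order of 16 bytes per parameter (bf16 weights and gradients, plus fp32 optimizer moments and master weights), before counting activations. Real runs use tricks like Liger kernels, gradient checkpointing, sharding, or 8-bit optimizers that lower this, so treat these numbers as a ceiling, not a measurement.

```python
# Rough training-VRAM estimate: weights + gradients + Adam optimizer
# states only. Assumes bf16 weights/grads (2+2 bytes) plus fp32 Adam
# moments and fp32 master weights (8+4 bytes), i.e. ~16 bytes/param.
# Activations and framework overhead are NOT included, and memory-saving
# techniques (checkpointing, sharding, 8-bit optimizers) reduce the total.
def training_vram_gb(params_billions: float, bytes_per_param: int = 16) -> float:
    return params_billions * 1e9 * bytes_per_param / 1e9

for size in (70, 123):
    print(f"{size}B model: ~{training_vram_gb(size):.0f} GB before activations")
```

Under this crude estimate a 70B model already wants ~1120 GB and a 123B model ~1968 GB before activations, which makes it plausible that 640 GB (8x H100) is not enough and why 168 GB of 3090s would not get close for full fine-tuning of these sizes.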

lucyknada changed discussion status to closed

Well, thanks, and keep up the good work! Magnum 123B is my new favorite model. In fact, it caused me to cancel my Claude 3 subscription. Really appreciate you guys giving these great models away to the community. Thank you!!!

Anthracite org

Thanks for such positive feedback! I'm glad we were able to give you Claude at home!
