Is 32GB of RAM enough to fine-tune Llama3-TAIDE with additional small data?
#14 by JessyNTHUELEBC
We are currently trying to fine-tune on a machine with 32GB of RAM, but the system defaults already consume part of it, so we would like to know how much RAM is actually needed to run fine-tuning.
Hello,
Please refer to the LoRA fine-tuning method and Table 3:
Table 3. The GPU memory utilization is captured by varying the max. batch size parameter.
https://infohub.delltechnologies.com/en-us/p/llama-2-efficient-fine-tuning-using-low-rank-adaptation-lora-on-single-gpu/
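For reference, below is a minimal LoRA-style sketch of how the memory footprint can be reduced by loading the base model in 4-bit and training only small adapter weights, in the spirit of the linked article. It assumes the `transformers`, `peft`, and `bitsandbytes` libraries; the model id, dataset, and LoRA hyperparameters are placeholders you would adjust for your own setup.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Placeholder checkpoint id; substitute the TAIDE checkpoint you are actually using.
MODEL_ID = "taide/Llama3-TAIDE-LX-8B-Chat-Alpha1"

# Load the base weights in 4-bit (QLoRA-style) so they occupy far less GPU memory
# than a full-precision fine-tune would.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)

# Prepare the quantized model for training and attach LoRA adapters; only the
# adapter weights are updated, which is what keeps memory usage manageable.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # shows how few parameters are actually trained
```

As in the article's Table 3, the remaining memory use is then dominated by the batch size and sequence length, so lowering the max batch size (or using gradient accumulation) is the usual knob when memory is tight.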
Best Regards.