Continue Pretraining
#7 opened 4 months ago by HuggySSO
Embedding from transformers
#6 opened 5 months ago by tillwenke
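For readers landing on this thread, a minimal sketch of one common way to get sentence embeddings directly from the transformers library, assuming a BERT-style encoder checkpoint (the model name below is a placeholder, not necessarily this repository's model) and mean pooling over the last hidden state:

```python
# Minimal sketch: mean-pooled sentence embeddings via transformers.
# "sentence-transformers/all-MiniLM-L6-v2" is a placeholder checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "sentence-transformers/all-MiniLM-L6-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = ["How do I get embeddings?", "Continue pretraining a model."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean pooling over token positions, ignoring padding.
mask = inputs["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # (2, hidden_size)
```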
"[...] mixture of full fine-tuning and LoRA was used to provide better generalization."
#5 opened 5 months ago by bobox
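The quoted model-card statement does not specify the exact recipe; as a point of reference only, a minimal sketch of the LoRA half of such a setup using the peft library, where the checkpoint, target modules, and hyperparameters are assumptions rather than the authors' actual values:

```python
# Minimal sketch of a LoRA adapter setup with peft.
# Checkpoint, target_modules, and hyperparameters are illustrative assumptions.
from peft import LoraConfig, get_peft_model
from transformers import AutoModel

base = AutoModel.from_pretrained("bert-base-uncased")  # placeholder checkpoint
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query", "value"],  # assumed attention projections
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters are trainable
```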