
Training hyperparameters:

```python
warmup_steps = 5,
num_train_epochs = 3,
learning_rate = 5e-5,
optim = "adamw_torch",           # "adamw_8bit"
gradient_accumulation_steps = 4,
weight_decay = 0.03,             # L2 regularization
lr_scheduler_type = "linear",
```
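These keyword arguments match the signature of `transformers.TrainingArguments`. The sketch below shows how they might be wired into a training run; it is a reconstruction, not the original training script, and only the hyperparameter values above come from this card (the import and `output_dir` are assumptions).

```python
# Minimal sketch, assuming a standard Hugging Face Trainer setup.
# Only the hyperparameter values come from this card; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",             # placeholder, not from the card
    warmup_steps=5,
    num_train_epochs=3,
    learning_rate=5e-5,
    optim="adamw_torch",              # the card notes "adamw_8bit" as a commented-out alternative
    gradient_accumulation_steps=4,
    weight_decay=0.03,                # L2 regularization
    lr_scheduler_type="linear",
)
```

The commented-out `"adamw_8bit"` option refers to the 8-bit AdamW optimizer, a lower-memory alternative to `adamw_torch`.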

