baichuan-7b-lora-alpaca-cn / finetuning_args.json
{
  "finetuning_type": "lora",
  "lora_alpha": 32.0,
  "lora_dropout": 0.1,
  "lora_rank": 8,
  "lora_target": [
    "W_pack"
  ],
  "name_module_trainable": "mlp",
  "num_layer_trainable": 3
}
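
For context, below is a minimal sketch of how these arguments map onto a standard PEFT LoRA setup, assuming the Hugging Face transformers and peft libraries and the baichuan-inc/Baichuan-7B base checkpoint (the config itself does not name the base model). "W_pack" is Baichuan's fused query/key/value projection, so the adapters attach to the packed attention weights. The "name_module_trainable" and "num_layer_trainable" fields belong to the training framework's freeze-tuning mode and have no effect when "finetuning_type" is "lora".

# Sketch only: the base checkpoint name is an assumption, not part of this config.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained(
    "baichuan-inc/Baichuan-7B",  # assumed base model
    trust_remote_code=True,      # Baichuan ships custom modeling code
)

lora_config = LoraConfig(
    r=8,                        # lora_rank
    lora_alpha=32,              # lora_alpha
    lora_dropout=0.1,           # lora_dropout
    target_modules=["W_pack"],  # lora_target: Baichuan's fused QKV projection
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the low-rank adapter weights train

With lora_alpha=32 and lora_rank=8, the LoRA update is scaled by 32 / 8 = 4 before being added to the frozen W_pack weights.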