---
license: apache-2.0
library_name: transformers
base_model:
- nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated
datasets:
- nbeerbower/Schule-DPO
- nbeerbower/Arkhaios-DPO
- nbeerbower/Purpura-DPO
---

# mistral-nemo-kartoffel-12B
[Mahou-1.5-mistral-nemo-12B-lorablated](https://huggingface.co/nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated) finetuned on the Schule-DPO, Arkhaios-DPO, and Purpura-DPO preference datasets.
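
A minimal inference sketch with `transformers`; the repo id below and the generation settings are assumptions for illustration, not a documented recipe:

```python
# Minimal inference sketch -- the repo id and sampling settings are assumed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/mistral-nemo-kartoffel-12B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a haiku about potatoes."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```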
## Method
ORPO-tuned on 8x A100 GPUs for 2 epochs.
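
For reference, a hedged sketch of an ORPO run using TRL's `ORPOTrainer`. The hyperparameters, dataset choice, and batch sizing are illustrative assumptions under recent TRL versions, not the exact training configuration; only the base model, 2 epochs, and the 8x A100 setup come from the notes above.

```python
# Illustrative ORPO training sketch (TRL) -- hyperparameters below are assumptions.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

base = "nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated"
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype="bfloat16")
tokenizer = AutoTokenizer.from_pretrained(base)

# Preference data with "prompt", "chosen", "rejected" columns, as ORPOTrainer expects.
dataset = load_dataset("nbeerbower/Purpura-DPO", split="train")

config = ORPOConfig(
    output_dir="mistral-nemo-kartoffel-12B",
    num_train_epochs=2,             # matches the 2 epochs noted above
    per_device_train_batch_size=1,  # assumed; scaled across 8x A100 via accelerate/torchrun
    gradient_accumulation_steps=8,  # assumed
    learning_rate=5e-6,             # assumed
    beta=0.1,                       # ORPO odds-ratio weighting; assumed default
    max_length=2048,
    max_prompt_length=1024,
    bf16=True,
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=dataset,
    processing_class=tokenizer,  # `tokenizer=` in older TRL releases
)
trainer.train()
```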