---
license: apache-2.0
language:
- en
tags:
- mistral
- dpo
- biology
- education
---
|
This model is fine-tuned from Mistral-7B-Instruct-v0.2 with supervised fine-tuning (SFT). Its purpose is to serve as a more capable educational chatbot that helps students study biology.
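
Below is a minimal inference sketch using the Hugging Face Transformers library. The repository id is a hypothetical placeholder for this checkpoint, and the chat-template usage and generation settings are assumptions carried over from the Mistral-7B-Instruct-v0.2 base model rather than settings specified by this card.

```python
# Minimal inference sketch with Hugging Face Transformers.
# NOTE: "your-org/your-model-id" is a hypothetical placeholder; replace it with
# this model's actual Hub repository id. Generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model-id"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# The base model (Mistral-7B-Instruct-v0.2) expects chat-formatted prompts,
# so the student's question is wrapped in a chat turn.
messages = [
    {"role": "user", "content": "Can you explain how ATP is produced during cellular respiration?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```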
|
|
|
If you use this work, please cite:
|
```
@misc{sonkar2024pedagogical,
  title={Pedagogical Alignment of Large Language Models},
  author={Shashank Sonkar and Kangqi Ni and Sapana Chaudhary and Richard G. Baraniuk},
  year={2024},
  eprint={2402.05000},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2402.05000}
}
```
|