---
license: mit
language:
- en
base_model:
- microsoft/phi-4
pipeline_tag: text-generation
tags:
- unsloth
datasets:
- mlabonne/FineTome-100k
metrics:
- accuracy
new_version: microsoft/phi-4-gguf
library_name: transformers
---
My first Hugging Face model: the default Unsloth Phi-4 notebook template, fine-tuned with LoRA on the FineTome-100k dataset.

Training ran locally for around 2 hours, using roughly 16 GB of system RAM to hold the data and about 8 GB of GPU memory for the model during training. A minimal sketch of the setup follows below.
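
The card does not include the training script, so the following is a minimal sketch of what a default Unsloth + LoRA fine-tuning run looks like for this base model and dataset. The sequence length, LoRA rank, batch size, learning rate, and the `to_text` formatting helper are illustrative assumptions rather than the exact settings used for this run, and the `SFTTrainer` keyword arguments can differ slightly across `trl` versions.

```python
from unsloth import FastLanguageModel
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

# Load the Phi-4 base model in 4-bit so it fits in roughly 8 GB of GPU memory.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="microsoft/phi-4",  # base model from the card metadata
    max_seq_length=2048,           # assumed value; adjust to your data
    load_in_4bit=True,
)

# Attach LoRA adapters. Rank/alpha below are common Unsloth defaults,
# not confirmed settings for this particular run.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0,
    bias="none",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    use_gradient_checkpointing="unsloth",
)

# FineTome-100k stores ShareGPT-style conversations ("from"/"value" keys);
# render each one to plain text with the tokenizer's chat template.
dataset = load_dataset("mlabonne/FineTome-100k", split="train")
role_map = {"human": "user", "gpt": "assistant", "system": "system"}

def to_text(example):
    msgs = [{"role": role_map[m["from"]], "content": m["value"]}
            for m in example["conversations"]]
    return {"text": tokenizer.apply_chat_template(msgs, tokenize=False)}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        num_train_epochs=1,
        fp16=True,
        logging_steps=50,
        output_dir="outputs",
    ),
)
trainer.train()
```

Loading the base weights in 4-bit and enabling Unsloth's gradient checkpointing is what keeps the GPU memory footprint in the single-digit-GB range mentioned above, while most of the 16 GB of system RAM goes to holding and preprocessing the dataset.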