---
license: apache-2.0
base_model: apple/OpenELM-450M
library_name: peft
tags:
- lora
- openelm
- gsm8k
- math
- adapter
- transformers
- peft
---
# 🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on GSM8K
This is a LoRA adapter fine-tuned on the GSM8K math word-problem dataset on top of Apple's OpenELM-450M base model.
## Model Details
- Base model: apple/OpenELM-450M
- Adapter type: LoRA via PEFT (float32)
- Trained on: GSM8K (math word problems)
- Languages: English
- License: Apache 2.0
## How to Use
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# OpenELM does not ship its own tokenizer; Apple's reference usage pairs it
# with the Llama-2 tokenizer (a GPT-Neo tokenizer's vocabulary does not match).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
base_model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-450M", trust_remote_code=True
)

# Attach this LoRA adapter (replace the placeholder with this repo's id).
model = PeftModel.from_pretrained(base_model, "path/to/this-adapter")
```