🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on GSM8K

This is a LoRA adapter trained on the GSM8K dataset using Apple's OpenELM-450M base model.

Model Details

  • Base model: apple/OpenELM-450M
  • Adapter type: LoRA via PEFT (float32); a configuration sketch follows this list
  • Trained on: GSM8K (math word problems)
  • Languages: English
  • License: Apache 2.0
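
For reference, a PEFT configuration along the following lines could produce a LoRA adapter like this one. This is a hypothetical sketch: the rank, alpha, dropout, and target module names are assumptions, not settings documented for this repository.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative values only; none of these hyperparameters are documented on this card.
base = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["qkv_proj", "out_proj"],  # assumed OpenELM attention projection names
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()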

How to Use

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# OpenELM ships custom modeling code, so trust_remote_code=True is required.
base_model = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
# Attach the LoRA adapter weights from this repository to the base model.
model = PeftModel.from_pretrained(base_model, "faxnoprinter/OpenELM-450M-gsm8k-LoRA")
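
A minimal generation sketch follows. The question/answer prompt format is an assumption, since the card does not document the template used during fine-tuning; adjust it as needed.

# Prompt format is an assumption, not the documented training template.
prompt = "Question: A baker sells 24 muffins a day for 7 days. How many muffins does she sell in total?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))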
