# 🧸 OpenELM-450M LoRA Adapter — Fine-Tuned on Dolly

This is a LoRA adapter for Apple's OpenELM-450M base model, fine-tuned on the databricks-dolly-15k dataset.

## Model Details

  • Base model: apple/OpenELM-450M
  • Adapter type: LoRA via PEFT (float32)
  • Trained on: databricks-dolly-15k (question-answering)
  • Languages: English
  • License: MIT
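As a rough sanity check on why a LoRA adapter is small, note that a rank-r LoRA pair on a d_in × d_out linear layer adds r·(d_in + d_out) parameters instead of updating all d_in·d_out weights. A minimal sketch (the dimensions below are illustrative, not OpenELM-450M's actual layer shapes):

```python
# Extra parameters a rank-r LoRA pair (A: d_in x r, B: r x d_out) adds to one
# linear layer, versus a full-rank weight update of that layer.
def lora_params(d_in: int, d_out: int, r: int) -> int:
    return r * (d_in + d_out)

def full_params(d_in: int, d_out: int) -> int:
    return d_in * d_out

# Illustrative square layer at rank 8: the adapter is a tiny fraction
# of the full layer's parameter count.
print(lora_params(1536, 1536, 8))   # 24576
print(full_params(1536, 1536))      # 2359296
```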

## How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# OpenELM ships custom modeling code, so trust_remote_code is required.
base_model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-450M", trust_remote_code=True
)
# Attach the LoRA adapter on top of the base model.
model = PeftModel.from_pretrained(base_model, "faxnoprinter/OpenELM-450M-dolly-LoRA")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
```