# MCQ Generator with Mistral-7B + LoRA (Fine-tuned)

This model generates multiple-choice questions (MCQs) from academic-style paragraphs. It is fine-tuned with LoRA on top of `mistralai/Mistral-7B-Instruct-v0.1`, using a custom dataset of educational instructions and responses.
## Model Details

- **Base Model:** `mistralai/Mistral-7B-Instruct-v0.1`
- **Adapter Technique:** LoRA via `peft`
- **Quantization:** 8-bit (`bitsandbytes`)
- **Fine-tuned by:** Lingesh S
- **Use case:** EdTech, auto quiz generation, school AI tutors
- **Training Platform:** Google Colab (T4 GPU with CPU + disk offload)
## Example Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig, pipeline
from peft import PeftModel

base = "mistralai/Mistral-7B-Instruct-v0.1"
adapter = "Lingesh-S/mcq-mistral-lora"

tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token

# 8-bit quantization with FP32 CPU offload (fits a T4-class GPU with disk offload)
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    llm_int8_enable_fp32_cpu_offload=True,
)

model = AutoModelForCausalLM.from_pretrained(
    base,
    device_map="auto",
    quantization_config=bnb_config,
    offload_folder="./offload",
)
model = PeftModel.from_pretrained(model, adapter)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

prompt = """
# Instruction:
Generate a multiple choice question with 4 options and one correct answer based on the paragraph below.
Paragraph: The Lok Sabha is the House of the People in India. It is one of the two houses of Parliament.
# Response:
"""

output = pipe(prompt, max_new_tokens=150, do_sample=True, temperature=0.5)
print(output[0]["generated_text"])
```
## Sample Output

```text
What is the name of the House of the People in India?
a) The Rajya Sabha
b) The Lok Sabha
c) The Supreme Court
d) The President's House
Correct answer: b) The Lok Sabha
```
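For downstream use (e.g. a quiz app), the generated text can be parsed into a structured record. A minimal sketch, assuming the output follows the format shown above; the `parse_mcq` helper is illustrative and not part of this repo:

```python
import re

def parse_mcq(text: str) -> dict:
    """Split an MCQ in the sample-output format into question, options, and answer."""
    lines = [l.strip() for l in text.strip().splitlines() if l.strip()]
    question = lines[0]
    options = {}
    answer = None
    for line in lines[1:]:
        m = re.match(r"^([a-d])\)\s*(.+)$", line)  # option lines like "b) The Lok Sabha"
        if m:
            options[m.group(1)] = m.group(2)
        elif line.lower().startswith("correct answer:"):
            answer = line.split(":", 1)[1].strip()
    return {"question": question, "options": options, "answer": answer}

sample = """What is the name of the House of the People in India?
a) The Rajya Sabha
b) The Lok Sabha
c) The Supreme Court
d) The President's House
Correct answer: b) The Lok Sabha"""

print(parse_mcq(sample))
```

Since the model may hallucinate or repeat options, checking that exactly four distinct options and one answer were parsed is a cheap validity filter.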
## Training Details

- **Dataset:** Custom JSONL of 500 examples (Paragraph → MCQ)
- **Epochs:** 3
- **Batch size:** 1
- **Final loss:** ~0.23
- **Adapter size:** 13.6 MB
- **LoRA Config:** `r=8`, `lora_alpha=32`, `dropout=0.05`, `target_modules=['q_proj', 'v_proj']`
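As a sketch, the hyperparameters above map to a `peft` `LoraConfig` like the following; `bias` and `task_type` are assumptions, since the card does not state them:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=8,                                   # LoRA rank, as listed above
    lora_alpha=32,                         # scaling factor
    lora_dropout=0.05,                     # dropout on LoRA layers
    target_modules=["q_proj", "v_proj"],   # attention projections adapted
    bias="none",                           # assumption: not stated in the card
    task_type="CAUSAL_LM",                 # assumption: causal-LM fine-tuning
)
```

Adapting only `q_proj`/`v_proj` at rank 8 keeps the adapter small (13.6 MB here), at the cost of less capacity than adapting all projection layers.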
## Model Sources

- **Base:** `mistralai/Mistral-7B-Instruct-v0.1`
- **Adapter:** This repo (`Lingesh-S/mcq-mistral-lora`)
- **Dataset:** Not published (private custom dataset)
## Intended Use

| Use Case | Status |
|---|---|
| MCQ generation for education | ✅ Intended |
| Chat-style assistants | ✅ Possible |
| Factual question generation | ⚠️ Needs review |
| Medical/legal MCQs | ❌ Not recommended |
## Limitations & Biases

This model:

- Was trained on a small dataset (~500 samples)
- May hallucinate or repeat options
- Should not be used for high-stakes testing without human review
## Contact

Model developed and shared by Lingesh S. Contact via Hugging Face or LinkedIn.
## Citation

```bibtex
@misc{lingesh2024mcq,
  title={MCQ Generator Fine-Tuned on Mistral-7B via LoRA},
  author={Lingesh S},
  year={2024},
  url={https://huggingface.co/Lingesh-S/mcq-mistral-lora}
}
```