# historical-gpt2-finetuned

## Model Summary

A custom fine-tuned version of GPT-2 (`gpt2`), trained on 500 conversational question-answer pairs focused on the Two-Nation Theory, the Pakistan Movement, and key historical figures such as Jinnah and Iqbal. The model generates historically aligned answers in a chatbot-style format, simulating a Q&A assistant.

## Model Details
| Property | Value |
|---|---|
| Base Model | GPT-2 (`gpt2`) |
| Fine-tuned By | Muhammad Yasir (DevSecure) |
| Language | English (conversational, formal tone) |
| Dataset Size | 500 examples (JSONL format) |
| Use Case | Conversational history tutor, education, research |
| Training Loss | 0.1233 (after 3 epochs) |
| Trained On | Custom `.jsonl` file of Two-Nation Theory Q&A |
| License | MIT |
## Dataset Description
The dataset simulates a dialogue between:
- Yasir (asking questions)
- AI (providing accurate, historical answers)
Example:

```json
{"text": "Yasir: What was the Two-Nation Theory?\nAI: The Two-Nation Theory stated that Muslims and Hindus were two distinct nations with their own customs, religion, and traditions, and therefore Muslims should have a separate homeland.\n"}
```
Source Materials Used for Data Curation:
- Wikipedia: Two-Nation Theory
- Books: Jinnah of Pakistan, India Wins Freedom, Iqbal's Speeches
- Academic journals and historical PDFs
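Each line of the file is a standalone JSON object with a single `text` field, so it can be loaded directly with the `datasets` library. A minimal sketch, assuming a placeholder file name (`two_nation_qa.jsonl` is not the actual file shipped with this card):

```python
from datasets import load_dataset

# Load the JSONL file; every record exposes one "text" field in the
# "Yasir: ... \nAI: ..." dialogue format shown above.
dataset = load_dataset("json", data_files="two_nation_qa.jsonl", split="train")
print(dataset[0]["text"])
```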
## Intended Use
### Direct Use
- Chatbot-style app for history Q&A
- Teaching tool for Pakistan studies
- Historical assistant for students and researchers
- Offline history tutor for smart learning platforms
### Out-of-Scope
- Use in political argumentation or manipulation
- Interpretation of modern geopolitical conflicts
- Multilingual outputs (no Urdu in current version)
## Risks, Biases, and Limitations
- Model reflects curated historical narratives from available sources; potential for bias in interpretation.
- May hallucinate or extrapolate when asked questions far beyond the dataset scope.
- Not intended for factual legal or political conclusions.
## How to Use (Code Snippet)

```python
from transformers import pipeline

# Load the fine-tuned model as a text-generation pipeline
generator = pipeline("text-generation", model="jamyasir/historical-gpt2-finetuned")

# Prompts should follow the "Yasir: ...\nAI:" format used in the training data
prompt = "Yasir: What was Gandhi's opinion on the Two-Nation Theory?\nAI:"
response = generator(prompt, max_length=100, do_sample=True, temperature=0.7)[0]["generated_text"]
print(response)
```
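Sampling is enabled (`do_sample=True`, `temperature=0.7`), so answers vary between runs; lowering the temperature or setting `do_sample=False` gives more deterministic, greedy-style outputs.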
## Training & Hyperparameters

| Config | Value |
|---|---|
| Epochs | 3 |
| Optimizer | AdamW (default) |
| Max Length | 128 tokens |
| Tokenizer | GPT-2 (`pad_token = eos_token`) |
| Learning Rate | 5e-5 |
| Framework | Hugging Face Trainer |
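For reference, a minimal sketch of a fine-tuning run with these settings using the Hugging Face Trainer. The data file name, output directory, and batch size are assumptions for illustration, not values taken from the original training script.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = load_dataset("json", data_files="two_nation_qa.jsonl", split="train")

def tokenize(batch):
    # Truncate/pad each Q&A example to the 128-token maximum from the table
    return tokenizer(batch["text"], truncation=True, max_length=128, padding="max_length")

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="historical-gpt2-finetuned",
    num_train_epochs=3,
    learning_rate=5e-5,
    per_device_train_batch_size=4,  # assumption: batch size is not listed in the table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    # mlm=False gives the causal language modeling objective (labels = inputs)
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```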
## Evaluation Examples

### Seen Prompt

Prompt:
`Yasir: Who introduced the Two-Nation Theory?\nAI:`

Response:
The concept was popularized by Allama Iqbal in 1930 and later advocated by Muhammad Ali Jinnah.
### Unseen Prompt

Prompt:
`Yasir: Why did Jinnah believe Muslims needed a separate homeland?\nAI:`

Response:
Jinnah believed that due to religious and cultural differences, Muslims needed an independent state to preserve their identity and rights.
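Both prompts can be reproduced with the `generator` pipeline from the usage snippet above; because sampling is enabled, the exact wording of each response will vary from run to run. A minimal sketch:

```python
prompts = [
    "Yasir: Who introduced the Two-Nation Theory?\nAI:",
    "Yasir: Why did Jinnah believe Muslims needed a separate homeland?\nAI:",
]
for prompt in prompts:
    # Same generation settings as the usage example above
    out = generator(prompt, max_length=100, do_sample=True, temperature=0.7)
    print(out[0]["generated_text"])
    print("---")
```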
## Environmental Impact
- Hardware: Local GPU (RTX 3060) / CPU fallback
- Training Time: ~5 minutes
- Carbon Emissions: Negligible for 500-sample fine-tuning
## Architecture & Objective

- Base: GPT-2, decoder-only transformer
- Objective: Causal Language Modeling (CLM)
- Modifications: none; no architectural changes, fine-tuning only
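For illustration, a minimal sketch of the CLM objective, assuming only the base `gpt2` checkpoint: passing the input ids as `labels` makes the model compute a shifted next-token cross-entropy loss, which is what fine-tuning minimizes over the Q&A text.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "Yasir: What was the Two-Nation Theory?\nAI:"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # The model shifts the labels internally, so the loss is next-token cross-entropy
    outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)
```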
## Citation

```bibtex
@misc{yasir2025twonationgpt2,
  author       = {Muhammad Yasir},
  title        = {GPT-2 Fine-Tuned on Two-Nation Theory Debates},
  howpublished = {\url{https://huggingface.co/jamyasir/historical-gpt2-finetuned}},
  year         = {2025}
}
```
## Author

Muhammad Yasir, AI Engineer | Full Stack Developer
Lodhran, Pakistan
Portfolio | [email protected] | Hugging Face Profile