---
license: apache-2.0
language:
- ar
metrics:
- accuracy
base_model:
- QCRI/Fanar-1-9B
pipeline_tag: question-answering
---

# Fanar-1-9B-Islamic-Inheritance-Reasoning

This repository contains the model Fanar-1-9B-Islamic-Inheritance-Reasoning, developed by QU-NLP for the QIAS 2025 Shared Task (SubTask 1: Islamic Inheritance Reasoning).
The model is fine-tuned with LoRA and integrated into a Retrieval-Augmented Generation (RAG) pipeline, enabling mid-scale Arabic LLMs to perform domain-specific reasoning in Islamic inheritance law (ʿIlm al-Farāʾiḍ).


## 📖 Description

Islamic inheritance law involves complex reasoning over:

  • Identifying eligible heirs
  • Applying Qurʾanic fixed-share rules
  • Handling multiple inheritance scenarios
  • Performing fractional and numerical calculations
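The fractional bookkeeping behind the last point can be sketched with Python's `fractions` module. The snippet below illustrates one classic rule, ʿawl (proportional reduction when the fixed shares oversubscribe the estate); the function name and the example case are ours, for illustration only, and are not part of the released pipeline:

```python
from fractions import Fraction

def normalize_shares(shares):
    """If the fixed Qur'anic shares sum to more than 1 (the 'awl case),
    scale every share down proportionally so they sum to exactly 1."""
    total = sum(shares.values())
    if total > 1:
        return {heir: s / total for heir, s in shares.items()}
    return shares

# Classic 'awl example: husband 1/2 + two full sisters 2/3 = 7/6 > 1
shares = {"husband": Fraction(1, 2), "two_sisters": Fraction(2, 3)}
adjusted = normalize_shares(shares)
# husband -> 3/7, two_sisters -> 4/7
```

Working in exact fractions rather than floats matters here, since correctness is judged on exact shares.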

Our approach fine-tuned Fanar-1-9B with LoRA adapters and integrated it into a RAG pipeline to ground responses in authoritative Islamic sources.
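The grounding idea can be illustrated with a toy retriever: rank candidate passages by word overlap with the question and prepend the top hits to the prompt. This is only a stand-in sketch; the passage texts, function name, and scoring are illustrative, not the retriever actually used in the pipeline:

```python
def retrieve(query, passages, k=2):
    """Rank passages by word overlap with the query -- a minimal
    stand-in for the retriever in a real RAG pipeline."""
    q = set(query.split())
    scored = sorted(passages, key=lambda p: len(q & set(p.split())), reverse=True)
    return scored[:k]

# Illustrative knowledge snippets (English glosses, not real corpus entries)
passages = [
    "the daughter receives half when she is the only child",
    "two or more daughters share two thirds of the estate",
    "the mother receives one sixth when the deceased has children",
]
context = retrieve("share of two daughters", passages, k=1)
prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: ..."
```

In the actual system the retrieved context comes from authoritative Islamic sources and is inserted ahead of the question before generation.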

Results at QIAS 2025:

  • Overall Accuracy: 85.8%
  • Advanced Reasoning: 97.6% (outperforming Gemini 2.5 and OpenAI o3)
  • Surpassed zero-shot prompting of GPT-4.5, LLaMA, Mistral, and ALLaM.

## 🧩 Example

السؤال: مات وترك: ابن ابن عم شقيق و بنت (5) و أم الأم و ابن عم الأب، كم عدد الأسهم التي تحصل عليها بنت (5) قبل تصحيح المسألة؟

(Question: A man died leaving: the son of a full paternal cousin, a daughter (5), the mother's mother, and the father's paternal cousin. How many shares does the daughter (5) receive before taṣḥīḥ (correction) of the case?)

الخيارات (Options):
A) سهمان (two shares)
B) 0 سهم (0 shares)
C) 6 أسهم (6 shares)
D) 5 أسهم (5 shares)
E) 4 أسهم (4 shares)
F) 3 أسهم (3 shares)

Model Output: E

بنت (5) تأخذ أربعة أسهم قبل التصحيح.
(The daughter (5) receives four shares before taṣḥīḥ.)



## 📊 Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("QU-NLP/Fanar-1-9B-Islamic-Inheritance-Reasoning")
model = AutoModelForCausalLM.from_pretrained("QU-NLP/Fanar-1-9B-Islamic-Inheritance-Reasoning")

question = "مات وترك: ابن ابن عم شقيق و بنت (5) و أم الأم و ابن عم الأب..."
options = ["سهمان", "0 سهم", "6 أسهم", "5 أسهم", "4 أسهم", "3 أسهم"]

# Build the multiple-choice prompt; in the full pipeline, RAG-retrieved
# context is prepended ahead of the question.
choices = "\n".join(f"{chr(65 + i)}) {opt}" for i, opt in enumerate(options))
prompt = f"السؤال: {question}\n\nالخيارات:\n{choices}\n\nاختر الحرف الصحيح:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
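Since scoring on the multiple-choice task only needs the chosen option letter, the generated text typically requires a small post-processing step. A minimal sketch of such a step; the function name and regex are our assumption, not part of the released pipeline:

```python
import re

def extract_choice(text, letters="ABCDEF"):
    """Pull the first standalone option letter (A-F) out of the
    model's free-form answer text."""
    m = re.search(rf"\b([{letters}])\b", text)
    return m.group(1) if m else None

extract_choice("الإجابة الصحيحة هي E لأن ...")  # → "E"
```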

---

## Citation

If you use this model in your research, please cite the following paper:

```bibtex
@inproceedings{QU-NLP-QIAS2025,
  author    = {Mohammad AL-Smadi},
  title     = {QU-NLP at QIAS 2025 Shared Task: A Two-Phase LLM Fine-Tuning and Retrieval-Augmented Generation Approach for Islamic Inheritance Reasoning},
  booktitle = {Proceedings of The Third Arabic Natural Language Processing Conference (ArabicNLP 2025)},
  year      = {2025},
  publisher = {Association for Computational Linguistics},
  note      = {Suzhou, China, Nov 5--9},
  url       = {https://arabicnlp2025.sigarab.org/}
}
```