|
---
language: en
tags:
- medical
- llama
- unsloth
- qlora
- finetuned
- chatbot
license: apache-2.0
datasets:
- custom-medical-qa
base_model: ContactDoctor/Bio-Medical-Llama-3-8B
model_creator: khalednabawi11
library_name: transformers
pipeline_tag: text-generation
---
|
|
|
|
|
# Bio-Medical LLaMA 3 8B - Fine-Tuned
|
|
|
**Fine-tuned version of [ContactDoctor/Bio-Medical-Llama-3-8B](https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B) using Unsloth for enhanced medical Q&A capabilities.**
|
|
|
## Model Details
|
|
|
- **Model Name:** Bio-Medical LLaMA 3 8B - Fine-Tuned
- **Base Model:** ContactDoctor/Bio-Medical-Llama-3-8B
- **Fine-Tuning Method:** QLoRA with Unsloth
- **Domain:** Medical Question Answering
- **Dataset:** Custom medical Q&A dataset (`MQA.json`)
|
|
|
## Training Configuration
|
|
|
- **Epochs:** 4
- **Batch Size:** 2
- **Gradient Accumulation Steps:** 4 (effective batch size of 8 per device)
- **Learning Rate:** 2e-4
- **Optimizer:** AdamW (8-bit)
- **Weight Decay:** 0.01
- **Warmup Steps:** 50
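
A minimal sketch of how these hyperparameters could be expressed as Hugging Face `TrainingArguments`; the original training script is not published, so the output directory is a placeholder and the exact trainer setup may differ:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above;
# treat this as illustrative, not the authors' actual script.
training_args = TrainingArguments(
    output_dir="outputs",           # placeholder path
    num_train_epochs=4,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=4,  # effective batch size 8 per device
    learning_rate=2e-4,
    optim="adamw_8bit",             # 8-bit AdamW via bitsandbytes
    weight_decay=0.01,
    warmup_steps=50,
    seed=3407,
)
```

In Unsloth's published examples, arguments like these are typically passed to `trl`'s `SFTTrainer` for supervised fine-tuning.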
|
|
|
## LoRA Parameters
|
|
|
- **LoRA Rank (r):** 16
- **LoRA Alpha:** 16
- **LoRA Dropout:** 0
- **Bias:** None
- **Target Layers:**
  - `q_proj`
  - `k_proj`
  - `v_proj`
  - `o_proj`
  - `gate_proj`
  - `up_proj`
  - `down_proj`
- **Gradient Checkpointing:** Enabled (Unsloth)
- **Random Seed:** 3407
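
A sketch of how these parameters map onto Unsloth's `FastLanguageModel.get_peft_model` API; the `max_seq_length` value is an assumption, since it is not stated on this card:

```python
from unsloth import FastLanguageModel

# Load the base model in 4-bit (QLoRA) and attach LoRA adapters with the
# parameters listed above. max_seq_length is assumed, not taken from this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="ContactDoctor/Bio-Medical-Llama-3-8B",
    max_seq_length=2048,  # assumption
    load_in_4bit=True,
)

model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0,
    bias="none",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
    use_gradient_checkpointing="unsloth",  # Unsloth's memory-saving checkpointing
    random_state=3407,
)
```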
|
|
|
## Model Capabilities
|
|
|
- Optimized for **low-memory inference** (see the 4-bit loading sketch below)
- Supports **long medical queries**
- **Parameter-efficient fine-tuning** via LoRA adapters
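
For the low-memory path, the model can be loaded in 4-bit with `bitsandbytes`; a sketch, where the repository id is a placeholder for this model's actual Hub path:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit NF4 quantization for low-memory inference (requires bitsandbytes).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    "khalednabawi11/Bio-Medical-Llama-3-8B-Finetuned",  # placeholder repo id
    quantization_config=bnb_config,
    device_map="auto",
)
```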
|
|
|
## Usage
|
|
|
This model is suitable for **medical question answering**, **clinical chatbot applications**, and **biomedical research assistance**.
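
A minimal inference sketch using the Transformers chat template; the repository id and the sample question are placeholders:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "khalednabawi11/Bio-Medical-Llama-3-8B-Finetuned"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "What are common symptoms of iron-deficiency anemia?"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```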
|
|
|
## References
|
|
|
- [Unsloth Documentation](https://github.com/unslothai/unsloth)
- [Hugging Face Transformers](https://huggingface.co/docs/transformers/index)
|
|
|
---
|
**Contributions & Feedback**: Open to collaboration! Feel free to reach out.
|
|