---
library_name: transformers
tags:
- medical
license: apache-2.0
language:
- fr
- en
base_model:
- mistralai/Mistral-7B-Instruct-v0.1
---
### Model Description
MedMistralInstruct-CPT-7B is adapted from Mistral-7B-Instruct-v0.1 through Continual Pre-Training (CPT) on French medical texts, retaining the base model's instruction-following capabilities while gaining medical domain knowledge.
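A minimal inference sketch with the `transformers` library is shown below. The repository id is a placeholder assumption (this card does not state the published path), and the generation settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/MedMistralInstruct-CPT-7B"  # hypothetical repo id; replace with the actual path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Mistral-7B-Instruct checkpoints ship a chat template ([INST] ... [/INST]),
# which apply_chat_template formats automatically.
messages = [{"role": "user", "content": "Quels sont les symptômes de l'hypertension artérielle ?"}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```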
### Model Details
- **Model Type**: Causal Language Model
- **Base Model**: Mistral-7B-Instruct-v0.1
- **Language**: French (primary), English
- **Domain**: Medical/Healthcare
- **Parameters**: 7 billion
- **License**: Apache 2.0
### Training Details
**Continual Pre-Training (CPT)**
- **Dataset**: NACHOS corpus (7.4 GB French medical texts)
- **Training Duration**: 2.8 epochs
- **Hardware**: 32 NVIDIA A100 80GB GPUs
- **Training Time**: ~40 hours
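
The exact training recipe is not published in this card; the sketch below shows how such a continual pre-training run could be set up with the `transformers` Trainer. The corpus file name, sequence length, and hyperparameters are illustrative assumptions, not the reported configuration:

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "mistralai/Mistral-7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Mistral has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# Plain-text medical corpus, tokenized into fixed-length sequences.
corpus = load_dataset("text", data_files={"train": "nachos_fr_medical.txt"})  # hypothetical file
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)
train = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="medmistral-cpt",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,       # assumed; not stated in the card
    num_train_epochs=2.8,     # matches the reported 2.8 epochs
    bf16=True,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train,
    # mlm=False gives standard causal-LM (next-token) objectives
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```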
### Computational Requirements
- **Carbon Emissions**: 32.89 kgCO2e
- **Compute**: ~1,280 A100 GPU-hours (32 GPUs × ~40 hours of training)
### Ethical Considerations
- **Medical Accuracy**: Intended for research and educational purposes only, not for clinical decision-making
- **Professional Oversight**: Requires verification by qualified medical professionals
- **Bias Awareness**: May contain biases from training data
- **Privacy**: Do not input private health information
### Citation
```bibtex
```
### Contact
For questions about these models, please contact: [email protected]