---
license: apache-2.0
datasets:
  - TypicaAI/MedicalNER_Fr
language:
  - fr
metrics:
  - f1
base_model:
  - almanach/camembert-bio-base
library_name: transformers
tags:
  - educational
  - healthcare-ner
  - french-ner
  - nlp-book
  - medical
extra_gated_prompt: "You agree not to use the model for medical decisions or for production use."
extra_gated_fields:
  I want to use this model for:
    type: select
    options:
      - Education
      - Research
      - label: Other
        value: other
  I agree to use this model for non-commercial use ONLY: checkbox
---

# French Healthcare NER Model (Educational Version)

This French Healthcare NER model is part of the healthcare NLP case study featured in the book *[Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face](https://a.co/d/h0xL4lo)*. Dive into Chapter 6 for a comprehensive, step-by-step guide on building this model.

## 📚 Purpose and Scope

This model is designed to complement Chapter 6 of the book, allowing readers to:

- **Explore the Model**: Experiment with the healthcare NLP model built in the book without needing to train one from scratch (see the illustrative quick-start snippet at the end of this card).
- **Recreate the Case Study**: Follow along with the step-by-step implementation detailed in Chapter 6.
- **Understand Key Concepts**: Learn how to fine-tune and apply a healthcare NER model to French-language data.

This pre-built model simplifies the learning process and enables hands-on practice directly aligned with the book's content.

## ⚠️ Usage Restrictions

This is a demo model provided for educational purposes. It was trained on a limited dataset and is not intended for production use, clinical decision-making, or real-world medical applications.

- Educational and research purposes only
- Not licensed for commercial deployment
- Not for production use
- Not for medical decisions

## 🎓 Book Reference

This model is built as described in Chapter 6 of the book *Natural Language Processing on Oracle Cloud Infrastructure*. The book covers the entire NLP solution lifecycle, including data preparation, model fine-tuning, deployment, and monitoring. Chapter 6 specifically focuses on:

- Fine-tuning a pretrained model from the Hugging Face Hub for healthcare Named Entity Recognition (NER)
- Training the model using OCI's Data Science service and the Hugging Face Transformers library
- Performance evaluation and best practices for robust and cost-effective NLP models

For more details, you can explore the book and Chapter 6 on the following platforms:

- **Full Book on Springer**: [View Here](https://link.springer.com/book/10.1007/979-8-8688-1073-2)
- **Chapter 6 on Springer**: [Read Chapter 6](https://link.springer.com/chapter/10.1007/979-8-8688-1073-2_6)
- **Amazon**: [Learn More](https://a.co/d/3jDIQki)

## Citation

If you use this model, please cite the following:

```bibtex
@Inbook{Assoudi2024,
  author="Assoudi, Hicham",
  title="Model Fine-Tuning",
  bookTitle="Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face",
  year="2024",
  publisher="Apress",
  address="Berkeley, CA",
  pages="249--319",
  abstract="This chapter focuses on the process of fine-tuning a pretrained model for healthcare Named Entity Recognition (NER). This chapter provides an in-depth exploration of training the healthcare NER model using OCI's Data Science platform and Hugging Face tools.
  It covers the fine-tuning process, performance evaluation, and best practices that contribute to creating robust and cost-effective NLP models.",
  isbn="979-8-8688-1073-2",
  doi="10.1007/979-8-8688-1073-2_6",
  url="https://doi.org/10.1007/979-8-8688-1073-2_6"
}
```

## 📞 Connect and Contact

Stay updated on my latest models and projects:

👉 **[Follow me on Hugging Face](https://huggingface.co/hassoudi)**

For inquiries or professional communication, feel free to reach out:

📧 **Email**: [assoudi@typica.ai](mailto:assoudi@typica.ai)
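
## 🚀 Quick Start (Illustrative)

The sketch below shows one way to try the model, assuming it can be loaded with the standard Hugging Face `transformers` token-classification pipeline. The `model_id` placeholder and the example sentence are illustrative only: replace `model_id` with this repository's identifier on the Hub, and note that the printed entity labels depend on the label set used during fine-tuning.

```python
from transformers import pipeline

# Placeholder: set this to the Hub identifier of this model repository.
model_id = "<this-model-repo-id>"

# aggregation_strategy="simple" merges sub-word tokens into whole entity spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

# Illustrative French clinical-style sentence (not taken from the training data).
text = "Le patient présente une hypertension artérielle et prend du paracétamol depuis trois jours."

for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Remember that this model is for educational exploration only; see the usage restrictions above.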