Model Card for Wothmag07/NER-fine-tuned-model
Fine-tuned BERT for Named Entity Recognition (NER)
Model Details
Model Description
This model is a fine-tuned version of google-bert/bert-base-uncased for Named Entity Recognition (NER) using the CoNLL-2003 dataset. The model classifies tokens into named entity categories such as persons, locations, organizations, and miscellaneous entities.
- Developed by: Gowtham Arulmozhi
- Language(s) (NLP): English (en)
- License: MIT
- Fine-tuned from model: google-bert/bert-base-uncased
How to Get Started with the Model
Run this snippet to use the model with 🤗 Transformers:
from transformers import pipeline

# Load the fine-tuned NER model from the Hugging Face Hub
ner_pipeline = pipeline("ner", model="Wothmag07/NER-fine-tuned-model")

text = "Barack Obama was born in Hawaii."
results = ner_pipeline(text)
print(results)  # per-token predictions with tags such as B-PER, I-PER, B-LOC
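By default the pipeline returns one prediction per sub-token rather than per entity; passing `aggregation_strategy="simple"` to `pipeline()` merges them for you. As a sketch of what that merging does, here is a minimal helper (a name of our own, not from the model repo) that groups consecutive B-/I- tags into spans; the sample predictions are illustrative, not actual model output:

```python
def group_entities(tokens):
    """Merge consecutive B-/I- token predictions into entity spans."""
    spans = []
    for tok in tokens:
        prefix, _, etype = tok["entity"].partition("-")  # e.g. "B-PER" -> ("B", "PER")
        if prefix == "B" or not spans or spans[-1]["type"] != etype:
            spans.append({"type": etype, "words": [tok["word"]]})
        else:
            spans[-1]["words"].append(tok["word"])
    return [{"type": s["type"], "text": " ".join(s["words"])} for s in spans]

# Illustrative per-token dicts shaped like the pipeline's output
sample = [
    {"entity": "B-PER", "word": "Barack"},
    {"entity": "I-PER", "word": "Obama"},
    {"entity": "B-LOC", "word": "Hawaii"},
]
print(group_entities(sample))
# -> [{'type': 'PER', 'text': 'Barack Obama'}, {'type': 'LOC', 'text': 'Hawaii'}]
```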
Training Details
Training Data
The model was trained using the CoNLL-2003 dataset, which contains news articles annotated for named entities.
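CoNLL-2003 is annotated at the word level, while BERT tokenizes into WordPiece sub-tokens, so fine-tuning requires aligning the word-level tags to sub-tokens. The usual approach (which we assume here; the helper name is ours, not from the model repo) labels only each word's first sub-token and masks the rest with -100 so the loss ignores them:

```python
def align_labels(word_ids, word_labels, ignore_index=-100):
    """Map word-level NER label ids onto sub-token positions.

    word_ids: per-token word index, as returned by a fast tokenizer's
              .word_ids(), with None for special tokens like [CLS]/[SEP].
    word_labels: one label id per original word.
    """
    aligned, prev = [], None
    for wid in word_ids:
        if wid is None:                  # special token -> ignored by loss
            aligned.append(ignore_index)
        elif wid != prev:                # first sub-token of a word -> keep label
            aligned.append(word_labels[wid])
        else:                            # continuation sub-token -> ignored
            aligned.append(ignore_index)
        prev = wid
    return aligned

# Two words, the second split into two sub-tokens; [CLS]/[SEP] map to None
print(align_labels([None, 0, 1, 1, None], [1, 2]))
# -> [-100, 1, 2, -100, -100]
```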
Training Hyperparameters
- Batch size: 16
- Learning rate: 2e-5
- Epochs: 20
- Optimizer: Adam
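These hyperparameters determine the optimizer step count. As a rough sketch, assuming the standard CoNLL-2003 train split of 14,041 sentences (not stated in this card):

```python
import math

train_sentences = 14_041  # assumed: standard CoNLL-2003 train split size
batch_size = 16
epochs = 20

steps_per_epoch = math.ceil(train_sentences / batch_size)
total_steps = steps_per_epoch * epochs
print(steps_per_epoch, total_steps)  # -> 878 17560
```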
Evaluation
Evaluated on the CoNLL-2003 test set:
- Accuracy: 0.987084
- F1-score: 0.947727
- Precision: 0.945248
- Recall: 0.950218
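As a consistency check on the table above, the F1-score is the harmonic mean of precision and recall, and the reported figures agree up to rounding of the inputs:

```python
precision = 0.945248
recall = 0.950218

# F1 = 2PR / (P + R), the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(f1)  # matches the reported F1-score of 0.947727 to ~5 decimal places
```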