DistilBERT (uncased) for FakeNews Classification

This is a text classification model built by fine-tuning the DistilBERT base (uncased) model. It was trained on the fake-and-real-news-dataset for five epochs.

NOTE: This model is just a POC (proof-of-concept) for a fellowship I was applying for.
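
The original training script is not included in this card; the sketch below shows how such a fine-tuning run could look with the Hugging Face Trainer API. The CSV filename, column names, and batch size are assumptions for illustration, not the author's actual setup.

```python
import pandas as pd
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load the fake-and-real-news-dataset; "news.csv" with "text" and "label"
# columns is an assumption -- the original CSV layout may differ.
df = pd.read_csv("news.csv")
dataset = Dataset.from_pandas(df).train_test_split(test_size=0.2)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # binary: Fake vs. Real
)

def tokenize(batch):
    # Truncate long articles to DistilBERT's maximum sequence length
    return tokenizer(batch["text"], truncation=True, padding="max_length")

dataset = dataset.map(tokenize, batched=True)

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-for-fakenews",
    num_train_epochs=5,               # the card states five epochs
    per_device_train_batch_size=16,   # assumed batch size
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
)
trainer.train()
```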

Intended uses & limitations

This model is intended to classify a news article as either "Fake" or "Real".

How to use

Check this notebook on Kaggle.
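
The notebook is the authoritative reference; as a quick alternative, the model can also be loaded by its Hub id with the transformers pipeline API, as in the sketch below. The example input is made up, and the exact label names returned depend on how the label mapping was saved with the model.

```python
from transformers import pipeline

# Load the fine-tuned model directly from the Hugging Face Hub
classifier = pipeline(
    "text-classification",
    model="anwarvic/distilbert-base-uncased-for-fakenews",
)

# Hypothetical example article text
print(classifier("Breaking: scientists confirm the moon is made of cheese."))
# The returned label may appear as "Fake"/"Real" or as generic
# "LABEL_0"/"LABEL_1", depending on the saved label mapping.
```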

