Model description

This is a multinomial naive Bayes model trained on the 20 newsgroups dataset. The classifier is the final step of a scikit-learn pipeline that first applies a CountVectorizer and then a TfidfTransformer to the raw text.
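For context, here is a minimal sketch of how a pipeline like this could be built and trained, assuming scikit-learn's standard fetch_20newsgroups loader; it is an illustration, not the exact training script.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

# Load the training split of the 20 newsgroups dataset.
train = fetch_20newsgroups(subset="train")

# Raw text -> token counts -> TF-IDF weights -> multinomial naive Bayes.
pipe = Pipeline(steps=[
    ("vect", CountVectorizer()),
    ("tfidf", TfidfTransformer()),
    ("clf", MultinomialNB()),
])
pipe.fit(train.data, train.target)
```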

Intended uses & limitations

This model is not ready to be used in production.

Training Procedure

Hyperparameters

The model is trained with the hyperparameters below.

| Hyperparameter | Value |
|---|---|
| memory | None |
| steps | [('vect', CountVectorizer()), ('tfidf', TfidfTransformer()), ('clf', MultinomialNB())] |
| verbose | False |
| vect | CountVectorizer() |
| tfidf | TfidfTransformer() |
| clf | MultinomialNB() |
| vect__analyzer | word |
| vect__binary | False |
| vect__decode_error | strict |
| vect__dtype | <class 'numpy.int64'> |
| vect__encoding | utf-8 |
| vect__input | content |
| vect__lowercase | True |
| vect__max_df | 1.0 |
| vect__max_features | None |
| vect__min_df | 1 |
| vect__ngram_range | (1, 1) |
| vect__preprocessor | None |
| vect__stop_words | None |
| vect__strip_accents | None |
| vect__token_pattern | (?u)\b\w\w+\b |
| vect__tokenizer | None |
| vect__vocabulary | None |
| tfidf__norm | l2 |
| tfidf__smooth_idf | True |
| tfidf__sublinear_tf | False |
| tfidf__use_idf | True |
| clf__alpha | 1.0 |
| clf__class_prior | None |
| clf__fit_prior | True |
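The double-underscore keys above follow scikit-learn's nested-parameter convention: vect__max_df is the max_df parameter of the vect step. A short sketch, reusing the pipe object from the example above:

```python
# Inspect and override nested pipeline parameters by their prefixed names.
print(pipe.get_params()["vect__ngram_range"])   # (1, 1)
pipe.set_params(vect__ngram_range=(1, 2), clf__alpha=0.5)
```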

Model Plot

The model plot is below.

```python
Pipeline(steps=[('vect', CountVectorizer()), ('tfidf', TfidfTransformer()),
                ('clf', MultinomialNB())])
```

Evaluation Results

You can find details about the evaluation process and the evaluation results below. No evaluation metrics have been reported for this model yet.

| Metric | Value |
|---|---|
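Since no metrics are reported, the sketch below shows one way the model could be evaluated, reusing the trained pipe from the first example and the held-out 20 newsgroups test split; the accuracy it prints is not an official result.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.metrics import accuracy_score

# Score the trained pipeline on the 20 newsgroups test split.
test = fetch_20newsgroups(subset="test")
print(accuracy_score(test.target, pipe.predict(test.data)))
```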

How to Get Started with the Model

Use the code below to get started with the model.

```python
import pickle

# pkl_filename is not defined in the original snippet; a hypothetical path
# to the downloaded model file:
pkl_filename = "model.pkl"

with open(pkl_filename, "rb") as file:
    clf = pickle.load(file)
```
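Once loaded, clf is the full pipeline, so raw documents can be passed straight to predict. A hypothetical usage example:

```python
# The pipeline handles vectorization internally, so pass raw strings.
labels = clf.predict(["My graphics card crashes when rendering OpenGL scenes"])
print(labels)  # array of predicted newsgroup class indices
```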

Model Card Authors

This model card is written by the following authors:

merve

Model Card Contact

You can contact the model card authors through the following channels: [More Information Needed]

Citation

Below you can find citation information.

BibTeX:

```bibtex
@inproceedings{...,year={2020}}
```