ONNX conversion of typeform/distilbert-base-uncased-mnli

Conversion of typeform/distilbert-base-uncased-mnli
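
The conversion can be reproduced with Hugging Face Optimum. The snippet below is a minimal sketch rather than the exact command used for this repository; it assumes a recent Optimum release where from_pretrained accepts export=True, and the output directory name is illustrative.

from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer

model_id = "typeform/distilbert-base-uncased-mnli"

# export=True converts the PyTorch checkpoint to ONNX on the fly.
ort_model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Save the ONNX graph and tokenizer files together (directory name is an assumption).
ort_model.save_pretrained("distilbert-base-uncased-mnli-onnx")
tokenizer.save_pretrained("distilbert-base-uncased-mnli-onnx")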

This is the uncased DistilBERT model fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset for the zero-shot classification task. The model is not case-sensitive, i.e., it makes no distinction between "english" and "English".
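
As a usage sketch, the model can be run through the transformers zero-shot-classification pipeline; the premise and candidate labels below are illustrative.

from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

result = classifier(
    "Which movie should I watch this weekend?",
    candidate_labels=["entertainment", "politics", "sports"],
)

# Labels are returned sorted by score, highest first.
print(result["labels"][0], round(result["scores"][0], 3))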

Training

Training was done on a p3.2xlarge AWS EC2 instance (1 NVIDIA Tesla V100 GPU), with the following hyperparameters:

$ python run_glue.py \
    --model_name_or_path distilbert-base-uncased \
    --task_name mnli \
    --do_train \
    --do_eval \
    --max_seq_length 128 \
    --per_device_train_batch_size 16 \
    --learning_rate 2e-5 \
    --num_train_epochs 5 \
    --output_dir /tmp/distilbert-base-uncased_mnli/

Evaluation results

Task        MNLI    MNLI-mm
Accuracy    82.0    82.0