
This model was distilled from BERTimbau.

## Usage

```python
from transformers import AutoTokenizer  # or BertTokenizer
from transformers import AutoModelForPreTraining  # or BertForPreTraining, to load the pretraining heads
from transformers import AutoModel  # or BertModel, for BERT without the pretraining heads

model = AutoModelForPreTraining.from_pretrained('adalbertojunior/distilbert-portuguese-cased')
tokenizer = AutoTokenizer.from_pretrained('adalbertojunior/distilbert-portuguese-cased', do_lower_case=False)
```
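Because the checkpoint keeps the masked-language-modeling head, it can also be queried directly through the `fill-mask` pipeline. A minimal sketch (the Portuguese example sentence is illustrative):

```python
from transformers import pipeline

# Load the distilled checkpoint into a fill-mask pipeline.
unmasker = pipeline('fill-mask', model='adalbertojunior/distilbert-portuguese-cased')

# Illustrative sentence; [MASK] is the model's mask token.
for prediction in unmasker('Tinha uma [MASK] no meio do caminho.'):
    print(prediction['token_str'], prediction['score'])
```

Each prediction is a dict with the filled token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).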

You should fine-tune it on your own data.
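A minimal fine-tuning sketch for binary text classification, assuming a small labeled dataset (the texts, labels, and output directory below are placeholders, not part of the original card):

```python
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = 'adalbertojunior/distilbert-portuguese-cased'
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
# A classification head is initialized on top of the distilled encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder examples; replace with your own labeled data.
texts = ['Ótimo produto, recomendo.', 'Péssimo atendimento.']
labels = [1, 0]

encodings = tokenizer(texts, truncation=True, padding=True, return_tensors='pt')

class TinyDataset(torch.utils.data.Dataset):
    """Wraps tokenized inputs and labels for the Trainer."""
    def __init__(self, encodings, labels):
        self.encodings = encodings
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: v[idx] for k, v in self.encodings.items()}
        item['labels'] = torch.tensor(self.labels[idx])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir='out', num_train_epochs=1,
                           per_device_train_batch_size=2, report_to='none'),
    train_dataset=TinyDataset(encodings, labels),
)
trainer.train()
```

In practice you would use a real dataset, a train/validation split, and more epochs; this only shows the wiring.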

On some tasks it achieves up to 99% of the original BERTimbau's accuracy.

## Citation

```bibtex
@misc{adalberto_ferreira_barbosa_junior_2024,
    author    = { {Adalberto Ferreira Barbosa Junior} },
    title     = { distilbert-portuguese-cased (Revision df1fa7a) },
    year      = 2024,
    url       = { https://huggingface.co/adalbertojunior/distilbert-portuguese-cased },
    doi       = { 10.57967/hf/3041 },
    publisher = { Hugging Face }
}
```