Danish BERT (version 2, uncased) by Certainly (previously known as BotXO).

All credit goes to Certainly (previously known as BotXO), who developed Danish BERT. For data and training details see their GitHub repository or this article. You can also visit their organization page on Hugging Face.

The model is available in both TensorFlow and PyTorch formats.

The original TensorFlow version can be downloaded using this link.

Here is an example of how to load Danish BERT in PyTorch using the 🤗 Transformers library:

from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("Maltehb/danish-bert-botxo")
model = AutoModelForPreTraining.from_pretrained("Maltehb/danish-bert-botxo")
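
Once loaded, the pretraining (masked-language-modelling) head can be queried directly. Below is a minimal sketch using the 🤗 Transformers fill-mask pipeline; the Danish example sentence is purely illustrative and not taken from the original model card.

from transformers import pipeline

# Minimal sketch: query the pretrained masked-language-modelling head
# through the fill-mask pipeline. The example sentence is illustrative only.
fill_mask = pipeline("fill-mask", model="Maltehb/danish-bert-botxo")

# "København er [MASK] i Danmark." ~ "Copenhagen is [MASK] in Denmark."
for prediction in fill_mask("København er [MASK] i Danmark."):
    print(prediction["token_str"], round(prediction["score"], 3))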