Introducing BEREL 3.0 - New and Improved BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL 3.0, please cite:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]

1. Usage:

```python
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL_3.0')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL_3.0')

# for evaluation, disable dropout
model.eval()
```

NOTE: This model will not work correctly and will produce degraded results if loaded with `BertTokenizer`. Please use `AutoTokenizer` or `BertTokenizerFast` instead.
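Once the model and tokenizer are loaded, a masked token is predicted by taking the highest-scoring vocabulary entry at the `[MASK]` position of the logits that `BertForMaskedLM` returns. The sketch below illustrates just that selection step with a toy vocabulary and hand-made scores (no model download); the Hebrew tokens and numbers are illustrative placeholders, not actual BEREL vocabulary entries.

```python
# Illustrative sketch only: how a fill-mask prediction is read off
# masked-LM logits. Real logits have shape [batch, seq_len, vocab_size];
# here we fake a tiny vocabulary and a 3-token sequence.

toy_vocab = ["[PAD]", "[MASK]", "שבת", "תורה", "הלכה"]  # hypothetical tokens

# Pretend per-position scores over the toy vocabulary; position 1 is [MASK].
toy_logits = [
    [0.1, 0.0, 0.2, 0.3, 0.1],  # position 0
    [0.0, 0.1, 2.5, 0.7, 0.3],  # position 1 (the masked slot)
    [0.2, 0.1, 0.0, 0.4, 0.2],  # position 2
]

mask_position = 1
scores = toy_logits[mask_position]
# argmax over the vocabulary axis at the masked position
predicted_id = max(range(len(scores)), key=scores.__getitem__)
predicted_token = toy_vocab[predicted_id]
print(predicted_token)  # the highest-scoring token fills the mask
```

With the real model, the same argmax is applied to `model(**tokenizer(text, return_tensors='pt')).logits` at the index of the `[MASK]` token, and `tokenizer.convert_ids_to_tokens` maps the id back to a token string.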

Model size: 184M parameters (Safetensors; tensor types I64 and F32)
