Update 2025-5-12: This model is BEREL version 1.0. We are now happy to provide a much-improved BEREL_3.0.

Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL, please cite:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]

  1. Usage:

```python
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')

# for evaluation, disable dropout
model.eval()
```
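Once the model is loaded, `model(**inputs).logits` gives a score for every vocabulary entry at every position; filling a masked token amounts to ranking those scores at the `[MASK]` position. A minimal sketch of that ranking step, assuming the masked position's logits have been extracted as a plain list of floats (`top_k_predictions` is a hypothetical helper, not part of `transformers`):

```python
import math

def top_k_predictions(logits, k=5):
    """Rank vocabulary ids by softmax probability at a masked position.

    logits: raw scores over the vocabulary, e.g. the row of
    model(**inputs).logits at the [MASK] position, as a Python list.
    """
    m = max(logits)  # subtract the max before exponentiating, for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # indices of the k highest-probability vocabulary entries
    return sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:k]
```

The returned ids can then be mapped back to subword strings with `tokenizer.convert_ids_to_tokens`.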
