This repository contains the model described in the paper "A Systematic Investigation of Distilling Large Language Models into Cross-Encoders for Passage Re-ranking".

The code for training and evaluation can be found at https://github.com/webis-de/msmarco-llm-distillation.
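As a minimal usage sketch, the checkpoint can presumably be scored as a query–passage cross-encoder. The snippet below assumes it loads through the standard Hugging Face `transformers` sequence-classification interface; it is an illustration, not the authors' official usage example (see the linked repository for that).

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the checkpoint exposes a single relevance logit per
# query-passage pair via the sequence-classification head.
name = "webis/monoelectra-large"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
model.eval()

query = "how do cross-encoders rank passages?"
passages = [
    "Cross-encoders jointly encode a query and a passage to produce a relevance score.",
    "The weather in Paris is mild in spring.",
]

# Encode each (query, passage) pair jointly, as a cross-encoder requires.
inputs = tokenizer(
    [query] * len(passages), passages,
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)

# Re-rank passages by descending relevance score.
ranking = sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True)
```

Higher scores indicate higher estimated relevance; `ranking` lists the passages in re-ranked order.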

Model size: 334M parameters (Safetensors, F32 tensors)
