This model uses two BERT models fine-tuned with a contrastive learning objective. One encoder is responsible for short queries, and the other for the longer documents that contain the answer to a query. After encoding a collection of documents, you can run a nearest-neighbor search against the query encoding to fetch the most relevant documents.
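A minimal sketch of the dual-encoder retrieval flow is below. The checkpoint names, [CLS] pooling, and cosine similarity are assumptions for illustration, not taken from this repository; see inference.py for the actual procedure.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
query_encoder = AutoModel.from_pretrained("bert-base-uncased")  # placeholder for the fine-tuned query BERT
doc_encoder = AutoModel.from_pretrained("bert-base-uncased")    # placeholder for the fine-tuned document BERT

def encode(texts, model):
    # Tokenize and use the [CLS] embedding as the sentence vector (assumed pooling).
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    emb = out.last_hidden_state[:, 0]
    return torch.nn.functional.normalize(emb, dim=-1)

# Encode the document collection once, then encode each incoming query.
docs = ["BERT is a transformer encoder.", "Paris is the capital of France."]
doc_embs = encode(docs, doc_encoder)
query_emb = encode(["what is the capital of France?"], query_encoder)

# Nearest-neighbor search: with normalized vectors, dot product equals cosine similarity.
scores = query_emb @ doc_embs.T
best = scores.argmax(dim=-1).item()
print(docs[best])
```

In practice the precomputed document embeddings would be stored in an approximate nearest-neighbor index (e.g. FAISS) rather than compared by brute force.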

Take a look at inference.py to see how to perform inference.
