MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers
Paper: https://arxiv.org/abs/2212.07841
Code: https://github.com/microsoft/SimXNS/tree/main/MASTER
Overview
This is the checkpoint after pre-training on the Wikipedia corpora of NQ, TriviaQA (TQ), WebQuestions (WQ), and SQuAD. You may use this checkpoint as the initialization for fine-tuning.
Usage
To load this checkpoint for initialization:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-wiki')
```
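Beyond loading, a minimal sketch of using the checkpoint as a dense retriever encoder is shown below. This is an illustration under assumptions not stated in the model card: that the checkpoint is BERT-base compatible, that the standard `bert-base-uncased` tokenizer applies, and that the `[CLS]` hidden state is taken as the dense embedding (the common convention for bottlenecked retriever pre-training).

```python
# Sketch: encode a query and a passage, then score relevance by dot product.
# Assumptions (not confirmed by the model card): bert-base-uncased tokenizer,
# [CLS] pooling for the dense embedding.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-wiki')
model.eval()

def encode(texts):
    # Tokenize a batch and take the [CLS] hidden state as the embedding.
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=128, return_tensors='pt')
    with torch.no_grad():
        out = model(**batch)
    return out.last_hidden_state[:, 0]  # shape: (batch, hidden_size)

query_emb = encode(['who wrote the declaration of independence'])
passage_emb = encode(['The Declaration of Independence was drafted '
                      'by Thomas Jefferson.'])
score = (query_emb @ passage_emb.T).item()  # dot-product relevance score
```

In practice you would fine-tune this encoder on labeled query-passage pairs before relying on the scores; the pre-trained checkpoint is intended only as the initialization.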