SPLADE sparse retrieval models based on BERT-Tiny (4M parameters) and BERT-Mini (11M parameters), distilled from a cross-encoder on the MS MARCO dataset
Yosef Worku Alemneh
rasyosef
AI & ML interests
Pretraining, Supervised Fine Tuning, Direct Preference Optimization, Retrieval Augmented Generation (RAG), Function Calling
Recent Activity
- Commented on the article "Training and Finetuning Sparse Embedding Models with Sentence Transformers v5" (about 9 hours ago)
- New activity on naver/splade-v3: "SPLADE-Index python package: An ultra-fast search index for SPLADE sparse retrieval models" (about 10 hours ago)
- Updated the model rasyosef/splade-tiny (6 days ago)
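SPLADE models such as the ones listed above encode queries and documents as sparse vocabulary-sized weight vectors, and relevance is scored as the dot product over the tokens the two vectors share. A minimal sketch of that scoring step (the token weights below are invented for illustration, not actual model output):

```python
# Illustrative sketch of SPLADE-style sparse scoring.
# Each text is represented as a {token: weight} dict; the relevance
# score is the dot product over tokens present in both vectors.

def sparse_dot(query, doc):
    """Dot product of two sparse term-weight dicts."""
    # Iterate over the smaller dict and look up matches in the larger one.
    small, large = (query, doc) if len(query) <= len(doc) else (doc, query)
    return sum(w * large[t] for t, w in small.items() if t in large)

# Invented expansion weights for illustration only.
query = {"splade": 2.1, "sparse": 1.4, "retrieval": 1.2}
doc_a = {"splade": 1.8, "index": 0.9, "retrieval": 1.0}  # on-topic document
doc_b = {"dense": 1.5, "embedding": 1.1}                 # off-topic document

scores = {name: sparse_dot(query, d) for name, d in [("a", doc_a), ("b", doc_b)]}
# doc_a shares "splade" and "retrieval" with the query, so it outranks doc_b.
```

In practice the sparse vectors would come from a trained model (e.g. via the Sentence Transformers `SparseEncoder` class mentioned in the article above), and an inverted index such as SPLADE-Index avoids scoring documents with no overlapping tokens at all.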