---

language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sparse-encoder
- sparse
- splade
- generated_from_trainer
- dataset_size:99000
- loss:SpladeLoss
- loss:SparseDistillKLDivLoss
- loss:FlopsLoss
base_model: Luyu/co-condenser-marco
widget:
- text: 'The ejection fraction may decrease if: 1  You have weakness of your heart
    muscle, such as dilated cardiomyopathy, which can be caused by a heart muscle
    problem, familial (genetic) cardiomyopathy, or systemic illnesses. 2  A heart
    attack has damaged your heart.  You have problems with your heart''s valves.'
- text: "One thing we avoided: Lots of alternative slime recipes swap Borax for liquid\
    \ starch, shampoo, body wash, hand soap, contact lens solution, or laundry detergent.\
    \ Those may seem benign — and they might be — but many of them contain derivatives\
    \ or relatives of sodium borate too."
- text: how do i get my mvr in pa
- text: English is a language whose vocabulary is the composite of a surprising range
    of influences. We have pillaged words from Latin, Greek, Dutch, Arabic, Old Norse,
    Spanish, Italian, Hindi, and more besides to make English what it is today.
- text: Weed Eater was a string trimmer company founded in 1971 in Houston, Texas
    by George C. Ballas, Sr. , the inventor of the device. The idea for the Weed Eater
    trimmer came to him from the spinning nylon bristles of an automatic car wash.He
    thought that he could come up with a similar technique to protect the bark on
    trees that he was trimming around. His company was eventually bought by Emerson
    Electric and merged with Poulan.Poulan/Weed Eater was later purchased by Electrolux,
    which spun off the outdoors division as Husqvarna AB in 2006.Inventor Ballas was
    the father of champion ballroom dancer Corky Ballas and the grandfather of Dancing
    with the Stars dancer Mark Ballas.George Ballas died on June 25, 2011.he idea
    for the Weed Eater trimmer came to him from the spinning nylon bristles of an
    automatic car wash. He thought that he could come up with a similar technique
    to protect the bark on trees that he was trimming around. His company was eventually
    bought by Emerson Electric and merged with Poulan.
pipeline_tag: feature-extraction
library_name: sentence-transformers
metrics:
- dot_accuracy@1
- dot_accuracy@3
- dot_accuracy@5
- dot_accuracy@10
- dot_precision@1
- dot_precision@3
- dot_precision@5
- dot_precision@10
- dot_recall@1
- dot_recall@3
- dot_recall@5
- dot_recall@10
- dot_ndcg@10
- dot_mrr@10
- dot_map@100
- query_active_dims
- query_sparsity_ratio
- corpus_active_dims
- corpus_sparsity_ratio
co2_eq_emissions:
  emissions: 78.12595691469743
  energy_consumed: 0.2009919087493695
  source: codecarbon
  training_type: fine-tuning
  on_cloud: false
  cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
  ram_total_size: 31.777088165283203
  hours_used: 0.571
  hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: CoCondenser finetuned on MS MARCO
  results:
  - task:
      type: sparse-information-retrieval
      name: Sparse Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - type: dot_accuracy@1
      value: 0.42
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.56
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.72
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 0.9
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.42
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.18666666666666668
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.14400000000000002
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.08999999999999998
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.42
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.56
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.72
      name: Dot Recall@5
    - type: dot_recall@10
      value: 0.9
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.6291399713464962
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.5467460317460318
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.5503396478777393
      name: Dot Map@100
    - type: query_active_dims
      value: 23.31999969482422
      name: Query Active Dims
    - type: query_sparsity_ratio
      value: 0.9992359609562013
      name: Query Sparsity Ratio
    - type: corpus_active_dims
      value: 257.3004150390625
      name: Corpus Active Dims
    - type: corpus_sparsity_ratio
      value: 0.9915700014730665
      name: Corpus Sparsity Ratio
  - task:
      type: sparse-information-retrieval
      name: Sparse Information Retrieval
    dataset:
      name: NanoNFCorpus
      type: NanoNFCorpus
    metrics:
    - type: dot_accuracy@1
      value: 0.44
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.56
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.6
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 0.66
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.44
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.38666666666666666
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.32800000000000007
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.272
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.041590314149379026
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.07672442108786207
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.09154300468916865
      name: Dot Recall@5
    - type: dot_recall@10
      value: 0.1433130618338512
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.33898990155781883
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.5123809523809524
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.1505453583653259
      name: Dot Map@100
    - type: query_active_dims
      value: 21.260000228881836
      name: Query Active Dims
    - type: query_sparsity_ratio
      value: 0.9993034532393394
      name: Query Sparsity Ratio
    - type: corpus_active_dims
      value: 494.8533630371094
      name: Corpus Active Dims
    - type: corpus_sparsity_ratio
      value: 0.9837869941996883
      name: Corpus Sparsity Ratio
  - task:
      type: sparse-information-retrieval
      name: Sparse Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - type: dot_accuracy@1
      value: 0.44
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.74
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.78
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 0.84
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.44
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.2533333333333333
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.16399999999999998
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.08999999999999998
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.42
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.7
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.74
      name: Dot Recall@5
    - type: dot_recall@10
      value: 0.81
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.6304630848492498
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.5837460317460317
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.5712846533262134
      name: Dot Map@100
    - type: query_active_dims
      value: 28.0
      name: Query Active Dims
    - type: query_sparsity_ratio
      value: 0.9990826289233995
      name: Query Sparsity Ratio
    - type: corpus_active_dims
      value: 290.61212158203125
      name: Corpus Active Dims
    - type: corpus_sparsity_ratio
      value: 0.9904786016125408
      name: Corpus Sparsity Ratio
  - task:
      type: sparse-nano-beir
      name: Sparse Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - type: dot_accuracy@1
      value: 0.43333333333333335
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.62
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.6999999999999998
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 0.7999999999999999
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.43333333333333335
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.27555555555555555
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.21200000000000005
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.15066666666666664
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.29386343804979304
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.44557480702928737
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.5171810015630562
      name: Dot Recall@5
    - type: dot_recall@10
      value: 0.6177710206112837
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.5328643192511883
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.5476243386243386
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.42405655318975954
      name: Dot Map@100
    - type: query_active_dims
      value: 24.19333330790202
      name: Query Active Dims
    - type: query_sparsity_ratio
      value: 0.9992073477063135
      name: Query Sparsity Ratio
    - type: corpus_active_dims
      value: 324.00429792464917
      name: Corpus Active Dims
    - type: corpus_sparsity_ratio
      value: 0.9893845652996315
      name: Corpus Sparsity Ratio
---


# CoCondenser finetuned on MS MARCO

This is a [SPLADE Sparse Encoder](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) model finetuned from [Luyu/co-condenser-marco](https://huggingface.co/Luyu/co-condenser-marco) using the [sentence-transformers](https://www.SBERT.net) library. It maps sentences & paragraphs to a 30522-dimensional sparse vector space and can be used for semantic search and sparse retrieval.

## Model Details

### Model Description
- **Model Type:** SPLADE Sparse Encoder
- **Base model:** [Luyu/co-condenser-marco](https://huggingface.co/Luyu/co-condenser-marco) <!-- at revision e0cef0ab2410aae0f0994366ddefb5649a266709 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 30522 dimensions
- **Similarity Function:** Dot Product
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder)

### Full Model Architecture

```
SparseEncoder(
  (0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertForMaskedLM'})
  (1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30522})
)
```
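
For intuition, the `SpladePooling` module turns the token-level MLM logits produced by the `MLMTransformer` into a single sparse lexical vector. The snippet below is an illustrative sketch of the standard SPLADE formulation (a log-saturated ReLU followed by max pooling over the sequence), not the library's internal implementation:

```python
import torch

def splade_pool(mlm_logits: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Sketch of SPLADE max pooling over MLM logits.

    mlm_logits:     (batch, seq_len, vocab_size) token-level MLM scores
    attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding
    Returns a (batch, vocab_size) sparse lexical embedding.
    """
    # log(1 + ReLU(x)) zeroes out negative logits and saturates large ones
    weights = torch.log1p(torch.relu(mlm_logits))
    # mask padding positions so they cannot win the max
    weights = weights * attention_mask.unsqueeze(-1)
    # max pooling over the sequence dimension yields one weight per vocabulary term
    return weights.max(dim=1).values
```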

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SparseEncoder

# Download from the 🤗 Hub
model = SparseEncoder("tomaarsen/splade-cocondenser-kldiv-marginmse-minilm-temp-4")

# Run inference
queries = [
    "who started gladiator lacrosse",
]
documents = [
    'Weed Eater was a string trimmer company founded in 1971 in Houston, Texas by George C. Ballas, Sr. , the inventor of the device. The idea for the Weed Eater trimmer came to him from the spinning nylon bristles of an automatic car wash.He thought that he could come up with a similar technique to protect the bark on trees that he was trimming around. His company was eventually bought by Emerson Electric and merged with Poulan.Poulan/Weed Eater was later purchased by Electrolux, which spun off the outdoors division as Husqvarna AB in 2006.Inventor Ballas was the father of champion ballroom dancer Corky Ballas and the grandfather of Dancing with the Stars dancer Mark Ballas.George Ballas died on June 25, 2011.he idea for the Weed Eater trimmer came to him from the spinning nylon bristles of an automatic car wash. He thought that he could come up with a similar technique to protect the bark on trees that he was trimming around. His company was eventually bought by Emerson Electric and merged with Poulan.',
    "The earliest types of gladiator were named after Rome's enemies of that time: the Samnite, Thracian and Gaul. The Samnite, heavily armed, elegantly helmed and probably the most popular type, was renamed Secutor and the Gaul renamed Murmillo, once these former enemies had been conquered then absorbed into Rome's Empire.",
    'Summit Hill, PA. Sponsored Topics. Summit Hill is a borough in Carbon County, Pennsylvania, United States. The population was 2,974 at the 2000 census. Summit Hill is located at 40°49′39″N 75°51′57″W / 40.8275°N 75.86583°W / 40.8275; -75.86583 (40.827420, -75.865892).',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 30522] [3, 30522]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[19.3181, 29.9645, 13.8348]])
```
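
Since each of the 30522 dimensions corresponds to a vocabulary token, the embeddings are directly interpretable. A short sketch, assuming the `decode` helper available on `SparseEncoder` in recent sentence-transformers releases:

```python
# Inspect which vocabulary terms the query embedding activates, and how strongly
decoded_query = model.decode(query_embeddings[0], top_k=10)
for token, weight in decoded_query:
    print(f"{token}: {weight:.2f}")
```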

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Sparse Information Retrieval

* Datasets: `NanoMSMARCO`, `NanoNFCorpus` and `NanoNQ`
* Evaluated with [<code>SparseInformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseInformationRetrievalEvaluator)

| Metric                | NanoMSMARCO | NanoNFCorpus | NanoNQ     |
|:----------------------|:------------|:-------------|:-----------|
| dot_accuracy@1        | 0.42        | 0.44         | 0.44       |
| dot_accuracy@3        | 0.56        | 0.56         | 0.74       |
| dot_accuracy@5        | 0.72        | 0.6          | 0.78       |
| dot_accuracy@10       | 0.9         | 0.66         | 0.84       |
| dot_precision@1       | 0.42        | 0.44         | 0.44       |
| dot_precision@3       | 0.1867      | 0.3867       | 0.2533     |
| dot_precision@5       | 0.144       | 0.328        | 0.164      |
| dot_precision@10      | 0.09        | 0.272        | 0.09       |
| dot_recall@1          | 0.42        | 0.0416       | 0.42       |
| dot_recall@3          | 0.56        | 0.0767       | 0.7        |
| dot_recall@5          | 0.72        | 0.0915       | 0.74       |
| dot_recall@10         | 0.9         | 0.1433       | 0.81       |
| **dot_ndcg@10**       | **0.6291**  | **0.339**    | **0.6305** |
| dot_mrr@10            | 0.5467      | 0.5124       | 0.5837     |
| dot_map@100           | 0.5503      | 0.1505       | 0.5713     |
| query_active_dims     | 23.32       | 21.26        | 28.0       |
| query_sparsity_ratio  | 0.9992      | 0.9993       | 0.9991     |
| corpus_active_dims    | 257.3004    | 494.8534     | 290.6121   |
| corpus_sparsity_ratio | 0.9916      | 0.9838       | 0.9905     |



#### Sparse Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [<code>SparseNanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseNanoBEIREvaluator) with these parameters:
  ```json
  {
      "dataset_names": [
          "msmarco",
          "nfcorpus",
          "nq"
      ]
  }
  ```

| Metric                | Value      |
|:----------------------|:-----------|
| dot_accuracy@1        | 0.4333     |
| dot_accuracy@3        | 0.62       |
| dot_accuracy@5        | 0.7        |
| dot_accuracy@10       | 0.8        |
| dot_precision@1       | 0.4333     |
| dot_precision@3       | 0.2756     |
| dot_precision@5       | 0.212      |
| dot_precision@10      | 0.1507     |
| dot_recall@1          | 0.2939     |
| dot_recall@3          | 0.4456     |
| dot_recall@5          | 0.5172     |
| dot_recall@10         | 0.6178     |
| **dot_ndcg@10**       | **0.5329** |
| dot_mrr@10            | 0.5476     |
| dot_map@100           | 0.4241     |
| query_active_dims     | 24.1933    |
| query_sparsity_ratio  | 0.9992     |
| corpus_active_dims    | 324.0043   |
| corpus_sparsity_ratio | 0.9894     |
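
These figures can be reproduced by running the evaluator directly. A minimal sketch, using the metric key that also appears in the training logs below:

```python
from sentence_transformers import SparseEncoder
from sentence_transformers.sparse_encoder.evaluation import SparseNanoBEIREvaluator

model = SparseEncoder("tomaarsen/splade-cocondenser-kldiv-marginmse-minilm-temp-4")

# The same three NanoBEIR subsets reported above
evaluator = SparseNanoBEIREvaluator(dataset_names=["msmarco", "nfcorpus", "nq"])
results = evaluator(model)
print(results["NanoBEIR_mean_dot_ndcg@10"])
```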

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 99,000 training samples
* Columns: <code>query</code>, <code>positive</code>, <code>negative</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | query                                                                           | positive                                                                            | negative                                                                            | label                              |
  |:--------|:--------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------|
  | type    | string                                                                          | string                                                                              | string                                                                              | list                               |
  | details | <ul><li>min: 4 tokens</li><li>mean: 9.2 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 79.86 tokens</li><li>max: 219 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 79.96 tokens</li><li>max: 270 tokens</li></ul> | <ul><li>size: 2 elements</li></ul> |
* Samples:
  | query                                                                      | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         | label                                                  |
  |:---------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------|
  | <code>rtn tv network</code>                                                | <code>Home Shopping Network. Home Shopping Network (HSN) is an American broadcast, basic cable and satellite television network that is owned by HSN, Inc. (NASDAQ: HSNI), which also owns catalog company Cornerstone Brands. Based in St. Petersburg, Florida, United States, the home shopping channel has former and current sister channels in several other countries.</code>                                                                                                                                               | <code>The Public Switched Telephone Network - The public switched telephone network (PSTN) is the international network of circuit-switched telephones. Learn more about PSTN at HowStuffWorks. x</code>                                                                                                                                                                                                                                                                                                         | <code>[-1.0804121494293213, -5.908488750457764]</code> |
  | <code>how did president nixon react to the watergate investigation?</code> | <code>The Watergate scandal was a major political scandal that occurred in the United States during the early 1970s, following a break-in by five men at the Democratic National Committee headquarters at the Watergate office complex in Washington, D.C. on June 17, 1972, and President Richard Nixon's administration's subsequent attempt to cover up its involvement. After the five burglars were caught and the conspiracy was discovered, Watergate was investigated by the United States Congress. Meanwhile, N</code> | <code>The release of the tape was ordered by the Supreme Court on July 24, 1974, in a case known as United States v. Nixon. The court’s decision was unanimous. President Nixon released the tape on August 5. It was one of three conversations he had with Haldeman six days after the Watergate break-in. The tapes prove that he ordered a cover-up of the Watergate burglary. The Smoking Gun tape reveals that Nixon ordered the FBI to abandon its investigation of the break-in. [Read more…]</code> | <code>[4.117279052734375, 3.191757917404175]</code>    |
  | <code>what is a summary offense in pennsylvania</code>                     | <code>We provide cost effective house arrest and electronic monitoring services to magisterial district court systems throughout Pennsylvania including York, Harrisburg, Philadelphia and Allentown.In addition, we also serve the York County, Lancaster County and Chester County.e provide cost effective house arrest and electronic monitoring services to magisterial district court systems throughout Pennsylvania including York, Harrisburg, Philadelphia and Allentown.</code>                                        | <code>In order to be convicted of Simple Assault, one must cause bodily injury. To be convicted of Aggravated Assault, one must cause serious bodily injury. From my research, Pennsylvania law defines bodily injury as the impairment of physical condition or substantial pain.</code>                                                                                                                                                                                                                        | <code>[-8.954689025878906, -1.3361705541610718]</code> |
* Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters:
  ```json
  {
      "loss": "SparseDistillKLDivLoss",
      "lambda_corpus": 0.0005,
      "lambda_query": 0.0005
  }
  ```
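
For reference, a minimal sketch of how this loss configuration could be constructed; the sample rows are hypothetical stand-ins for the real training data, and the keyword names follow the parameters recorded above (they may differ in newer sentence-transformers releases):

```python
from datasets import Dataset
from sentence_transformers import SparseEncoder
from sentence_transformers.sparse_encoder.losses import SpladeLoss, SparseDistillKLDivLoss

model = SparseEncoder("Luyu/co-condenser-marco")

# Each sample holds a query, a positive passage, a negative passage, and a
# 2-element label with the teacher's scores for (positive, negative)
train_dataset = Dataset.from_dict({
    "query": ["what is a sparse encoder"],
    "positive": ["A sparse encoder maps text to a high-dimensional sparse vector."],
    "negative": ["Dense encoders produce low-dimensional continuous vectors."],
    "label": [[4.2, -1.3]],
})

# SpladeLoss wraps the KL-divergence distillation loss and adds FLOPS
# sparsity regularization on both query and document embeddings
loss = SpladeLoss(
    model,
    loss=SparseDistillKLDivLoss(model),
    lambda_corpus=0.0005,
    lambda_query=0.0005,
)
```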

### Evaluation Dataset

#### Unnamed Dataset

* Size: 1,000 evaluation samples
* Columns: <code>query</code>, <code>positive</code>, <code>negative</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | query                                                                            | positive                                                                            | negative                                                                            | label                              |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------|
  | type    | string                                                                           | string                                                                              | string                                                                              | list                               |
  | details | <ul><li>min: 4 tokens</li><li>mean: 9.12 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 78.91 tokens</li><li>max: 239 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 81.25 tokens</li><li>max: 239 tokens</li></ul> | <ul><li>size: 2 elements</li></ul> |
* Samples:
  | query                                                | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                              | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    | label                                                    |
  |:-----------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------|
  | <code>how long to cook roast beef for</code>         | <code>Roasting times for beef. Preheat your oven to 160°C (325°F) and use these cooking times to prepare a roast that's moist, tender and delicious. Your roast should be covered with foil for the first half of the roasting time to prevent drying the outer layer.3 to 5lb Joint 1½ to 2 hours.reheat your oven to 160°C (325°F) and use these cooking times to prepare a roast that's moist, tender and delicious. Your roast should be covered with foil for the first half of the roasting time to prevent drying the outer layer.</code> | <code>Estimating Cooking Time for Large Beef Roasts. If you roast at a steady 325F (160C), subtract 2 minutes or so per pound. If the roast is refrigerated just before going into the oven, add 2 or 3 minutes per pound. WARNING NOTES: Remember, the rib roast will continue to cook as it sets.</code>                                                                                                                                                                                                                                                                                                                  | <code>[6.501978874206543, 8.214995384216309]</code>      |
  | <code>definition of fire inspection</code>           | <code>Learn how to do a monthly fire extinguisher inspection in your workplace. Departments must assign an individual to inspect monthly the extinguishers in or adjacent to the department's facilities.1  Read Fire Extinguisher Types and Maintenance for more information.earn how to do a monthly fire extinguisher inspection in your workplace. Departments must assign an individual to inspect monthly the extinguishers in or adjacent to the department's facilities.</code>                                                               | <code>reconnaissance by fire-a method of reconnaissance in which fire is placed on a suspected enemy position in order to cause the enemy to disclose his presence by moving or returning fire. reconnaissance in force-an offensive operation designed to discover or test the enemy's strength (or to obtain other information). mission undertaken to obtain, by visual observation or other detection methods, information about the activities and resources of an enemy or potential enemy, or to secure data concerning the meteorological, hydrographic, or geographic characteristics of a particular area.</code> | <code>[-0.38299351930618286, -0.9372650384902954]</code> |
  | <code>how many stores does family dollar have</code> | <code>Property Spotlight: New Retail Center at Hamilton & Warner - Outlots Available!! Family Dollar is closing stores following a disappointing second quarter. Family Dollar Stores Inc. won’t just be cutting prices in an attempt to boost its business – it’ll be closing stores as well. The Matthews, N.C.-based discount retailer plans to shutter 370 under-performing shops, according to the Charlotte Business Journal.</code>                                                                                                      | <code>Glassdoor has 1,976 Family Dollar Stores reviews submitted anonymously by Family Dollar Stores employees. Read employee reviews and ratings on Glassdoor to decide if Family Dollar Stores is right for you.</code>                                                                                                                                                                                                                                                                                                                                                                                                   | <code>[4.726407527923584, 8.284608840942383]</code>      |
* Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters:
  ```json
  {
      "loss": "SparseDistillKLDivLoss",
      "lambda_corpus": 0.0005,
      "lambda_query": 0.0005
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
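
These map directly onto the trainer configuration. A minimal sketch, assuming the `SparseEncoderTrainer` / `SparseEncoderTrainingArguments` API from recent sentence-transformers releases, with `model`, `loss`, and the datasets constructed as in the sections above (`output_dir` is a hypothetical path); the exhaustive settings follow below:

```python
from sentence_transformers.sparse_encoder import SparseEncoderTrainer, SparseEncoderTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SparseEncoderTrainingArguments(
    output_dir="models/splade-cocondenser-msmarco",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SparseEncoderTrainer(
    model=model,                  # the SparseEncoder being finetuned
    args=args,
    train_dataset=train_dataset,  # the 99,000-sample distillation dataset
    eval_dataset=eval_dataset,    # the 1,000-sample evaluation split
    loss=loss,                    # SpladeLoss as configured above
)
trainer.train()
```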



#### All Hyperparameters

<details><summary>Click to expand</summary>



- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>

### Training Logs
| Epoch  | Step | Training Loss | Validation Loss | NanoMSMARCO_dot_ndcg@10 | NanoNFCorpus_dot_ndcg@10 | NanoNQ_dot_ndcg@10 | NanoBEIR_mean_dot_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:-----------------------:|:------------------------:|:------------------:|:-------------------------:|
| -1     | -1   | -             | -               | 0.0823                  | 0.0412                   | 0.0621             | 0.0619                    |
| 0.0162 | 100  | 740.8226      | -               | -                       | -                        | -                  | -                         |
| 0.0323 | 200  | 82.2666       | -               | -                       | -                        | -                  | -                         |
| 0.0485 | 300  | 3.3514        | -               | -                       | -                        | -                  | -                         |
| 0.0646 | 400  | 1.9689        | -               | -                       | -                        | -                  | -                         |
| 0.0808 | 500  | 1.8268        | 1.8327          | 0.1979                  | 0.1096                   | 0.2507             | 0.1861                    |
| 0.0970 | 600  | 1.8           | -               | -                       | -                        | -                  | -                         |
| 0.1131 | 700  | 1.613         | -               | -                       | -                        | -                  | -                         |
| 0.1293 | 800  | 1.5977        | -               | -                       | -                        | -                  | -                         |
| 0.1454 | 900  | 1.5886        | -               | -                       | -                        | -                  | -                         |
| 0.1616 | 1000 | 1.3922        | 1.2983          | 0.5044                  | 0.2715                   | 0.5851             | 0.4537                    |
| 0.1778 | 1100 | 1.3708        | -               | -                       | -                        | -                  | -                         |
| 0.1939 | 1200 | 1.383         | -               | -                       | -                        | -                  | -                         |
| 0.2101 | 1300 | 1.2148        | -               | -                       | -                        | -                  | -                         |
| 0.2262 | 1400 | 1.246         | -               | -                       | -                        | -                  | -                         |
| 0.2424 | 1500 | 1.2206        | 1.0998          | 0.5329                  | 0.2969                   | 0.5945             | 0.4748                    |
| 0.2586 | 1600 | 1.1962        | -               | -                       | -                        | -                  | -                         |
| 0.2747 | 1700 | 1.1546        | -               | -                       | -                        | -                  | -                         |
| 0.2909 | 1800 | 1.1319        | -               | -                       | -                        | -                  | -                         |
| 0.3070 | 1900 | 1.1656        | -               | -                       | -                        | -                  | -                         |
| 0.3232 | 2000 | 1.1196        | 0.9878          | 0.5667                  | 0.3283                   | 0.6106             | 0.5019                    |
| 0.3394 | 2100 | 1.0789        | -               | -                       | -                        | -                  | -                         |
| 0.3555 | 2200 | 1.0148        | -               | -                       | -                        | -                  | -                         |
| 0.3717 | 2300 | 1.042         | -               | -                       | -                        | -                  | -                         |
| 0.3878 | 2400 | 1.0274        | -               | -                       | -                        | -                  | -                         |
| 0.4040 | 2500 | 1.0041        | 0.8749          | 0.6059                  | 0.3346                   | 0.5942             | 0.5116                    |
| 0.4202 | 2600 | 1.0557        | -               | -                       | -                        | -                  | -                         |
| 0.4363 | 2700 | 1.0077        | -               | -                       | -                        | -                  | -                         |
| 0.4525 | 2800 | 1.0115        | -               | -                       | -                        | -                  | -                         |
| 0.4686 | 2900 | 0.8708        | -               | -                       | -                        | -                  | -                         |
| 0.4848 | 3000 | 0.8838        | 0.9321          | 0.5826                  | 0.3264                   | 0.6354             | 0.5148                    |
| 0.5010 | 3100 | 0.9103        | -               | -                       | -                        | -                  | -                         |
| 0.5171 | 3200 | 0.8586        | -               | -                       | -                        | -                  | -                         |
| 0.5333 | 3300 | 0.9286        | -               | -                       | -                        | -                  | -                         |
| 0.5495 | 3400 | 0.8645        | -               | -                       | -                        | -                  | -                         |
| 0.5656 | 3500 | 0.9522        | 0.8105          | 0.6164                  | 0.3378                   | 0.6131             | 0.5224                    |
| 0.5818 | 3600 | 0.8636        | -               | -                       | -                        | -                  | -                         |
| 0.5979 | 3700 | 0.8634        | -               | -                       | -                        | -                  | -                         |
| 0.6141 | 3800 | 0.8555        | -               | -                       | -                        | -                  | -                         |
| 0.6303 | 3900 | 0.8447        | -               | -                       | -                        | -                  | -                         |
| 0.6464 | 4000 | 0.8331        | 0.7699          | 0.6033                  | 0.3442                   | 0.6016             | 0.5164                    |
| 0.6626 | 4100 | 0.8292        | -               | -                       | -                        | -                  | -                         |
| 0.6787 | 4200 | 0.8273        | -               | -                       | -                        | -                  | -                         |
| 0.6949 | 4300 | 0.8381        | -               | -                       | -                        | -                  | -                         |
| 0.7111 | 4400 | 0.8035        | -               | -                       | -                        | -                  | -                         |
| 0.7272 | 4500 | 0.8166        | 0.7743          | 0.6018                  | 0.3394                   | 0.6060             | 0.5157                    |
| 0.7434 | 4600 | 0.8245        | -               | -                       | -                        | -                  | -                         |
| 0.7595 | 4700 | 0.7831        | -               | -                       | -                        | -                  | -                         |
| 0.7757 | 4800 | 0.8314        | -               | -                       | -                        | -                  | -                         |
| 0.7919 | 4900 | 0.7994        | -               | -                       | -                        | -                  | -                         |
| 0.8080 | 5000 | 0.8018        | 0.7058          | 0.6236                  | 0.3413                   | 0.6378             | 0.5342                    |
| 0.8242 | 5100 | 0.7652        | -               | -                       | -                        | -                  | -                         |
| 0.8403 | 5200 | 0.7458        | -               | -                       | -                        | -                  | -                         |
| 0.8565 | 5300 | 0.8158        | -               | -                       | -                        | -                  | -                         |
| 0.8727 | 5400 | 0.7887        | -               | -                       | -                        | -                  | -                         |
| 0.8888 | 5500 | 0.7372        | 0.7389          | 0.6251                  | 0.3476                   | 0.6327             | 0.5351                    |
| 0.9050 | 5600 | 0.8           | -               | -                       | -                        | -                  | -                         |
| 0.9211 | 5700 | 0.7724        | -               | -                       | -                        | -                  | -                         |
| 0.9373 | 5800 | 0.7578        | -               | -                       | -                        | -                  | -                         |
| 0.9535 | 5900 | 0.7536        | -               | -                       | -                        | -                  | -                         |
| 0.9696 | 6000 | 0.7982        | 0.7011          | 0.6289                  | 0.3396                   | 0.6308             | 0.5331                    |
| 0.9858 | 6100 | 0.8084        | -               | -                       | -                        | -                  | -                         |
| -1     | -1   | -             | -               | 0.6291                  | 0.3390                   | 0.6305             | 0.5329                    |





### Environmental Impact

Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
- **Energy Consumed**: 0.201 kWh
- **Carbon Emitted**: 0.078 kg of CO2
- **Hours Used**: 0.571 hours



### Training Hardware

- **On Cloud**: No
- **GPU Model**: 1 x NVIDIA GeForce RTX 3090
- **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
- **RAM Size**: 31.78 GB



### Framework Versions

- Python: 3.11.6
- Sentence Transformers: 4.2.0.dev0
- Transformers: 4.52.4
- PyTorch: 2.7.1+cu126
- Accelerate: 1.5.1
- Datasets: 2.21.0
- Tokenizers: 0.21.1



## Citation



### BibTeX



#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```



#### SpladeLoss

```bibtex
@misc{formal2022distillationhardnegativesampling,
      title={From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
      author={Thibault Formal and Carlos Lassance and Benjamin Piwowarski and Stéphane Clinchant},
      year={2022},
      eprint={2205.04733},
      archivePrefix={arXiv},
      primaryClass={cs.IR},
      url={https://arxiv.org/abs/2205.04733},
}
```



#### SparseDistillKLDivLoss

```bibtex
@misc{lin2020distillingdenserepresentationsranking,
      title={Distilling Dense Representations for Ranking using Tightly-Coupled Teachers},
      author={Sheng-Chieh Lin and Jheng-Hong Yang and Jimmy Lin},
      year={2020},
      eprint={2010.11386},
      archivePrefix={arXiv},
      primaryClass={cs.IR},
      url={https://arxiv.org/abs/2010.11386},
}
```



#### FlopsLoss

```bibtex
@article{paria2020minimizing,
    title={Minimizing flops to learn efficient sparse representations},
    author={Paria, Biswajit and Yeh, Chih-Kuan and Yen, Ian EH and Xu, Ning and Ravikumar, Pradeep and P{\'o}czos, Barnab{\'a}s},
    journal={arXiv preprint arXiv:2004.05665},
    year={2020}
}
```



<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->