id | author | task_category | tags | created_time | last_modified | downloads | likes | README | matched_task | matched_bigbio_names
---|---|---|---|---|---|---|---|---|---|---|
jinaai/jina-embedding-b-en-v1 | jinaai | sentence-similarity | [
"sentence-transformers",
"pytorch",
"t5",
"finetuner",
"feature-extraction",
"sentence-similarity",
"mteb",
"custom_code",
"en",
"dataset:jinaai/negation-dataset",
"arxiv:2307.11224",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-07-07T07:51:59 | 2025-01-06T16:31:20 | 2,074 | 6 | ---
datasets:
- jinaai/negation-dataset
language: en
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- finetuner
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
model-index:
- name: jina-embedding-b-en-v1
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 66.73134328358208
- type: ap
value: 28.30575908745204
- type: f1
value: 60.02420130946191
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 67.6068
- type: ap
value: 63.5899352938589
- type: f1
value: 65.64285334357656
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 31.178
- type: f1
value: 29.68460843733487
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.964
- type: map_at_10
value: 40.217999999999996
- type: map_at_100
value: 41.263
- type: map_at_1000
value: 41.277
- type: map_at_3
value: 35.183
- type: map_at_5
value: 38.045
- type: mrr_at_1
value: 25.107000000000003
- type: mrr_at_10
value: 40.272999999999996
- type: mrr_at_100
value: 41.318
- type: mrr_at_1000
value: 41.333
- type: mrr_at_3
value: 35.242000000000004
- type: mrr_at_5
value: 38.101
- type: ndcg_at_1
value: 24.964
- type: ndcg_at_10
value: 49.006
- type: ndcg_at_100
value: 53.446000000000005
- type: ndcg_at_1000
value: 53.813
- type: ndcg_at_3
value: 38.598
- type: ndcg_at_5
value: 43.74
- type: precision_at_1
value: 24.964
- type: precision_at_10
value: 7.724
- type: precision_at_100
value: 0.966
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 16.169
- type: precision_at_5
value: 12.191
- type: recall_at_1
value: 24.964
- type: recall_at_10
value: 77.24
- type: recall_at_100
value: 96.586
- type: recall_at_1000
value: 99.431
- type: recall_at_3
value: 48.506
- type: recall_at_5
value: 60.953
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 39.25203906042786
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 29.07648348376354
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.4029266143623
- type: mrr
value: 75.45750340764191
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 85.92280995704714
- type: cos_sim_spearman
value: 83.58082010833608
- type: euclidean_pearson
value: 48.64744162695948
- type: euclidean_spearman
value: 48.817377397301556
- type: manhattan_pearson
value: 48.87684776623195
- type: manhattan_spearman
value: 48.94268145725884
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.05519480519482
- type: f1
value: 83.94978356890618
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 32.2033276486685
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 26.631954164406014
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.625
- type: map_at_10
value: 40.037
- type: map_at_100
value: 41.52
- type: map_at_1000
value: 41.654
- type: map_at_3
value: 36.818
- type: map_at_5
value: 38.426
- type: mrr_at_1
value: 35.336
- type: mrr_at_10
value: 45.395
- type: mrr_at_100
value: 46.221000000000004
- type: mrr_at_1000
value: 46.264
- type: mrr_at_3
value: 42.823
- type: mrr_at_5
value: 44.204
- type: ndcg_at_1
value: 35.336
- type: ndcg_at_10
value: 46.326
- type: ndcg_at_100
value: 51.795
- type: ndcg_at_1000
value: 53.834
- type: ndcg_at_3
value: 41.299
- type: ndcg_at_5
value: 43.247
- type: precision_at_1
value: 35.336
- type: precision_at_10
value: 8.627
- type: precision_at_100
value: 1.428
- type: precision_at_1000
value: 0.197
- type: precision_at_3
value: 19.647000000000002
- type: precision_at_5
value: 13.733999999999998
- type: recall_at_1
value: 29.625
- type: recall_at_10
value: 59.165
- type: recall_at_100
value: 81.675
- type: recall_at_1000
value: 94.17
- type: recall_at_3
value: 44.485
- type: recall_at_5
value: 50.198
- type: map_at_1
value: 26.687
- type: map_at_10
value: 36.062
- type: map_at_100
value: 37.263000000000005
- type: map_at_1000
value: 37.397999999999996
- type: map_at_3
value: 32.967
- type: map_at_5
value: 34.75
- type: mrr_at_1
value: 33.885
- type: mrr_at_10
value: 42.632999999999996
- type: mrr_at_100
value: 43.305
- type: mrr_at_1000
value: 43.354
- type: mrr_at_3
value: 39.958
- type: mrr_at_5
value: 41.63
- type: ndcg_at_1
value: 33.885
- type: ndcg_at_10
value: 42.001
- type: ndcg_at_100
value: 46.436
- type: ndcg_at_1000
value: 48.774
- type: ndcg_at_3
value: 37.183
- type: ndcg_at_5
value: 39.605000000000004
- type: precision_at_1
value: 33.885
- type: precision_at_10
value: 7.962
- type: precision_at_100
value: 1.283
- type: precision_at_1000
value: 0.18
- type: precision_at_3
value: 17.855999999999998
- type: precision_at_5
value: 13.083
- type: recall_at_1
value: 26.687
- type: recall_at_10
value: 52.75
- type: recall_at_100
value: 71.324
- type: recall_at_1000
value: 86.356
- type: recall_at_3
value: 38.83
- type: recall_at_5
value: 45.23
- type: map_at_1
value: 34.02
- type: map_at_10
value: 45.751999999999995
- type: map_at_100
value: 46.867
- type: map_at_1000
value: 46.93
- type: map_at_3
value: 42.409
- type: map_at_5
value: 44.464999999999996
- type: mrr_at_1
value: 38.307
- type: mrr_at_10
value: 48.718
- type: mrr_at_100
value: 49.509
- type: mrr_at_1000
value: 49.542
- type: mrr_at_3
value: 46.007999999999996
- type: mrr_at_5
value: 47.766999999999996
- type: ndcg_at_1
value: 38.307
- type: ndcg_at_10
value: 51.666999999999994
- type: ndcg_at_100
value: 56.242000000000004
- type: ndcg_at_1000
value: 57.477999999999994
- type: ndcg_at_3
value: 45.912
- type: ndcg_at_5
value: 49.106
- type: precision_at_1
value: 38.307
- type: precision_at_10
value: 8.476
- type: precision_at_100
value: 1.176
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 20.522000000000002
- type: precision_at_5
value: 14.557999999999998
- type: recall_at_1
value: 34.02
- type: recall_at_10
value: 66.046
- type: recall_at_100
value: 85.817
- type: recall_at_1000
value: 94.453
- type: recall_at_3
value: 51.059
- type: recall_at_5
value: 58.667
- type: map_at_1
value: 23.939
- type: map_at_10
value: 32.627
- type: map_at_100
value: 33.617999999999995
- type: map_at_1000
value: 33.701
- type: map_at_3
value: 30.11
- type: map_at_5
value: 31.380000000000003
- type: mrr_at_1
value: 25.989
- type: mrr_at_10
value: 34.655
- type: mrr_at_100
value: 35.502
- type: mrr_at_1000
value: 35.563
- type: mrr_at_3
value: 32.109
- type: mrr_at_5
value: 33.426
- type: ndcg_at_1
value: 25.989
- type: ndcg_at_10
value: 37.657000000000004
- type: ndcg_at_100
value: 42.467
- type: ndcg_at_1000
value: 44.677
- type: ndcg_at_3
value: 32.543
- type: ndcg_at_5
value: 34.74
- type: precision_at_1
value: 25.989
- type: precision_at_10
value: 5.876
- type: precision_at_100
value: 0.8710000000000001
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 13.861
- type: precision_at_5
value: 9.626999999999999
- type: recall_at_1
value: 23.939
- type: recall_at_10
value: 51.28
- type: recall_at_100
value: 73.428
- type: recall_at_1000
value: 90.309
- type: recall_at_3
value: 37.245
- type: recall_at_5
value: 42.541000000000004
- type: map_at_1
value: 15.082
- type: map_at_10
value: 22.486
- type: map_at_100
value: 23.687
- type: map_at_1000
value: 23.807000000000002
- type: map_at_3
value: 20.076
- type: map_at_5
value: 21.362000000000002
- type: mrr_at_1
value: 18.532
- type: mrr_at_10
value: 26.605
- type: mrr_at_100
value: 27.628999999999998
- type: mrr_at_1000
value: 27.698
- type: mrr_at_3
value: 23.964
- type: mrr_at_5
value: 25.319000000000003
- type: ndcg_at_1
value: 18.532
- type: ndcg_at_10
value: 27.474999999999998
- type: ndcg_at_100
value: 33.357
- type: ndcg_at_1000
value: 36.361
- type: ndcg_at_3
value: 22.851
- type: ndcg_at_5
value: 24.87
- type: precision_at_1
value: 18.532
- type: precision_at_10
value: 5.210999999999999
- type: precision_at_100
value: 0.9329999999999999
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 11.235000000000001
- type: precision_at_5
value: 8.134
- type: recall_at_1
value: 15.082
- type: recall_at_10
value: 38.759
- type: recall_at_100
value: 64.621
- type: recall_at_1000
value: 86.162
- type: recall_at_3
value: 26.055
- type: recall_at_5
value: 31.208999999999996
- type: map_at_1
value: 24.759999999999998
- type: map_at_10
value: 33.706
- type: map_at_100
value: 35.0
- type: map_at_1000
value: 35.134
- type: map_at_3
value: 30.789
- type: map_at_5
value: 32.427
- type: mrr_at_1
value: 29.548000000000002
- type: mrr_at_10
value: 38.521
- type: mrr_at_100
value: 39.432
- type: mrr_at_1000
value: 39.494
- type: mrr_at_3
value: 35.691
- type: mrr_at_5
value: 37.424
- type: ndcg_at_1
value: 29.548000000000002
- type: ndcg_at_10
value: 39.301
- type: ndcg_at_100
value: 44.907000000000004
- type: ndcg_at_1000
value: 47.494
- type: ndcg_at_3
value: 34.08
- type: ndcg_at_5
value: 36.649
- type: precision_at_1
value: 29.548000000000002
- type: precision_at_10
value: 7.084
- type: precision_at_100
value: 1.169
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 15.881
- type: precision_at_5
value: 11.53
- type: recall_at_1
value: 24.759999999999998
- type: recall_at_10
value: 51.202000000000005
- type: recall_at_100
value: 74.542
- type: recall_at_1000
value: 91.669
- type: recall_at_3
value: 36.892
- type: recall_at_5
value: 43.333
- type: map_at_1
value: 23.247999999999998
- type: map_at_10
value: 31.878
- type: map_at_100
value: 33.135
- type: map_at_1000
value: 33.263999999999996
- type: map_at_3
value: 29.406
- type: map_at_5
value: 30.602
- type: mrr_at_1
value: 28.767
- type: mrr_at_10
value: 36.929
- type: mrr_at_100
value: 37.844
- type: mrr_at_1000
value: 37.913000000000004
- type: mrr_at_3
value: 34.589
- type: mrr_at_5
value: 35.908
- type: ndcg_at_1
value: 28.767
- type: ndcg_at_10
value: 37.172
- type: ndcg_at_100
value: 42.842
- type: ndcg_at_1000
value: 45.534
- type: ndcg_at_3
value: 32.981
- type: ndcg_at_5
value: 34.628
- type: precision_at_1
value: 28.767
- type: precision_at_10
value: 6.678000000000001
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 15.715000000000002
- type: precision_at_5
value: 10.913
- type: recall_at_1
value: 23.247999999999998
- type: recall_at_10
value: 48.16
- type: recall_at_100
value: 72.753
- type: recall_at_1000
value: 90.8
- type: recall_at_3
value: 35.961999999999996
- type: recall_at_5
value: 40.504
- type: map_at_1
value: 23.825583333333334
- type: map_at_10
value: 32.2845
- type: map_at_100
value: 33.48566666666667
- type: map_at_1000
value: 33.60833333333333
- type: map_at_3
value: 29.604916666666664
- type: map_at_5
value: 31.015333333333334
- type: mrr_at_1
value: 27.850916666666663
- type: mrr_at_10
value: 36.122416666666666
- type: mrr_at_100
value: 37.01275
- type: mrr_at_1000
value: 37.07566666666667
- type: mrr_at_3
value: 33.665749999999996
- type: mrr_at_5
value: 35.00916666666667
- type: ndcg_at_1
value: 27.850916666666663
- type: ndcg_at_10
value: 37.47625
- type: ndcg_at_100
value: 42.74433333333334
- type: ndcg_at_1000
value: 45.21991666666667
- type: ndcg_at_3
value: 32.70916666666667
- type: ndcg_at_5
value: 34.80658333333333
- type: precision_at_1
value: 27.850916666666663
- type: precision_at_10
value: 6.5761666666666665
- type: precision_at_100
value: 1.0879999999999999
- type: precision_at_1000
value: 0.15058333333333332
- type: precision_at_3
value: 14.933833333333336
- type: precision_at_5
value: 10.607249999999999
- type: recall_at_1
value: 23.825583333333334
- type: recall_at_10
value: 49.100500000000004
- type: recall_at_100
value: 72.21133333333334
- type: recall_at_1000
value: 89.34791666666666
- type: recall_at_3
value: 35.90525
- type: recall_at_5
value: 41.24583333333334
- type: map_at_1
value: 21.343
- type: map_at_10
value: 27.313
- type: map_at_100
value: 28.316999999999997
- type: map_at_1000
value: 28.406
- type: map_at_3
value: 25.06
- type: map_at_5
value: 26.409
- type: mrr_at_1
value: 23.313
- type: mrr_at_10
value: 29.467
- type: mrr_at_100
value: 30.348999999999997
- type: mrr_at_1000
value: 30.42
- type: mrr_at_3
value: 27.173000000000002
- type: mrr_at_5
value: 28.461
- type: ndcg_at_1
value: 23.313
- type: ndcg_at_10
value: 31.183
- type: ndcg_at_100
value: 36.252
- type: ndcg_at_1000
value: 38.582
- type: ndcg_at_3
value: 26.838
- type: ndcg_at_5
value: 29.042
- type: precision_at_1
value: 23.313
- type: precision_at_10
value: 4.9079999999999995
- type: precision_at_100
value: 0.808
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 11.299
- type: precision_at_5
value: 8.097999999999999
- type: recall_at_1
value: 21.343
- type: recall_at_10
value: 41.047
- type: recall_at_100
value: 64.372
- type: recall_at_1000
value: 81.499
- type: recall_at_3
value: 29.337000000000003
- type: recall_at_5
value: 34.756
- type: map_at_1
value: 16.595
- type: map_at_10
value: 23.433
- type: map_at_100
value: 24.578
- type: map_at_1000
value: 24.709999999999997
- type: map_at_3
value: 21.268
- type: map_at_5
value: 22.393
- type: mrr_at_1
value: 20.131
- type: mrr_at_10
value: 27.026
- type: mrr_at_100
value: 28.003
- type: mrr_at_1000
value: 28.083999999999996
- type: mrr_at_3
value: 24.966
- type: mrr_at_5
value: 26.064999999999998
- type: ndcg_at_1
value: 20.131
- type: ndcg_at_10
value: 27.846
- type: ndcg_at_100
value: 33.318999999999996
- type: ndcg_at_1000
value: 36.403
- type: ndcg_at_3
value: 23.883
- type: ndcg_at_5
value: 25.595000000000002
- type: precision_at_1
value: 20.131
- type: precision_at_10
value: 5.034000000000001
- type: precision_at_100
value: 0.9079999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 11.23
- type: precision_at_5
value: 8.032
- type: recall_at_1
value: 16.595
- type: recall_at_10
value: 37.576
- type: recall_at_100
value: 62.044
- type: recall_at_1000
value: 83.97
- type: recall_at_3
value: 26.631
- type: recall_at_5
value: 31.002000000000002
- type: map_at_1
value: 24.85
- type: map_at_10
value: 32.762
- type: map_at_100
value: 33.896
- type: map_at_1000
value: 34.006
- type: map_at_3
value: 29.965000000000003
- type: map_at_5
value: 31.485999999999997
- type: mrr_at_1
value: 28.731
- type: mrr_at_10
value: 36.504999999999995
- type: mrr_at_100
value: 37.364999999999995
- type: mrr_at_1000
value: 37.431
- type: mrr_at_3
value: 34.033
- type: mrr_at_5
value: 35.4
- type: ndcg_at_1
value: 28.731
- type: ndcg_at_10
value: 37.788
- type: ndcg_at_100
value: 43.1
- type: ndcg_at_1000
value: 45.623999999999995
- type: ndcg_at_3
value: 32.717
- type: ndcg_at_5
value: 35.024
- type: precision_at_1
value: 28.731
- type: precision_at_10
value: 6.371
- type: precision_at_100
value: 1.02
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 14.521
- type: precision_at_5
value: 10.41
- type: recall_at_1
value: 24.85
- type: recall_at_10
value: 49.335
- type: recall_at_100
value: 72.792
- type: recall_at_1000
value: 90.525
- type: recall_at_3
value: 35.698
- type: recall_at_5
value: 41.385
- type: map_at_1
value: 23.016000000000002
- type: map_at_10
value: 32.126
- type: map_at_100
value: 33.786
- type: map_at_1000
value: 34.012
- type: map_at_3
value: 29.256
- type: map_at_5
value: 30.552
- type: mrr_at_1
value: 27.272999999999996
- type: mrr_at_10
value: 35.967
- type: mrr_at_100
value: 37.082
- type: mrr_at_1000
value: 37.146
- type: mrr_at_3
value: 33.531
- type: mrr_at_5
value: 34.697
- type: ndcg_at_1
value: 27.272999999999996
- type: ndcg_at_10
value: 37.945
- type: ndcg_at_100
value: 43.928
- type: ndcg_at_1000
value: 46.772999999999996
- type: ndcg_at_3
value: 33.111000000000004
- type: ndcg_at_5
value: 34.794000000000004
- type: precision_at_1
value: 27.272999999999996
- type: precision_at_10
value: 7.53
- type: precision_at_100
value: 1.512
- type: precision_at_1000
value: 0.241
- type: precision_at_3
value: 15.547
- type: precision_at_5
value: 11.146
- type: recall_at_1
value: 23.016000000000002
- type: recall_at_10
value: 49.576
- type: recall_at_100
value: 75.74600000000001
- type: recall_at_1000
value: 94.069
- type: recall_at_3
value: 35.964
- type: recall_at_5
value: 40.455999999999996
- type: map_at_1
value: 22.742
- type: map_at_10
value: 29.232000000000003
- type: map_at_100
value: 30.160999999999998
- type: map_at_1000
value: 30.278
- type: map_at_3
value: 27.134999999999998
- type: map_at_5
value: 27.932000000000002
- type: mrr_at_1
value: 24.399
- type: mrr_at_10
value: 31.048
- type: mrr_at_100
value: 31.912000000000003
- type: mrr_at_1000
value: 31.999
- type: mrr_at_3
value: 29.144
- type: mrr_at_5
value: 29.809
- type: ndcg_at_1
value: 24.399
- type: ndcg_at_10
value: 33.354
- type: ndcg_at_100
value: 38.287
- type: ndcg_at_1000
value: 41.105000000000004
- type: ndcg_at_3
value: 29.112
- type: ndcg_at_5
value: 30.379
- type: precision_at_1
value: 24.399
- type: precision_at_10
value: 5.157
- type: precision_at_100
value: 0.828
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 11.892
- type: precision_at_5
value: 8.022
- type: recall_at_1
value: 22.742
- type: recall_at_10
value: 44.31
- type: recall_at_100
value: 67.422
- type: recall_at_1000
value: 88.193
- type: recall_at_3
value: 32.705
- type: recall_at_5
value: 35.669000000000004
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.067
- type: map_at_10
value: 14.821000000000002
- type: map_at_100
value: 16.195
- type: map_at_1000
value: 16.359
- type: map_at_3
value: 12.666
- type: map_at_5
value: 13.675999999999998
- type: mrr_at_1
value: 20.326
- type: mrr_at_10
value: 29.798000000000002
- type: mrr_at_100
value: 30.875000000000004
- type: mrr_at_1000
value: 30.928
- type: mrr_at_3
value: 26.678
- type: mrr_at_5
value: 28.433000000000003
- type: ndcg_at_1
value: 20.326
- type: ndcg_at_10
value: 21.477
- type: ndcg_at_100
value: 27.637
- type: ndcg_at_1000
value: 30.953000000000003
- type: ndcg_at_3
value: 17.456
- type: ndcg_at_5
value: 18.789
- type: precision_at_1
value: 20.326
- type: precision_at_10
value: 6.482
- type: precision_at_100
value: 1.302
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 12.53
- type: precision_at_5
value: 9.603
- type: recall_at_1
value: 9.067
- type: recall_at_10
value: 26.246000000000002
- type: recall_at_100
value: 47.837
- type: recall_at_1000
value: 66.637
- type: recall_at_3
value: 16.468
- type: recall_at_5
value: 20.088
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 7.563000000000001
- type: map_at_10
value: 15.22
- type: map_at_100
value: 20.048
- type: map_at_1000
value: 21.17
- type: map_at_3
value: 11.627
- type: map_at_5
value: 13.239
- type: mrr_at_1
value: 56.25
- type: mrr_at_10
value: 64.846
- type: mrr_at_100
value: 65.405
- type: mrr_at_1000
value: 65.41799999999999
- type: mrr_at_3
value: 63.125
- type: mrr_at_5
value: 64.1
- type: ndcg_at_1
value: 45.0
- type: ndcg_at_10
value: 32.437
- type: ndcg_at_100
value: 35.483
- type: ndcg_at_1000
value: 42.186
- type: ndcg_at_3
value: 37.297000000000004
- type: ndcg_at_5
value: 34.697
- type: precision_at_1
value: 56.25
- type: precision_at_10
value: 25.15
- type: precision_at_100
value: 7.539999999999999
- type: precision_at_1000
value: 1.678
- type: precision_at_3
value: 40.666999999999994
- type: precision_at_5
value: 33.45
- type: recall_at_1
value: 7.563000000000001
- type: recall_at_10
value: 19.969
- type: recall_at_100
value: 40.113
- type: recall_at_1000
value: 61.72299999999999
- type: recall_at_3
value: 12.950999999999999
- type: recall_at_5
value: 15.690999999999999
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 44.675000000000004
- type: f1
value: 40.779372586075105
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 57.406
- type: map_at_10
value: 67.69500000000001
- type: map_at_100
value: 68.08
- type: map_at_1000
value: 68.095
- type: map_at_3
value: 65.688
- type: map_at_5
value: 66.93
- type: mrr_at_1
value: 61.941
- type: mrr_at_10
value: 72.513
- type: mrr_at_100
value: 72.83699999999999
- type: mrr_at_1000
value: 72.844
- type: mrr_at_3
value: 70.60499999999999
- type: mrr_at_5
value: 71.807
- type: ndcg_at_1
value: 61.941
- type: ndcg_at_10
value: 73.29
- type: ndcg_at_100
value: 74.96300000000001
- type: ndcg_at_1000
value: 75.28200000000001
- type: ndcg_at_3
value: 69.491
- type: ndcg_at_5
value: 71.573
- type: precision_at_1
value: 61.941
- type: precision_at_10
value: 9.388
- type: precision_at_100
value: 1.0290000000000001
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 27.423
- type: precision_at_5
value: 17.627000000000002
- type: recall_at_1
value: 57.406
- type: recall_at_10
value: 85.975
- type: recall_at_100
value: 93.29899999999999
- type: recall_at_1000
value: 95.531
- type: recall_at_3
value: 75.624
- type: recall_at_5
value: 80.78999999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.314999999999998
- type: map_at_10
value: 26.678
- type: map_at_100
value: 28.322000000000003
- type: map_at_1000
value: 28.519
- type: map_at_3
value: 23.105
- type: map_at_5
value: 24.808
- type: mrr_at_1
value: 33.333
- type: mrr_at_10
value: 41.453
- type: mrr_at_100
value: 42.339
- type: mrr_at_1000
value: 42.39
- type: mrr_at_3
value: 38.863
- type: mrr_at_5
value: 40.159
- type: ndcg_at_1
value: 33.333
- type: ndcg_at_10
value: 34.062
- type: ndcg_at_100
value: 40.595
- type: ndcg_at_1000
value: 44.124
- type: ndcg_at_3
value: 30.689
- type: ndcg_at_5
value: 31.255
- type: precision_at_1
value: 33.333
- type: precision_at_10
value: 9.722
- type: precision_at_100
value: 1.6480000000000001
- type: precision_at_1000
value: 0.22699999999999998
- type: precision_at_3
value: 20.936
- type: precision_at_5
value: 15.154
- type: recall_at_1
value: 16.314999999999998
- type: recall_at_10
value: 41.221000000000004
- type: recall_at_100
value: 65.857
- type: recall_at_1000
value: 87.327
- type: recall_at_3
value: 27.435
- type: recall_at_5
value: 32.242
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.978
- type: map_at_10
value: 43.784
- type: map_at_100
value: 44.547
- type: map_at_1000
value: 44.614
- type: map_at_3
value: 41.317
- type: map_at_5
value: 42.812
- type: mrr_at_1
value: 63.956999999999994
- type: mrr_at_10
value: 70.502
- type: mrr_at_100
value: 70.845
- type: mrr_at_1000
value: 70.865
- type: mrr_at_3
value: 69.192
- type: mrr_at_5
value: 69.994
- type: ndcg_at_1
value: 63.956999999999994
- type: ndcg_at_10
value: 52.782
- type: ndcg_at_100
value: 55.78999999999999
- type: ndcg_at_1000
value: 57.289
- type: ndcg_at_3
value: 48.864000000000004
- type: ndcg_at_5
value: 50.964
- type: precision_at_1
value: 63.956999999999994
- type: precision_at_10
value: 10.809000000000001
- type: precision_at_100
value: 1.319
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 30.2
- type: precision_at_5
value: 19.787
- type: recall_at_1
value: 31.978
- type: recall_at_10
value: 54.045
- type: recall_at_100
value: 65.928
- type: recall_at_1000
value: 75.976
- type: recall_at_3
value: 45.300000000000004
- type: recall_at_5
value: 49.467
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 63.8708
- type: ap
value: 59.02002684158838
- type: f1
value: 63.650055896985315
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 19.834
- type: map_at_10
value: 31.317
- type: map_at_100
value: 32.576
- type: map_at_1000
value: 32.631
- type: map_at_3
value: 27.728
- type: map_at_5
value: 29.720000000000002
- type: mrr_at_1
value: 20.43
- type: mrr_at_10
value: 31.868999999999996
- type: mrr_at_100
value: 33.074999999999996
- type: mrr_at_1000
value: 33.123999999999995
- type: mrr_at_3
value: 28.333000000000002
- type: mrr_at_5
value: 30.305
- type: ndcg_at_1
value: 20.43
- type: ndcg_at_10
value: 37.769000000000005
- type: ndcg_at_100
value: 43.924
- type: ndcg_at_1000
value: 45.323
- type: ndcg_at_3
value: 30.422
- type: ndcg_at_5
value: 33.98
- type: precision_at_1
value: 20.43
- type: precision_at_10
value: 6.027
- type: precision_at_100
value: 0.9119999999999999
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 12.985
- type: precision_at_5
value: 9.593
- type: recall_at_1
value: 19.834
- type: recall_at_10
value: 57.647000000000006
- type: recall_at_100
value: 86.276
- type: recall_at_1000
value: 97.065
- type: recall_at_3
value: 37.616
- type: recall_at_5
value: 46.171
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 91.52530779753762
- type: f1
value: 91.4004687820246
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 72.82717738258093
- type: f1
value: 56.791387113030346
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.09280430396772
- type: f1
value: 68.92843467363518
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.2542030934768
- type: f1
value: 76.22211319699834
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 29.604407852989457
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 25.011863718751183
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.55552172383111
- type: mrr
value: 32.65475731770242
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.968
- type: map_at_10
value: 10.703999999999999
- type: map_at_100
value: 13.316
- type: map_at_1000
value: 14.674000000000001
- type: map_at_3
value: 7.809000000000001
- type: map_at_5
value: 9.268
- type: mrr_at_1
value: 41.796
- type: mrr_at_10
value: 50.558
- type: mrr_at_100
value: 51.125
- type: mrr_at_1000
value: 51.184
- type: mrr_at_3
value: 48.349
- type: mrr_at_5
value: 49.572
- type: ndcg_at_1
value: 39.783
- type: ndcg_at_10
value: 30.375999999999998
- type: ndcg_at_100
value: 27.648
- type: ndcg_at_1000
value: 36.711
- type: ndcg_at_3
value: 35.053
- type: ndcg_at_5
value: 33.278999999999996
- type: precision_at_1
value: 41.796
- type: precision_at_10
value: 22.663
- type: precision_at_100
value: 7.210999999999999
- type: precision_at_1000
value: 1.984
- type: precision_at_3
value: 33.127
- type: precision_at_5
value: 29.102
- type: recall_at_1
value: 4.968
- type: recall_at_10
value: 14.469999999999999
- type: recall_at_100
value: 28.188000000000002
- type: recall_at_1000
value: 60.769
- type: recall_at_3
value: 8.737
- type: recall_at_5
value: 11.539000000000001
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.958
- type: map_at_10
value: 40.6
- type: map_at_100
value: 41.754000000000005
- type: map_at_1000
value: 41.792
- type: map_at_3
value: 36.521
- type: map_at_5
value: 38.866
- type: mrr_at_1
value: 30.330000000000002
- type: mrr_at_10
value: 43.013
- type: mrr_at_100
value: 43.89
- type: mrr_at_1000
value: 43.917
- type: mrr_at_3
value: 39.489000000000004
- type: mrr_at_5
value: 41.504999999999995
- type: ndcg_at_1
value: 30.330000000000002
- type: ndcg_at_10
value: 47.878
- type: ndcg_at_100
value: 52.761
- type: ndcg_at_1000
value: 53.69500000000001
- type: ndcg_at_3
value: 40.061
- type: ndcg_at_5
value: 43.980000000000004
- type: precision_at_1
value: 30.330000000000002
- type: precision_at_10
value: 8.048
- type: precision_at_100
value: 1.076
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 18.299000000000003
- type: precision_at_5
value: 13.25
- type: recall_at_1
value: 26.958
- type: recall_at_10
value: 67.72399999999999
- type: recall_at_100
value: 89.02600000000001
- type: recall_at_1000
value: 96.029
- type: recall_at_3
value: 47.332
- type: recall_at_5
value: 56.36600000000001
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.926
- type: map_at_10
value: 83.797
- type: map_at_100
value: 84.42699999999999
- type: map_at_1000
value: 84.446
- type: map_at_3
value: 80.78
- type: map_at_5
value: 82.669
- type: mrr_at_1
value: 80.44
- type: mrr_at_10
value: 86.79
- type: mrr_at_100
value: 86.90299999999999
- type: mrr_at_1000
value: 86.904
- type: mrr_at_3
value: 85.753
- type: mrr_at_5
value: 86.478
- type: ndcg_at_1
value: 80.44
- type: ndcg_at_10
value: 87.634
- type: ndcg_at_100
value: 88.9
- type: ndcg_at_1000
value: 89.03
- type: ndcg_at_3
value: 84.622
- type: ndcg_at_5
value: 86.29
- type: precision_at_1
value: 80.44
- type: precision_at_10
value: 13.305
- type: precision_at_100
value: 1.524
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 36.957
- type: precision_at_5
value: 24.328
- type: recall_at_1
value: 69.926
- type: recall_at_10
value: 94.99300000000001
- type: recall_at_100
value: 99.345
- type: recall_at_1000
value: 99.97
- type: recall_at_3
value: 86.465
- type: recall_at_5
value: 91.121
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 42.850644235471144
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 52.547875398320734
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.328
- type: map_at_10
value: 10.479
- type: map_at_100
value: 12.25
- type: map_at_1000
value: 12.522
- type: map_at_3
value: 7.548000000000001
- type: map_at_5
value: 9.039
- type: mrr_at_1
value: 21.3
- type: mrr_at_10
value: 30.678
- type: mrr_at_100
value: 31.77
- type: mrr_at_1000
value: 31.831
- type: mrr_at_3
value: 27.500000000000004
- type: mrr_at_5
value: 29.375
- type: ndcg_at_1
value: 21.3
- type: ndcg_at_10
value: 17.626
- type: ndcg_at_100
value: 25.03
- type: ndcg_at_1000
value: 30.055
- type: ndcg_at_3
value: 16.744999999999997
- type: ndcg_at_5
value: 14.729999999999999
- type: precision_at_1
value: 21.3
- type: precision_at_10
value: 9.09
- type: precision_at_100
value: 1.989
- type: precision_at_1000
value: 0.32
- type: precision_at_3
value: 15.467
- type: precision_at_5
value: 12.879999999999999
- type: recall_at_1
value: 4.328
- type: recall_at_10
value: 18.412
- type: recall_at_100
value: 40.363
- type: recall_at_1000
value: 64.997
- type: recall_at_3
value: 9.408
- type: recall_at_5
value: 13.048000000000002
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.1338589503896
- type: cos_sim_spearman
value: 79.1378154534123
- type: euclidean_pearson
value: 73.17857462509251
- type: euclidean_spearman
value: 70.79268955610539
- type: manhattan_pearson
value: 72.8280251705823
- type: manhattan_spearman
value: 70.60323787229834
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.21604641858598
- type: cos_sim_spearman
value: 75.06080146054282
- type: euclidean_pearson
value: 69.44429285856924
- type: euclidean_spearman
value: 58.240130690046456
- type: manhattan_pearson
value: 69.07597314234852
- type: manhattan_spearman
value: 58.08224335836159
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 80.2252849321165
- type: cos_sim_spearman
value: 80.85907200101076
- type: euclidean_pearson
value: 70.85619832878055
- type: euclidean_spearman
value: 71.59417341887324
- type: manhattan_pearson
value: 70.55842192345895
- type: manhattan_spearman
value: 71.30332994715893
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 80.50469360654135
- type: cos_sim_spearman
value: 76.12917164308409
- type: euclidean_pearson
value: 70.4070213910491
- type: euclidean_spearman
value: 66.97320451942113
- type: manhattan_pearson
value: 70.24834290119863
- type: manhattan_spearman
value: 66.9047074173091
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 84.70140350059746
- type: cos_sim_spearman
value: 85.55427877110485
- type: euclidean_pearson
value: 63.4780453371435
- type: euclidean_spearman
value: 64.65485395077273
- type: manhattan_pearson
value: 63.64869846572011
- type: manhattan_spearman
value: 64.87219311596813
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 79.4416477676503
- type: cos_sim_spearman
value: 81.2094925260351
- type: euclidean_pearson
value: 68.372257553367
- type: euclidean_spearman
value: 69.47792807911692
- type: manhattan_pearson
value: 68.17773583183664
- type: manhattan_spearman
value: 69.31505452732998
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.94688403351994
- type: cos_sim_spearman
value: 88.97626967707933
- type: euclidean_pearson
value: 74.09942728422159
- type: euclidean_spearman
value: 72.91022362666948
- type: manhattan_pearson
value: 74.11262432880199
- type: manhattan_spearman
value: 72.82115894578564
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 67.42605802805606
- type: cos_sim_spearman
value: 66.22330559222408
- type: euclidean_pearson
value: 50.15272876367891
- type: euclidean_spearman
value: 60.695400782452715
- type: manhattan_pearson
value: 50.17076569264417
- type: manhattan_spearman
value: 60.3761281869747
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.85939227596093
- type: cos_sim_spearman
value: 82.57071649593358
- type: euclidean_pearson
value: 72.18291316100125
- type: euclidean_spearman
value: 70.70702024402348
- type: manhattan_pearson
value: 72.36789718833687
- type: manhattan_spearman
value: 70.92789721402387
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.31107201598611
- type: mrr
value: 93.66321314850727
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 45.428000000000004
- type: map_at_10
value: 54.730000000000004
- type: map_at_100
value: 55.421
- type: map_at_1000
value: 55.47299999999999
- type: map_at_3
value: 52.333
- type: map_at_5
value: 53.72
- type: mrr_at_1
value: 48.333
- type: mrr_at_10
value: 56.601
- type: mrr_at_100
value: 57.106
- type: mrr_at_1000
value: 57.154
- type: mrr_at_3
value: 54.611
- type: mrr_at_5
value: 55.87800000000001
- type: ndcg_at_1
value: 48.333
- type: ndcg_at_10
value: 59.394999999999996
- type: ndcg_at_100
value: 62.549
- type: ndcg_at_1000
value: 63.941
- type: ndcg_at_3
value: 55.096000000000004
- type: ndcg_at_5
value: 57.325
- type: precision_at_1
value: 48.333
- type: precision_at_10
value: 8.1
- type: precision_at_100
value: 0.983
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 21.889
- type: precision_at_5
value: 14.533
- type: recall_at_1
value: 45.428000000000004
- type: recall_at_10
value: 71.806
- type: recall_at_100
value: 86.533
- type: recall_at_1000
value: 97.5
- type: recall_at_3
value: 60.228
- type: recall_at_5
value: 65.90599999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.8029702970297
- type: cos_sim_ap
value: 95.48085242816634
- type: cos_sim_f1
value: 89.86653484923382
- type: cos_sim_precision
value: 88.85630498533725
- type: cos_sim_recall
value: 90.9
- type: dot_accuracy
value: 99.21881188118812
- type: dot_ap
value: 55.14126603018576
- type: dot_f1
value: 55.22458628841608
- type: dot_precision
value: 52.37668161434977
- type: dot_recall
value: 58.4
- type: euclidean_accuracy
value: 99.64356435643565
- type: euclidean_ap
value: 84.52487064474103
- type: euclidean_f1
value: 80.53908355795149
- type: euclidean_precision
value: 87.36842105263159
- type: euclidean_recall
value: 74.7
- type: manhattan_accuracy
value: 99.63861386138613
- type: manhattan_ap
value: 84.1994288662172
- type: manhattan_f1
value: 80.38482095136291
- type: manhattan_precision
value: 86.33754305396096
- type: manhattan_recall
value: 75.2
- type: max_accuracy
value: 99.8029702970297
- type: max_ap
value: 95.48085242816634
- type: max_f1
value: 89.86653484923382
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 48.06508273111389
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 31.36169910951664
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 50.110601218420356
- type: mrr
value: 50.90277777777777
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.63669555287747
- type: cos_sim_spearman
value: 30.708042454053853
- type: dot_pearson
value: 20.309025749838924
- type: dot_spearman
value: 21.511758746817165
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.201
- type: map_at_10
value: 1.405
- type: map_at_100
value: 7.359999999999999
- type: map_at_1000
value: 17.858
- type: map_at_3
value: 0.494
- type: map_at_5
value: 0.757
- type: mrr_at_1
value: 74.0
- type: mrr_at_10
value: 84.89999999999999
- type: mrr_at_100
value: 84.89999999999999
- type: mrr_at_1000
value: 84.89999999999999
- type: mrr_at_3
value: 84.0
- type: mrr_at_5
value: 84.89999999999999
- type: ndcg_at_1
value: 68.0
- type: ndcg_at_10
value: 60.571
- type: ndcg_at_100
value: 46.016
- type: ndcg_at_1000
value: 41.277
- type: ndcg_at_3
value: 63.989
- type: ndcg_at_5
value: 61.41
- type: precision_at_1
value: 74.0
- type: precision_at_10
value: 65.2
- type: precision_at_100
value: 47.04
- type: precision_at_1000
value: 18.416
- type: precision_at_3
value: 68.0
- type: precision_at_5
value: 66.4
- type: recall_at_1
value: 0.201
- type: recall_at_10
value: 1.763
- type: recall_at_100
value: 11.008999999999999
- type: recall_at_1000
value: 38.509
- type: recall_at_3
value: 0.551
- type: recall_at_5
value: 0.881
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.4040000000000001
- type: map_at_10
value: 7.847999999999999
- type: map_at_100
value: 12.908
- type: map_at_1000
value: 14.37
- type: map_at_3
value: 3.6450000000000005
- type: map_at_5
value: 4.93
- type: mrr_at_1
value: 18.367
- type: mrr_at_10
value: 32.576
- type: mrr_at_100
value: 34.163
- type: mrr_at_1000
value: 34.18
- type: mrr_at_3
value: 28.571
- type: mrr_at_5
value: 30.918
- type: ndcg_at_1
value: 15.306000000000001
- type: ndcg_at_10
value: 18.59
- type: ndcg_at_100
value: 30.394
- type: ndcg_at_1000
value: 42.198
- type: ndcg_at_3
value: 18.099
- type: ndcg_at_5
value: 16.955000000000002
- type: precision_at_1
value: 16.326999999999998
- type: precision_at_10
value: 17.959
- type: precision_at_100
value: 6.755
- type: precision_at_1000
value: 1.4529999999999998
- type: precision_at_3
value: 20.408
- type: precision_at_5
value: 18.367
- type: recall_at_1
value: 1.4040000000000001
- type: recall_at_10
value: 14.048
- type: recall_at_100
value: 42.150999999999996
- type: recall_at_1000
value: 77.85600000000001
- type: recall_at_3
value: 4.819
- type: recall_at_5
value: 7.13
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 66.1456
- type: ap
value: 11.631023858569064
- type: f1
value: 50.128196455722254
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 56.850594227504246
- type: f1
value: 56.82313689360827
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 38.060423744064764
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 84.43702688204088
- type: cos_sim_ap
value: 68.30176948820142
- type: cos_sim_f1
value: 64.25430330443524
- type: cos_sim_precision
value: 61.33365315423362
- type: cos_sim_recall
value: 67.46701846965699
- type: dot_accuracy
value: 77.76718126005842
- type: dot_ap
value: 37.510516716176305
- type: dot_f1
value: 43.53859496964441
- type: dot_precision
value: 32.428940568475454
- type: dot_recall
value: 66.2269129287599
- type: euclidean_accuracy
value: 82.10049472492102
- type: euclidean_ap
value: 61.64354520687271
- type: euclidean_f1
value: 59.804144841721694
- type: euclidean_precision
value: 52.604166666666664
- type: euclidean_recall
value: 69.28759894459104
- type: manhattan_accuracy
value: 82.22566609048101
- type: manhattan_ap
value: 61.753431124879974
- type: manhattan_f1
value: 59.77735297424941
- type: manhattan_precision
value: 52.0870076425632
- type: manhattan_recall
value: 70.13192612137203
- type: max_accuracy
value: 84.43702688204088
- type: max_ap
value: 68.30176948820142
- type: max_f1
value: 64.25430330443524
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.81515116233942
- type: cos_sim_ap
value: 85.33305785100573
- type: cos_sim_f1
value: 78.11202938475667
- type: cos_sim_precision
value: 74.68567816253424
- type: cos_sim_recall
value: 81.86787804126887
- type: dot_accuracy
value: 82.50475414289595
- type: dot_ap
value: 69.87015340174045
- type: dot_f1
value: 65.94174480373633
- type: dot_precision
value: 61.40362525728703
- type: dot_recall
value: 71.20418848167539
- type: euclidean_accuracy
value: 83.05778709201692
- type: euclidean_ap
value: 70.54206653977498
- type: euclidean_f1
value: 62.98969847356943
- type: euclidean_precision
value: 61.55033063923585
- type: euclidean_recall
value: 64.49799815214044
- type: manhattan_accuracy
value: 83.0034540303489
- type: manhattan_ap
value: 70.53997987198404
- type: manhattan_f1
value: 62.95875898600075
- type: manhattan_precision
value: 61.89555125725339
- type: manhattan_recall
value: 64.05913150600554
- type: max_accuracy
value: 88.81515116233942
- type: max_ap
value: 85.33305785100573
- type: max_f1
value: 78.11202938475667
---
<br><br>
<p align="center">
<img src="https://huggingface.co/datasets/jinaai/documentation-images/resolve/main/logo.webp" alt="Jina AI: Your Search Foundation, Supercharged!" width="150px">
</p>
<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a></b>
</p>
## Intended Usage & Model Info
`jina-embedding-b-en-v1` is a language model trained on Jina AI's Linnaeus-Clean dataset.
This dataset consists of 380 million sentence pairs, including query-document pairs,
drawn from a variety of domains and carefully selected through a thorough cleaning process.
The Linnaeus-Full dataset, from which the Linnaeus-Clean dataset is derived, originally contained 1.6 billion sentence pairs.
The model has a range of use cases, including information retrieval, semantic textual similarity, text reranking, and more.
With a standard size of 110 million parameters,
the model enables fast inference while delivering better performance than our small model.
It is recommended to use a single GPU for inference.
Additionally, we provide the following options:
- [`jina-embedding-t-en-v1`](https://huggingface.co/jinaai/jina-embedding-t-en-v1): 14 million parameters.
- [`jina-embedding-s-en-v1`](https://huggingface.co/jinaai/jina-embedding-s-en-v1): 35 million parameters.
- [`jina-embedding-b-en-v1`](https://huggingface.co/jinaai/jina-embedding-b-en-v1): 110 million parameters **(you are here)**.
- [`jina-embedding-l-en-v1`](https://huggingface.co/jinaai/jina-embedding-l-en-v1): 330 million parameters.
- `jina-embedding-1b-en-v1`: 1.2 billion parameters, 10 times bert-base (soon).
- `jina-embedding-6b-en-v1`: 6 billion parameters, 30 times bert-base (soon).
## Data & Parameters
Please check out our [technical report](https://arxiv.org/abs/2307.11224) for details on the training data and parameters.
## Metrics
We compared the model against `all-minilm-l6-v2` and `all-mpnet-base-v2` from Sentence-Transformers (SBERT) and `text-embedding-ada-002` from OpenAI:
|Name|Parameters|Dimension|
|------------------------------|-----|------|
|all-minilm-l6-v2|23m |384|
|all-mpnet-base-v2 |110m |768|
|ada-embedding-002|Unknown/OpenAI API |1536|
|jina-embedding-t-en-v1|14m |312|
|jina-embedding-s-en-v1|35m |512|
|jina-embedding-b-en-v1|110m |768|
|jina-embedding-l-en-v1|330m |1024|
|Name|STS12|STS13|STS14|STS15|STS16|STS17|TRECCOVID|Quora|SciFact|
|------------------------------|-----|-----|-----|-----|-----|-----|--------|-----|-----|
|all-minilm-l6-v2|0.724|0.806|0.756|0.854|0.79 |0.876|0.473 |0.876|0.645 |
|all-mpnet-base-v2|0.726|**0.835**|0.78 |0.857|0.8 |**0.906**|0.513 |0.875|0.656 |
|ada-embedding-002|0.698|0.833|0.761|0.861|**0.86** |0.903|**0.685** |0.876|**0.726** |
|jina-embedding-t-en-v1|0.717|0.773|0.731|0.829|0.777|0.860|0.482 |0.840|0.522 |
|jina-embedding-s-en-v1|0.743|0.786|0.738|0.837|0.80|0.875|0.523 |0.857|0.524 |
|jina-embedding-b-en-v1|**0.751**|0.809|0.761|0.856|0.812|0.890|0.606 |0.876|0.594 |
|jina-embedding-l-en-v1|0.745|0.832|**0.781**|**0.869**|0.837|0.902|0.573 |**0.881**|0.598 |
## Usage
Usage with Jina AI Finetuner:
```python
# pip install finetuner
import finetuner

# Build the embedding model and encode a pair of sentences.
model = finetuner.build_model('jinaai/jina-embedding-b-en-v1')
embeddings = finetuner.encode(
    model=model,
    data=['how is the weather today', 'What is the current weather like today?']
)
print(finetuner.cos_sim(embeddings[0], embeddings[1]))
```
Use with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim
sentences = ['how is the weather today', 'What is the current weather like today?']
model = SentenceTransformer('jinaai/jina-embedding-b-en-v1')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```
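The card recommends a single GPU for inference and lists information retrieval among the use cases. The snippet below is a minimal sketch (not from the original card) of batch-encoding a small corpus and running a top-k search with `sentence_transformers.util.semantic_search`; the corpus and query strings are placeholders chosen only for illustration.
```python
import torch
from sentence_transformers import SentenceTransformer, util

# Hypothetical corpus and query, used only to illustrate the retrieval flow.
corpus = [
    "It is sunny and warm in Berlin today.",
    "The quarterly report shows revenue growth.",
    "Heavy rain is expected over the weekend.",
]
query = "what is the weather like today"

# Place the model on a single GPU when available, as recommended above.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = SentenceTransformer("jinaai/jina-embedding-b-en-v1", device=device)

# Encode corpus and query as tensors so the search runs on the same device.
corpus_embeddings = model.encode(corpus, convert_to_tensor=True, batch_size=32)
query_embedding = model.encode(query, convert_to_tensor=True)

# Retrieve the two most similar corpus entries by cosine similarity.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], hit["score"])
```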
## Fine-tuning
Please consider using [Finetuner](https://github.com/jina-ai/finetuner) to fine-tune the model on your own data.
## Plans
1. The development of `jina-embedding-s-en-v2` is currently underway with two main objectives: improving performance and increasing the maximum sequence length.
2. We are currently working on bilingual embedding models that combine English with a second language, starting with German. The upcoming models will be called `jina-embedding-s/b/l-de-v1`.
## Contact
Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
## Citation
If you find Jina Embeddings useful in your research, please cite the following paper:
```bibtex
@misc{günther2023jina,
title={Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models},
author={Michael Günther and Louis Milliken and Jonathan Geuter and Georgios Mastrapas and Bo Wang and Han Xiao},
year={2023},
eprint={2307.11224},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"LINNAEUS",
"SCIFACT"
] |
lgaalves/gpt1 | lgaalves | text-generation | [
"transformers",
"pytorch",
"safetensors",
"openai-gpt",
"text-generation",
"en",
"arxiv:1705.11168",
"arxiv:1803.02324",
"arxiv:1910.09700",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-09-25T14:34:55 | 2023-11-21T17:05:38 | 2,049 | 3 | ---
language: en
license: mit
---
# OpenAI GPT
## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)
## Model Details
**Model Description:** `openai-gpt` is a transformer-based language model created and released by OpenAI. The model is a causal (unidirectional) transformer pre-trained using language modeling on a large corpus with long range dependencies.
- **Developed by:** Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever. See [associated research paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) and [GitHub repo](https://github.com/openai/finetune-transformer-lm) for model developers and contributors.
- **Model Type:** Transformer-based language model
- **Language(s):** English
- **License:** [MIT License](https://github.com/openai/finetune-transformer-lm/blob/master/LICENSE)
- **Resources for more information:**
- [Research Paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf)
- [OpenAI Blog Post](https://openai.com/blog/language-unsupervised/)
- [GitHub Repo](https://github.com/openai/finetune-transformer-lm)
- Test the full generation capabilities here: https://transformer.huggingface.co/doc/gpt
## How to Get Started with the Model
Use the code below to get started with the model. You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we
set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='lgaalves/gpt1')
>>> set_seed(42)
>>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)
[{'generated_text': "Hello, I'm a language model,'he said, when i was finished.'ah well,'said the man,'that's"},
{'generated_text': 'Hello, I\'m a language model, " she said. \n she reached the bottom of the shaft and leaned a little further out. it was'},
{'generated_text': 'Hello, I\'m a language model, " she laughed. " we call that a\'white girl.\'or as we are called by the'},
{'generated_text': 'Hello, I\'m a language model, " said mr pin. " an\'the ones with the funny hats don\'t. " the rest of'},
{'generated_text': 'Hello, I\'m a language model, was\'ere \'bout to do some more dancin \', " he said, then his voice lowered to'}]
```
Here is how to use this model in PyTorch:
```python
from transformers import OpenAIGPTTokenizer, OpenAIGPTModel
import torch
tokenizer = OpenAIGPTTokenizer.from_pretrained("lgaalves/gpt1")
model = OpenAIGPTModel.from_pretrained("lgaalves/gpt1")
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```
and in TensorFlow:
```python
from transformers import OpenAIGPTTokenizer, TFOpenAIGPTModel
tokenizer = OpenAIGPTTokenizer.from_pretrained("lgaalves/gpt1")
model = TFOpenAIGPTModel.from_pretrained("lgaalves/gpt1")
inputs = tokenizer("Hello, my dog is cute", return_tensors="tf")
outputs = model(inputs)
last_hidden_states = outputs.last_hidden_state
```
## Uses
#### Direct Use
This model can be used for language modeling tasks.
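As a hedged illustration (not from the original card), the language-modeling head can also be used to score text; the sketch below assumes the `lgaalves/gpt1` checkpoint ships LM-head weights, as the upstream `openai-gpt` weights do:
```python
import torch
from transformers import OpenAIGPTTokenizer, OpenAIGPTLMHeadModel

tokenizer = OpenAIGPTTokenizer.from_pretrained("lgaalves/gpt1")
model = OpenAIGPTLMHeadModel.from_pretrained("lgaalves/gpt1")
model.eval()

text = "the quick brown fox jumps over the lazy dog"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing the input ids as labels makes the model return the average
    # next-token cross-entropy loss over the sequence.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity: {torch.exp(loss).item():.2f}")
```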
#### Downstream Use
Potential downstream uses of this model include tasks that leverage language models. In the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf), the model developers discuss evaluations of the model for tasks including natural language inference (NLI), question answering, semantic similarity, and text classification.
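As an illustrative sketch only (not from the original card), such downstream tasks are typically handled today by attaching a task-specific head in `transformers`; the classification head below is randomly initialized and only becomes useful after fine-tuning, and `num_labels=2` is an arbitrary choice:
```python
from transformers import OpenAIGPTTokenizer, OpenAIGPTForSequenceClassification

tokenizer = OpenAIGPTTokenizer.from_pretrained("lgaalves/gpt1")
# The classification head is freshly initialized; it must be fine-tuned on a
# labeled dataset (e.g. a binary sentiment task) before its outputs mean anything.
model = OpenAIGPTForSequenceClassification.from_pretrained("lgaalves/gpt1", num_labels=2)

inputs = tokenizer("the movie was surprisingly good", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2])
```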
#### Misuse and Out-of-scope Use
The model was not trained to produce factual or true representations of people or events, and therefore using the model to generate such content is out of scope for its abilities.
## Risks, Limitations and Biases
#### Biases
**CONTENT WARNING: Readers should be aware that language generated by this model can be disturbing or offensive to some and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
Predictions generated by this model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='lgaalves/gpt1')
>>> set_seed(42)
>>> generator("The man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The man worked as a teacher for the college he'},
{'generated_text': 'The man worked as a janitor at the club.'},
{'generated_text': 'The man worked as a bodyguard in america. the'},
{'generated_text': 'The man worked as a clerk for one of the'},
{'generated_text': 'The man worked as a nurse, but there was'}]
>>> set_seed(42)
>>> generator("The woman worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The woman worked as a medical intern but is a'},
{'generated_text': 'The woman worked as a midwife, i know that'},
{'generated_text': 'The woman worked as a prostitute in a sex club'},
{'generated_text': 'The woman worked as a secretary for one of the'},
{'generated_text': 'The woman worked as a nurse, but she had'}]
```
This bias may also affect fine-tuned versions of this model. Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
#### Risks and Limitations
The model developers also wrote in a [blog post](https://openai.com/blog/language-unsupervised/) about risks and limitations of the model, including:
> - **Compute Requirements:** Many previous approaches to NLP tasks train relatively small models on a single GPU from scratch. Our approach requires an expensive pre-training step - 1 month on 8 GPUs. Luckily, this only has to be done once and we’re releasing our model so others can avoid it. It is also a large model (in comparison to prior work) and consequently uses more compute and memory — we used a 37-layer (12 block) Transformer architecture, and we train on sequences of up to 512 tokens. Most experiments were conducted on 4 and 8 GPU systems. The model does fine-tune to new tasks very quickly which helps mitigate the additional resource requirements.
> - **The limits and bias of learning about the world through text:** Books and text readily available on the internet do not contain complete or even accurate information about the world. Recent work ([Lucy and Gauthier, 2017](https://arxiv.org/abs/1705.11168)) has shown that certain kinds of information are difficult to learn via just text and other work ([Gururangan et al., 2018](https://arxiv.org/abs/1803.02324)) has shown that models learn and exploit biases in data distributions.
> - **Still brittle generalization:** Although our approach improves performance across a broad range of tasks, current deep learning NLP models still exhibit surprising and counterintuitive behavior - especially when evaluated in a systematic, adversarial, or out-of-distribution way. Our approach is not immune to these issues, though we have observed some indications of progress. Our approach shows improved lexical robustness over previous purely neural approaches to textual entailment. On the dataset introduced in Glockner et al. (2018) our model achieves 83.75%, performing similarly to KIM, which incorporates external knowledge via WordNet.
## Training
#### Training Data
The model developers [write](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf):
> We use the BooksCorpus dataset ([Zhu et al., 2015](https://www.cv-foundation.org/openaccess/content_iccv_2015/papers/Zhu_Aligning_Books_and_ICCV_2015_paper.pdf)) for training the language model. It contains over 7,000 unique unpublished books from a variety of genres including Adventure, Fantasy, and Romance. Crucially, it contains long stretches of contiguous text, which allows the generative model to learn to condition on long-range information.
#### Training Procedure
The model developers [write](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf):
> Our model largely follows the original transformer work [62]. We trained a 12-layer decoder-only transformer with masked self-attention heads (768 dimensional states and 12 attention heads). For the position-wise feed-forward networks, we used 3072 dimensional inner states. We used the Adam optimization scheme [27] with a max learning rate of 2.5e-4. The learning rate was increased linearly from zero over the first 2000 updates and annealed to 0 using a cosine schedule. We train for 100 epochs on minibatches of 64 randomly sampled, contiguous sequences of 512 tokens. Since layernorm [2] is used extensively throughout the model, a simple weight initialization of N (0, 0.02) was sufficient. We used a bytepair encoding (BPE) vocabulary with 40,000 merges [53] and residual, embedding, and attention dropouts with a rate of 0.1 for regularization. We also employed a modified version of L2 regularization proposed in [37], with w = 0.01 on all non bias or gain weights. For the activation function, we used the Gaussian Error Linear Unit (GELU) [18]. We used learned position embeddings instead of the sinusoidal version proposed in the original work. We use the ftfy library2 to clean the raw text in BooksCorpus, standardize some punctuation and whitespace, and use the spaCy tokenizer.
See the paper for further details and links to citations.
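For reference, a sketch (not from the paper) of how the described architecture maps onto the Hugging Face `OpenAIGPTConfig`; these values mirror the quote above and the released checkpoint's 40,478-token vocabulary:
```python
from transformers import OpenAIGPTConfig, OpenAIGPTLMHeadModel

config = OpenAIGPTConfig(
    vocab_size=40478,        # 40,000 BPE merges plus base symbols in the released vocab
    n_positions=512,         # sequences of up to 512 tokens
    n_embd=768,              # 768-dimensional states
    n_layer=12,              # 12-layer decoder-only transformer
    n_head=12,               # 12 attention heads
    afn="gelu",              # GELU activation
    resid_pdrop=0.1,         # residual dropout
    embd_pdrop=0.1,          # embedding dropout
    attn_pdrop=0.1,          # attention dropout
    initializer_range=0.02,  # N(0, 0.02) weight initialization
)
# The 3072-dimensional feed-forward inner states correspond to the fixed 4 * n_embd.
model = OpenAIGPTLMHeadModel(config)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.0f}M parameters")
```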
## Evaluation
The following evaluation information is extracted from the [associated blog post](https://openai.com/blog/language-unsupervised/). See the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) for further details.
#### Testing Data, Factors and Metrics
The model developers report that the model was evaluated on the following tasks and datasets using the listed metrics:
- **Task:** Textual Entailment
- **Datasets:** [SNLI](https://huggingface.co/datasets/snli), [MNLI Matched](https://huggingface.co/datasets/glue), [MNLI Mismatched](https://huggingface.co/datasets/glue), [SciTail](https://huggingface.co/datasets/scitail), [QNLI](https://huggingface.co/datasets/glue), [RTE](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Semantic Similarity
- **Datasets:** [STS-B](https://huggingface.co/datasets/glue), [QQP](https://huggingface.co/datasets/glue), [MRPC](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Reading Comprehension
- **Datasets:** [RACE](https://huggingface.co/datasets/race)
- **Metrics:** Accuracy
- **Task:** Commonsense Reasoning
- **Datasets:** [ROCStories](https://huggingface.co/datasets/story_cloze), [COPA](https://huggingface.co/datasets/xcopa)
- **Metrics:** Accuracy
- **Task:** Sentiment Analysis
- **Datasets:** [SST-2](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Linguistic Acceptability
- **Datasets:** [CoLA](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Multi Task Benchmark
- **Datasets:** [GLUE](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
#### Results
The model developers report the following results after task-specific fine-tuning:
| Task | TE | TE | TE |TE | TE | TE | SS | SS | SS | RC | CR | CR | SA | LA | MTB |
|:--------:|:--:|:----------:|:-------------:|:-----:|:----:|:---:|:---:|:---:|:--:|:----:|:--------:|:----:|:----:|:----:|:----:|
| Dataset |SNLI|MNLI Matched|MNLI Mismatched|SciTail| QNLI | RTE |STS-B| QQP |MRPC|RACE |ROCStories|COPA | SST-2| CoLA | GLUE |
| Result |89.9| 82.1 | 81.4 |88.3 | 88.1 | 56.0|82.0 | 70.3|82.3|59.0 | 86.5 | 78.6 | 91.3 | 45.4 | 72.8 |
## Environmental Impact
The model developers [report that](https://openai.com/blog/language-unsupervised/):
> The total compute used to train this model was 0.96 petaflop days (pfs-days).
> 8 P600 GPU's * 30 days * 12 TFLOPS/GPU * 0.33 utilization = .96 pfs-days
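As a quick sanity check of the arithmetic in that quote (not part of the original post), one petaflop/s-day is 10^15 operations per second sustained for a day:
```python
# Reproduce the pfs-days estimate quoted above.
gpus = 8
days = 30
tflops_per_gpu = 12      # peak throughput per GPU, in 10^12 FLOP/s
utilization = 0.33

sustained_flops = gpus * tflops_per_gpu * 1e12 * utilization  # FLOP/s actually achieved
pfs_days = sustained_flops / 1e15 * days                      # petaflop/s-days
print(round(pfs_days, 2))  # ~0.95, in line with the quoted ~0.96
```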
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** 8 P600 GPUs
- **Hours used:** 720 hours (30 days)
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown
## Technical Specifications
See the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) for details on the modeling architecture, objective, compute infrastructure, and training details.
## Citation Information
```bibtex
@article{radford2018improving,
title={Improving language understanding by generative pre-training},
author={Radford, Alec and Narasimhan, Karthik and Salimans, Tim and Sutskever, Ilya and others},
year={2018},
publisher={OpenAI}
}
```
APA:
*Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving language understanding by generative pre-training.*
## Model Card Authors
This model card was written by the Hugging Face team. | [
"TEXT_CLASSIFICATION",
"QUESTION_ANSWERING",
"SEMANTIC_SIMILARITY",
"TEXTUAL_ENTAILMENT"
] | [
"SCITAIL"
] |
fblgit/UNAversal-8x7B-v1beta | fblgit | text-generation | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"UNA",
"juanako",
"MoE",
"conversational",
"en",
"license:cc-by-nc-sa-4.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2023-12-26T15:58:15 | 2024-03-08T10:28:21 | 2,005 | 8 | ---
language:
- en
library_name: transformers
license: cc-by-nc-sa-4.0
tags:
- UNA
- juanako
- mixtral
- MoE
model-index:
- name: UNAversal-8x7B-v1beta
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 69.8
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 86.9
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 70.39
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 71.97
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 82.0
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 61.64
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNAversal-8x7B-v1beta
name: Open LLM Leaderboard
---
# UNAversal - Uniform Neural Alignment (MoE)
This is just a beta, a first release, so people can start working on Frankensteins (merges) and so on.
It does achieve high GSM/Math and TruthfulQA scores, so ideally you can merge it with other Mixtrals and see what comes out of it.
Based on [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1)
## UNA Details
For this model we went with the most obvious choice: placing UNA on the router_logit. It does work, and we saw much better SFT performance by doing so.
So this model DOES have a UNA-SFT phase; it is highly experimental and was trained merely on LLaMA-Factory example datasets such as Alpaca.
As with the others:
- Can be fine-tuned further; try a learning rate of 2e-5 or **1e-4 (since it's a MoE)**
- Can be merged; here you will have to improvise, and please report findings in a discussion thread (a minimal loading sketch follows this list).
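A minimal loading and prompting sketch with `transformers`, assuming the checkpoint ships the standard Mixtral-Instruct chat template; bfloat16 and `device_map="auto"` are convenient defaults rather than requirements:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fblgit/UNAversal-8x7B-v1beta"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # adjust dtype/sharding to your hardware
    device_map="auto",
)

messages = [{"role": "user", "content": "Solve 23 * 17 step by step."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```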
**REMINDER**: please cite; it does help the research and the lab itself, seriously.
## NEED YOUR HELP!!
I need a multi-turn training loop for Mixtral that can properly squeeze the juice out of 8x H100s. Please feel free to reach @fblgit on either Discord or Twitter. Thanks!
# Evals
Here are some, but we also submitted it to the HF eval queue.
## GSM8k 5-Shot
```
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6603|± | 0.013|
```
## ARC 25-Shot
```
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|-------------|-------|------|-----:|--------|-----:|---|-----:|
|arc_challenge|Yaml |none | 25|acc |0.6621|± |0.0138|
| | |none | 25|acc_norm|0.6962|± |0.0134|
```
## TruthfulQA 0-Shot (MC2)
```
| Tasks |Version|Filter|n-shot|Metric|Value | |Stderr|
|--------------|-------|------|-----:|------|-----:|---|-----:|
|truthfulqa_mc2|Yaml |none | 0|acc |0.7122|± |0.0141|
```
## 0-Shots Evals
```
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|--------------|-------|------|-----:|----------|-----:|---|-----:|
|arc_challenge |Yaml |none | 0|acc |0.6101|± |0.0143|
| | |none | 0|acc_norm |0.6425|± |0.0140|
|arc_easy |Yaml |none | 0|acc |0.8615|± |0.0071|
| | |none | 0|acc_norm |0.8375|± |0.0076|
|boolq |Yaml |none | 0|acc |0.8624|± |0.0060|
|lambada_openai|Yaml |none | 0|perplexity|2.8318|± |0.0507|
| | |none | 0|acc |0.7650|± |0.0059|
|mathqa |Yaml |none | 0|acc |0.4472|± |0.0091|
| | |none | 0|acc_norm |0.4436|± |0.0091|
|piqa |Yaml |none | 0|acc |0.8292|± |0.0088|
| | |none | 0|acc_norm |0.8422|± |0.0085|
|pubmedqa |Yaml |none | 0|acc |0.7920|± |0.0182|
|sciq |Yaml |none | 0|acc |0.9630|± |0.0060|
| | |none | 0|acc_norm |0.9370|± |0.0077|
```
## BBH at 0-Shot
```
vllm (pretrained=fblgit/UNAversal-8x7B-v1beta,tensor_parallel_size=2,data_parallel_size=4,gpu_memory_utilization=0.8,dtype=float16), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: auto
| Tasks |Version| Filter |n-shot| Metric |Value | |Stderr|
|----------------------------------------------------------|-------|----------|-----:|-----------|-----:|---|-----:|
|bbh |N/A |get-answer| 0|exact_match|0.6752|± |0.1772|
| - bbh_cot_fewshot_boolean_expressions |Yaml |get-answer| 0|exact_match|0.8840|± |0.0203|
| - bbh_cot_fewshot_causal_judgement |Yaml |get-answer| 0|exact_match|0.6417|± |0.0352|
| - bbh_cot_fewshot_date_understanding |Yaml |get-answer| 0|exact_match|0.7600|± |0.0271|
| - bbh_cot_fewshot_disambiguation_qa |Yaml |get-answer| 0|exact_match|0.7160|± |0.0286|
| - bbh_cot_fewshot_dyck_languages |Yaml |get-answer| 0|exact_match|0.1800|± |0.0243|
| - bbh_cot_fewshot_formal_fallacies |Yaml |get-answer| 0|exact_match|0.6520|± |0.0302|
| - bbh_cot_fewshot_geometric_shapes |Yaml |get-answer| 0|exact_match|0.3880|± |0.0309|
| - bbh_cot_fewshot_hyperbaton |Yaml |get-answer| 0|exact_match|0.9600|± |0.0124|
| - bbh_cot_fewshot_logical_deduction_five_objects |Yaml |get-answer| 0|exact_match|0.5360|± |0.0316|
| - bbh_cot_fewshot_logical_deduction_seven_objects |Yaml |get-answer| 0|exact_match|0.5040|± |0.0317|
| - bbh_cot_fewshot_logical_deduction_three_objects |Yaml |get-answer| 0|exact_match|0.8600|± |0.0220|
| - bbh_cot_fewshot_movie_recommendation |Yaml |get-answer| 0|exact_match|0.7840|± |0.0261|
| - bbh_cot_fewshot_multistep_arithmetic_two |Yaml |get-answer| 0|exact_match|0.6600|± |0.0300|
| - bbh_cot_fewshot_navigate |Yaml |get-answer| 0|exact_match|0.8160|± |0.0246|
| - bbh_cot_fewshot_object_counting |Yaml |get-answer| 0|exact_match|0.8360|± |0.0235|
| - bbh_cot_fewshot_penguins_in_a_table |Yaml |get-answer| 0|exact_match|0.7329|± |0.0367|
| - bbh_cot_fewshot_reasoning_about_colored_objects |Yaml |get-answer| 0|exact_match|0.8120|± |0.0248|
| - bbh_cot_fewshot_ruin_names |Yaml |get-answer| 0|exact_match|0.4440|± |0.0315|
| - bbh_cot_fewshot_salient_translation_error_detection |Yaml |get-answer| 0|exact_match|0.5200|± |0.0317|
| - bbh_cot_fewshot_snarks |Yaml |get-answer| 0|exact_match|0.7135|± |0.0340|
| - bbh_cot_fewshot_sports_understanding |Yaml |get-answer| 0|exact_match|0.9400|± |0.0151|
| - bbh_cot_fewshot_temporal_sequences |Yaml |get-answer| 0|exact_match|0.7560|± |0.0272|
| - bbh_cot_fewshot_tracking_shuffled_objects_five_objects |Yaml |get-answer| 0|exact_match|0.5680|± |0.0314|
| - bbh_cot_fewshot_tracking_shuffled_objects_seven_objects|Yaml |get-answer| 0|exact_match|0.6280|± |0.0306|
| - bbh_cot_fewshot_tracking_shuffled_objects_three_objects|Yaml |get-answer| 0|exact_match|0.6280|± |0.0306|
| - bbh_cot_fewshot_web_of_lies |Yaml |get-answer| 0|exact_match|0.9560|± |0.0130|
| - bbh_cot_fewshot_word_sorting |Yaml |get-answer| 0|exact_match|0.3800|± |0.0308|
|Groups|Version| Filter |n-shot| Metric |Value | |Stderr|
|------|-------|----------|-----:|-----------|-----:|---|-----:|
|bbh |N/A |get-answer| 0|exact_match|0.6752|± |0.1772|
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNAversal-8x7B-v1beta)
| Metric |Value|
|---------------------------------|----:|
|Avg. |73.78|
|AI2 Reasoning Challenge (25-Shot)|69.80|
|HellaSwag (10-Shot) |86.90|
|MMLU (5-Shot) |70.39|
|TruthfulQA (0-shot) |71.97|
|Winogrande (5-shot) |82.00|
|GSM8k (5-shot) |61.64|
| [
"TRANSLATION"
] | [
"PUBMEDQA",
"SCIQ"
] |
mlx-community/multilingual-e5-large-mlx | mlx-community | feature-extraction | [
"sentence-transformers",
"xlm-roberta",
"mteb",
"Sentence Transformers",
"sentence-similarity",
"feature-extraction",
"mlx",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-01-11T12:15:07 | 2024-01-11T12:16:29 | 1,978 | 3 | ---
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- 'no'
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
license: mit
tags:
- mteb
- Sentence Transformers
- sentence-similarity
- feature-extraction
- sentence-transformers
- mlx
model-index:
- name: multilingual-e5-large
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 79.05970149253731
- type: ap
value: 43.486574390835635
- type: f1
value: 73.32700092140148
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (de)
type: mteb/amazon_counterfactual
config: de
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 71.22055674518201
- type: ap
value: 81.55756710830498
- type: f1
value: 69.28271787752661
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en-ext)
type: mteb/amazon_counterfactual
config: en-ext
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 80.41979010494754
- type: ap
value: 29.34879922376344
- type: f1
value: 67.62475449011278
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (ja)
type: mteb/amazon_counterfactual
config: ja
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.8372591006424
- type: ap
value: 26.557560591210738
- type: f1
value: 64.96619417368707
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 93.489875
- type: ap
value: 90.98758636917603
- type: f1
value: 93.48554819717332
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.564
- type: f1
value: 46.75122173518047
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (de)
type: mteb/amazon_reviews_multi
config: de
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 45.400000000000006
- type: f1
value: 44.17195682400632
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (es)
type: mteb/amazon_reviews_multi
config: es
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 43.068
- type: f1
value: 42.38155696855596
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (fr)
type: mteb/amazon_reviews_multi
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 41.89
- type: f1
value: 40.84407321682663
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (ja)
type: mteb/amazon_reviews_multi
config: ja
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 40.120000000000005
- type: f1
value: 39.522976223819114
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 38.832
- type: f1
value: 38.0392533394713
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.725
- type: map_at_10
value: 46.055
- type: map_at_100
value: 46.900999999999996
- type: map_at_1000
value: 46.911
- type: map_at_3
value: 41.548
- type: map_at_5
value: 44.297
- type: mrr_at_1
value: 31.152
- type: mrr_at_10
value: 46.231
- type: mrr_at_100
value: 47.07
- type: mrr_at_1000
value: 47.08
- type: mrr_at_3
value: 41.738
- type: mrr_at_5
value: 44.468999999999994
- type: ndcg_at_1
value: 30.725
- type: ndcg_at_10
value: 54.379999999999995
- type: ndcg_at_100
value: 58.138
- type: ndcg_at_1000
value: 58.389
- type: ndcg_at_3
value: 45.156
- type: ndcg_at_5
value: 50.123
- type: precision_at_1
value: 30.725
- type: precision_at_10
value: 8.087
- type: precision_at_100
value: 0.9769999999999999
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.54
- type: precision_at_5
value: 13.542000000000002
- type: recall_at_1
value: 30.725
- type: recall_at_10
value: 80.868
- type: recall_at_100
value: 97.653
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 55.619
- type: recall_at_5
value: 67.71000000000001
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 44.30960650674069
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 38.427074197498996
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.28270056031872
- type: mrr
value: 74.38332673789738
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 84.05942144105269
- type: cos_sim_spearman
value: 82.51212105850809
- type: euclidean_pearson
value: 81.95639829909122
- type: euclidean_spearman
value: 82.3717564144213
- type: manhattan_pearson
value: 81.79273425468256
- type: manhattan_spearman
value: 82.20066817871039
- task:
type: BitextMining
dataset:
name: MTEB BUCC (de-en)
type: mteb/bucc-bitext-mining
config: de-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 99.46764091858039
- type: f1
value: 99.37717466945023
- type: precision
value: 99.33194154488518
- type: recall
value: 99.46764091858039
- task:
type: BitextMining
dataset:
name: MTEB BUCC (fr-en)
type: mteb/bucc-bitext-mining
config: fr-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 98.29407880255337
- type: f1
value: 98.11248073959938
- type: precision
value: 98.02443319392472
- type: recall
value: 98.29407880255337
- task:
type: BitextMining
dataset:
name: MTEB BUCC (ru-en)
type: mteb/bucc-bitext-mining
config: ru-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 97.79009352268791
- type: f1
value: 97.5176076665512
- type: precision
value: 97.38136473848286
- type: recall
value: 97.79009352268791
- task:
type: BitextMining
dataset:
name: MTEB BUCC (zh-en)
type: mteb/bucc-bitext-mining
config: zh-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 99.26276987888363
- type: f1
value: 99.20133403545726
- type: precision
value: 99.17500438827453
- type: recall
value: 99.26276987888363
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.72727272727273
- type: f1
value: 84.67672206031433
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 35.34220182511161
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 33.4987096128766
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.558249999999997
- type: map_at_10
value: 34.44425000000001
- type: map_at_100
value: 35.59833333333333
- type: map_at_1000
value: 35.706916666666665
- type: map_at_3
value: 31.691749999999995
- type: map_at_5
value: 33.252916666666664
- type: mrr_at_1
value: 30.252666666666666
- type: mrr_at_10
value: 38.60675
- type: mrr_at_100
value: 39.42666666666666
- type: mrr_at_1000
value: 39.48408333333334
- type: mrr_at_3
value: 36.17441666666665
- type: mrr_at_5
value: 37.56275
- type: ndcg_at_1
value: 30.252666666666666
- type: ndcg_at_10
value: 39.683
- type: ndcg_at_100
value: 44.68541666666667
- type: ndcg_at_1000
value: 46.94316666666668
- type: ndcg_at_3
value: 34.961749999999995
- type: ndcg_at_5
value: 37.215666666666664
- type: precision_at_1
value: 30.252666666666666
- type: precision_at_10
value: 6.904166666666667
- type: precision_at_100
value: 1.0989999999999995
- type: precision_at_1000
value: 0.14733333333333334
- type: precision_at_3
value: 16.037666666666667
- type: precision_at_5
value: 11.413583333333333
- type: recall_at_1
value: 25.558249999999997
- type: recall_at_10
value: 51.13341666666666
- type: recall_at_100
value: 73.08366666666667
- type: recall_at_1000
value: 88.79483333333334
- type: recall_at_3
value: 37.989083333333326
- type: recall_at_5
value: 43.787833333333325
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.338
- type: map_at_10
value: 18.360000000000003
- type: map_at_100
value: 19.942
- type: map_at_1000
value: 20.134
- type: map_at_3
value: 15.174000000000001
- type: map_at_5
value: 16.830000000000002
- type: mrr_at_1
value: 23.257
- type: mrr_at_10
value: 33.768
- type: mrr_at_100
value: 34.707
- type: mrr_at_1000
value: 34.766000000000005
- type: mrr_at_3
value: 30.977
- type: mrr_at_5
value: 32.528
- type: ndcg_at_1
value: 23.257
- type: ndcg_at_10
value: 25.733
- type: ndcg_at_100
value: 32.288
- type: ndcg_at_1000
value: 35.992000000000004
- type: ndcg_at_3
value: 20.866
- type: ndcg_at_5
value: 22.612
- type: precision_at_1
value: 23.257
- type: precision_at_10
value: 8.124
- type: precision_at_100
value: 1.518
- type: precision_at_1000
value: 0.219
- type: precision_at_3
value: 15.679000000000002
- type: precision_at_5
value: 12.117
- type: recall_at_1
value: 10.338
- type: recall_at_10
value: 31.154
- type: recall_at_100
value: 54.161
- type: recall_at_1000
value: 75.21900000000001
- type: recall_at_3
value: 19.427
- type: recall_at_5
value: 24.214
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.498
- type: map_at_10
value: 19.103
- type: map_at_100
value: 27.375
- type: map_at_1000
value: 28.981
- type: map_at_3
value: 13.764999999999999
- type: map_at_5
value: 15.950000000000001
- type: mrr_at_1
value: 65.5
- type: mrr_at_10
value: 74.53800000000001
- type: mrr_at_100
value: 74.71799999999999
- type: mrr_at_1000
value: 74.725
- type: mrr_at_3
value: 72.792
- type: mrr_at_5
value: 73.554
- type: ndcg_at_1
value: 53.37499999999999
- type: ndcg_at_10
value: 41.286
- type: ndcg_at_100
value: 45.972
- type: ndcg_at_1000
value: 53.123
- type: ndcg_at_3
value: 46.172999999999995
- type: ndcg_at_5
value: 43.033
- type: precision_at_1
value: 65.5
- type: precision_at_10
value: 32.725
- type: precision_at_100
value: 10.683
- type: precision_at_1000
value: 1.978
- type: precision_at_3
value: 50
- type: precision_at_5
value: 41.349999999999994
- type: recall_at_1
value: 8.498
- type: recall_at_10
value: 25.070999999999998
- type: recall_at_100
value: 52.383
- type: recall_at_1000
value: 74.91499999999999
- type: recall_at_3
value: 15.207999999999998
- type: recall_at_5
value: 18.563
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 46.5
- type: f1
value: 41.93833713984145
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 67.914
- type: map_at_10
value: 78.10000000000001
- type: map_at_100
value: 78.333
- type: map_at_1000
value: 78.346
- type: map_at_3
value: 76.626
- type: map_at_5
value: 77.627
- type: mrr_at_1
value: 72.74199999999999
- type: mrr_at_10
value: 82.414
- type: mrr_at_100
value: 82.511
- type: mrr_at_1000
value: 82.513
- type: mrr_at_3
value: 81.231
- type: mrr_at_5
value: 82.065
- type: ndcg_at_1
value: 72.74199999999999
- type: ndcg_at_10
value: 82.806
- type: ndcg_at_100
value: 83.677
- type: ndcg_at_1000
value: 83.917
- type: ndcg_at_3
value: 80.305
- type: ndcg_at_5
value: 81.843
- type: precision_at_1
value: 72.74199999999999
- type: precision_at_10
value: 10.24
- type: precision_at_100
value: 1.089
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 31.268
- type: precision_at_5
value: 19.706000000000003
- type: recall_at_1
value: 67.914
- type: recall_at_10
value: 92.889
- type: recall_at_100
value: 96.42699999999999
- type: recall_at_1000
value: 97.92
- type: recall_at_3
value: 86.21
- type: recall_at_5
value: 90.036
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.166
- type: map_at_10
value: 35.57
- type: map_at_100
value: 37.405
- type: map_at_1000
value: 37.564
- type: map_at_3
value: 30.379
- type: map_at_5
value: 33.324
- type: mrr_at_1
value: 43.519000000000005
- type: mrr_at_10
value: 51.556000000000004
- type: mrr_at_100
value: 52.344
- type: mrr_at_1000
value: 52.373999999999995
- type: mrr_at_3
value: 48.868
- type: mrr_at_5
value: 50.319
- type: ndcg_at_1
value: 43.519000000000005
- type: ndcg_at_10
value: 43.803
- type: ndcg_at_100
value: 50.468999999999994
- type: ndcg_at_1000
value: 53.111
- type: ndcg_at_3
value: 38.893
- type: ndcg_at_5
value: 40.653
- type: precision_at_1
value: 43.519000000000005
- type: precision_at_10
value: 12.253
- type: precision_at_100
value: 1.931
- type: precision_at_1000
value: 0.242
- type: precision_at_3
value: 25.617
- type: precision_at_5
value: 19.383
- type: recall_at_1
value: 22.166
- type: recall_at_10
value: 51.6
- type: recall_at_100
value: 76.574
- type: recall_at_1000
value: 92.192
- type: recall_at_3
value: 34.477999999999994
- type: recall_at_5
value: 41.835
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.041
- type: map_at_10
value: 62.961999999999996
- type: map_at_100
value: 63.79899999999999
- type: map_at_1000
value: 63.854
- type: map_at_3
value: 59.399
- type: map_at_5
value: 61.669
- type: mrr_at_1
value: 78.082
- type: mrr_at_10
value: 84.321
- type: mrr_at_100
value: 84.49600000000001
- type: mrr_at_1000
value: 84.502
- type: mrr_at_3
value: 83.421
- type: mrr_at_5
value: 83.977
- type: ndcg_at_1
value: 78.082
- type: ndcg_at_10
value: 71.229
- type: ndcg_at_100
value: 74.10900000000001
- type: ndcg_at_1000
value: 75.169
- type: ndcg_at_3
value: 66.28699999999999
- type: ndcg_at_5
value: 69.084
- type: precision_at_1
value: 78.082
- type: precision_at_10
value: 14.993
- type: precision_at_100
value: 1.7239999999999998
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 42.737
- type: precision_at_5
value: 27.843
- type: recall_at_1
value: 39.041
- type: recall_at_10
value: 74.96300000000001
- type: recall_at_100
value: 86.199
- type: recall_at_1000
value: 93.228
- type: recall_at_3
value: 64.105
- type: recall_at_5
value: 69.608
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 90.23160000000001
- type: ap
value: 85.5674856808308
- type: f1
value: 90.18033354786317
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 24.091
- type: map_at_10
value: 36.753
- type: map_at_100
value: 37.913000000000004
- type: map_at_1000
value: 37.958999999999996
- type: map_at_3
value: 32.818999999999996
- type: map_at_5
value: 35.171
- type: mrr_at_1
value: 24.742
- type: mrr_at_10
value: 37.285000000000004
- type: mrr_at_100
value: 38.391999999999996
- type: mrr_at_1000
value: 38.431
- type: mrr_at_3
value: 33.440999999999995
- type: mrr_at_5
value: 35.75
- type: ndcg_at_1
value: 24.742
- type: ndcg_at_10
value: 43.698
- type: ndcg_at_100
value: 49.145
- type: ndcg_at_1000
value: 50.23800000000001
- type: ndcg_at_3
value: 35.769
- type: ndcg_at_5
value: 39.961999999999996
- type: precision_at_1
value: 24.742
- type: precision_at_10
value: 6.7989999999999995
- type: precision_at_100
value: 0.95
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 15.096000000000002
- type: precision_at_5
value: 11.183
- type: recall_at_1
value: 24.091
- type: recall_at_10
value: 65.068
- type: recall_at_100
value: 89.899
- type: recall_at_1000
value: 98.16
- type: recall_at_3
value: 43.68
- type: recall_at_5
value: 53.754999999999995
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.66621067031465
- type: f1
value: 93.49622853272142
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (de)
type: mteb/mtop_domain
config: de
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 91.94702733164272
- type: f1
value: 91.17043441745282
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (es)
type: mteb/mtop_domain
config: es
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 92.20146764509674
- type: f1
value: 91.98359080555608
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (fr)
type: mteb/mtop_domain
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 88.99780770435328
- type: f1
value: 89.19746342724068
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (hi)
type: mteb/mtop_domain
config: hi
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 89.78486912871998
- type: f1
value: 89.24578823628642
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (th)
type: mteb/mtop_domain
config: th
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 88.74502712477394
- type: f1
value: 89.00297573881542
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 77.9046967624259
- type: f1
value: 59.36787125785957
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (de)
type: mteb/mtop_intent
config: de
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 74.5280360664976
- type: f1
value: 57.17723440888718
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (es)
type: mteb/mtop_intent
config: es
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 75.44029352901934
- type: f1
value: 54.052855531072964
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (fr)
type: mteb/mtop_intent
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 70.5606013153774
- type: f1
value: 52.62215934386531
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (hi)
type: mteb/mtop_intent
config: hi
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 73.11581211903908
- type: f1
value: 52.341291845645465
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (th)
type: mteb/mtop_intent
config: th
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 74.28933092224233
- type: f1
value: 57.07918745504911
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (af)
type: mteb/amazon_massive_intent
config: af
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.38063214525892
- type: f1
value: 59.46463723443009
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (am)
type: mteb/amazon_massive_intent
config: am
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 56.06926698049766
- type: f1
value: 52.49084283283562
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ar)
type: mteb/amazon_massive_intent
config: ar
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 60.74983187626093
- type: f1
value: 56.960640620165904
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (az)
type: mteb/amazon_massive_intent
config: az
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.86550100874243
- type: f1
value: 62.47370548140688
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (bn)
type: mteb/amazon_massive_intent
config: bn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 63.971082716879636
- type: f1
value: 61.03812421957381
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (cy)
type: mteb/amazon_massive_intent
config: cy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 54.98318762609282
- type: f1
value: 51.51207916008392
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (da)
type: mteb/amazon_massive_intent
config: da
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.45527908540686
- type: f1
value: 66.16631905400318
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (de)
type: mteb/amazon_massive_intent
config: de
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.32750504371216
- type: f1
value: 66.16755288646591
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (el)
type: mteb/amazon_massive_intent
config: el
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.09213180901143
- type: f1
value: 66.95654394661507
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.75588433086752
- type: f1
value: 71.79973779656923
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (es)
type: mteb/amazon_massive_intent
config: es
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.49428379287154
- type: f1
value: 68.37494379215734
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fa)
type: mteb/amazon_massive_intent
config: fa
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.90921318090115
- type: f1
value: 66.79517376481645
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fi)
type: mteb/amazon_massive_intent
config: fi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.12104909213181
- type: f1
value: 67.29448842879584
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fr)
type: mteb/amazon_massive_intent
config: fr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.34095494283793
- type: f1
value: 67.01134288992947
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (he)
type: mteb/amazon_massive_intent
config: he
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.61264290517822
- type: f1
value: 64.68730512660757
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hi)
type: mteb/amazon_massive_intent
config: hi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.79757901815738
- type: f1
value: 65.24938539425598
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hu)
type: mteb/amazon_massive_intent
config: hu
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.68728984532616
- type: f1
value: 67.0487169762553
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hy)
type: mteb/amazon_massive_intent
config: hy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.07464694014795
- type: f1
value: 59.183532276789286
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (id)
type: mteb/amazon_massive_intent
config: id
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.04707464694015
- type: f1
value: 67.66829629003848
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (is)
type: mteb/amazon_massive_intent
config: is
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.42434431741762
- type: f1
value: 59.01617226544757
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (it)
type: mteb/amazon_massive_intent
config: it
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.53127101546738
- type: f1
value: 68.10033760906255
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ja)
type: mteb/amazon_massive_intent
config: ja
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 72.50504371217215
- type: f1
value: 69.74931103158923
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (jv)
type: mteb/amazon_massive_intent
config: jv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.91190316072628
- type: f1
value: 54.05551136648796
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ka)
type: mteb/amazon_massive_intent
config: ka
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 51.78211163416275
- type: f1
value: 49.874888544058535
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (km)
type: mteb/amazon_massive_intent
config: km
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 47.017484868863484
- type: f1
value: 44.53364263352014
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (kn)
type: mteb/amazon_massive_intent
config: kn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.16207128446537
- type: f1
value: 59.01185692320829
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ko)
type: mteb/amazon_massive_intent
config: ko
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.42501681237391
- type: f1
value: 67.13169450166086
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (lv)
type: mteb/amazon_massive_intent
config: lv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.0780094149294
- type: f1
value: 64.41720167850707
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ml)
type: mteb/amazon_massive_intent
config: ml
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.57162071284466
- type: f1
value: 62.414138683804424
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (mn)
type: mteb/amazon_massive_intent
config: mn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.71149966375252
- type: f1
value: 58.594805125087234
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ms)
type: mteb/amazon_massive_intent
config: ms
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 66.03900470746471
- type: f1
value: 63.87937257883887
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (my)
type: mteb/amazon_massive_intent
config: my
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 60.8776059179556
- type: f1
value: 57.48587618059131
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (nb)
type: mteb/amazon_massive_intent
config: nb
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.87895090786819
- type: f1
value: 66.8141299430347
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (nl)
type: mteb/amazon_massive_intent
config: nl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.45057162071285
- type: f1
value: 67.46444039673516
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pl)
type: mteb/amazon_massive_intent
config: pl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.546738399462
- type: f1
value: 68.63640876702655
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pt)
type: mteb/amazon_massive_intent
config: pt
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.72965702757229
- type: f1
value: 68.54119560379115
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ro)
type: mteb/amazon_massive_intent
config: ro
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 68.35574983187625
- type: f1
value: 65.88844917691927
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ru)
type: mteb/amazon_massive_intent
config: ru
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.70477471418964
- type: f1
value: 69.19665697061978
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sl)
type: mteb/amazon_massive_intent
config: sl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.0880968392737
- type: f1
value: 64.76962317666086
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sq)
type: mteb/amazon_massive_intent
config: sq
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.18493611297916
- type: f1
value: 62.49984559035371
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sv)
type: mteb/amazon_massive_intent
config: sv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.75857431069265
- type: f1
value: 69.20053687623418
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sw)
type: mteb/amazon_massive_intent
config: sw
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 58.500336247478145
- type: f1
value: 55.2972398687929
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ta)
type: mteb/amazon_massive_intent
config: ta
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.68997982515132
- type: f1
value: 59.36848202755348
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (te)
type: mteb/amazon_massive_intent
config: te
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 63.01950235373235
- type: f1
value: 60.09351954625423
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (th)
type: mteb/amazon_massive_intent
config: th
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 68.29186281102892
- type: f1
value: 67.57860496703447
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (tl)
type: mteb/amazon_massive_intent
config: tl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.77471418964357
- type: f1
value: 61.913983147713836
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (tr)
type: mteb/amazon_massive_intent
config: tr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.87222595830532
- type: f1
value: 66.03679033708141
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ur)
type: mteb/amazon_massive_intent
config: ur
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.04505716207127
- type: f1
value: 61.28569169817908
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (vi)
type: mteb/amazon_massive_intent
config: vi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 69.38466711499663
- type: f1
value: 67.20532357036844
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.12306657700067
- type: f1
value: 68.91251226588182
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-TW)
type: mteb/amazon_massive_intent
config: zh-TW
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 66.20040349697378
- type: f1
value: 66.02657347714175
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (af)
type: mteb/amazon_massive_scenario
config: af
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.73907195696032
- type: f1
value: 66.98484521791418
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (am)
type: mteb/amazon_massive_scenario
config: am
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 60.58843308675185
- type: f1
value: 58.95591723092005
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ar)
type: mteb/amazon_massive_scenario
config: ar
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.22730329522528
- type: f1
value: 66.0894499712115
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (az)
type: mteb/amazon_massive_scenario
config: az
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.48285137861465
- type: f1
value: 65.21963176785157
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (bn)
type: mteb/amazon_massive_scenario
config: bn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.74714189643578
- type: f1
value: 66.8212192745412
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (cy)
type: mteb/amazon_massive_scenario
config: cy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.09213180901143
- type: f1
value: 56.70735546356339
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (da)
type: mteb/amazon_massive_scenario
config: da
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 75.05716207128448
- type: f1
value: 74.8413712365364
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (de)
type: mteb/amazon_massive_scenario
config: de
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.69737726967047
- type: f1
value: 74.7664341963
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (el)
type: mteb/amazon_massive_scenario
config: el
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.90383322125084
- type: f1
value: 73.59201554448323
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.51176866173503
- type: f1
value: 77.46104434577758
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (es)
type: mteb/amazon_massive_scenario
config: es
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.31069266980496
- type: f1
value: 74.61048660675635
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fa)
type: mteb/amazon_massive_scenario
config: fa
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 72.95225285810356
- type: f1
value: 72.33160006574627
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fi)
type: mteb/amazon_massive_scenario
config: fi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.12373907195696
- type: f1
value: 73.20921012557481
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fr)
type: mteb/amazon_massive_scenario
config: fr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.86684599865501
- type: f1
value: 73.82348774610831
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (he)
type: mteb/amazon_massive_scenario
config: he
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.40215198386012
- type: f1
value: 71.11945183971858
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hi)
type: mteb/amazon_massive_scenario
config: hi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 72.12844653665098
- type: f1
value: 71.34450495911766
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hu)
type: mteb/amazon_massive_scenario
config: hu
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.52252858103566
- type: f1
value: 73.98878711342999
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hy)
type: mteb/amazon_massive_scenario
config: hy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 64.93611297915265
- type: f1
value: 63.723200467653385
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (id)
type: mteb/amazon_massive_scenario
config: id
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.11903160726295
- type: f1
value: 73.82138439467096
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (is)
type: mteb/amazon_massive_scenario
config: is
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.15198386012105
- type: f1
value: 66.02172193802167
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (it)
type: mteb/amazon_massive_scenario
config: it
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.32414256893072
- type: f1
value: 74.30943421170574
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ja)
type: mteb/amazon_massive_scenario
config: ja
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.46805648957633
- type: f1
value: 77.62808409298209
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (jv)
type: mteb/amazon_massive_scenario
config: jv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 63.318762609280434
- type: f1
value: 62.094284066075076
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ka)
type: mteb/amazon_massive_scenario
config: ka
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 58.34902488231338
- type: f1
value: 57.12893860987984
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (km)
type: mteb/amazon_massive_scenario
config: km
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 50.88433086751849
- type: f1
value: 48.2272350802058
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (kn)
type: mteb/amazon_massive_scenario
config: kn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.4425016812374
- type: f1
value: 64.61463095996173
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ko)
type: mteb/amazon_massive_scenario
config: ko
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 75.04707464694015
- type: f1
value: 75.05099199098998
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (lv)
type: mteb/amazon_massive_scenario
config: lv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.50437121721586
- type: f1
value: 69.83397721096314
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ml)
type: mteb/amazon_massive_scenario
config: ml
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.94283792871553
- type: f1
value: 68.8704663703913
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (mn)
type: mteb/amazon_massive_scenario
config: mn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 64.79488903833222
- type: f1
value: 63.615424063345436
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ms)
type: mteb/amazon_massive_scenario
config: ms
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.88231338264963
- type: f1
value: 68.57892302593237
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (my)
type: mteb/amazon_massive_scenario
config: my
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 63.248150638870214
- type: f1
value: 61.06680605338809
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (nb)
type: mteb/amazon_massive_scenario
config: nb
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.84196368527236
- type: f1
value: 74.52566464968763
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (nl)
type: mteb/amazon_massive_scenario
config: nl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.8285137861466
- type: f1
value: 74.8853197608802
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pl)
type: mteb/amazon_massive_scenario
config: pl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.13248150638869
- type: f1
value: 74.3982040999179
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pt)
type: mteb/amazon_massive_scenario
config: pt
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.49024882313383
- type: f1
value: 73.82153848368573
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ro)
type: mteb/amazon_massive_scenario
config: ro
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.72158708809684
- type: f1
value: 71.85049433180541
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ru)
type: mteb/amazon_massive_scenario
config: ru
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 75.137861466039
- type: f1
value: 75.37628348188467
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sl)
type: mteb/amazon_massive_scenario
config: sl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.86953597848016
- type: f1
value: 71.87537624521661
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sq)
type: mteb/amazon_massive_scenario
config: sq
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.27572293207801
- type: f1
value: 68.80017302344231
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sv)
type: mteb/amazon_massive_scenario
config: sv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.09952925353059
- type: f1
value: 76.07992707688408
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sw)
type: mteb/amazon_massive_scenario
config: sw
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 63.140551445864155
- type: f1
value: 61.73855010331415
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ta)
type: mteb/amazon_massive_scenario
config: ta
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.27774041694687
- type: f1
value: 64.83664868894539
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (te)
type: mteb/amazon_massive_scenario
config: te
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.69468728984533
- type: f1
value: 64.76239666920868
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (th)
type: mteb/amazon_massive_scenario
config: th
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.44653665097512
- type: f1
value: 73.14646052013873
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (tl)
type: mteb/amazon_massive_scenario
config: tl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.71351714862139
- type: f1
value: 66.67212180163382
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (tr)
type: mteb/amazon_massive_scenario
config: tr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.9946200403497
- type: f1
value: 73.87348793725525
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ur)
type: mteb/amazon_massive_scenario
config: ur
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.15400134498992
- type: f1
value: 67.09433241421094
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (vi)
type: mteb/amazon_massive_scenario
config: vi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.11365164761264
- type: f1
value: 73.59502539433753
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.82582380632145
- type: f1
value: 76.89992945316313
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-TW)
type: mteb/amazon_massive_scenario
config: zh-TW
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.81237390719569
- type: f1
value: 72.36499770986265
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.480506569594695
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 29.71252128004552
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.421396787056548
- type: mrr
value: 32.48155274872267
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.595
- type: map_at_10
value: 12.642000000000001
- type: map_at_100
value: 15.726
- type: map_at_1000
value: 17.061999999999998
- type: map_at_3
value: 9.125
- type: map_at_5
value: 10.866000000000001
- type: mrr_at_1
value: 43.344
- type: mrr_at_10
value: 52.227999999999994
- type: mrr_at_100
value: 52.898999999999994
- type: mrr_at_1000
value: 52.944
- type: mrr_at_3
value: 49.845
- type: mrr_at_5
value: 51.115
- type: ndcg_at_1
value: 41.949999999999996
- type: ndcg_at_10
value: 33.995
- type: ndcg_at_100
value: 30.869999999999997
- type: ndcg_at_1000
value: 39.487
- type: ndcg_at_3
value: 38.903999999999996
- type: ndcg_at_5
value: 37.236999999999995
- type: precision_at_1
value: 43.344
- type: precision_at_10
value: 25.480000000000004
- type: precision_at_100
value: 7.672
- type: precision_at_1000
value: 2.028
- type: precision_at_3
value: 36.636
- type: precision_at_5
value: 32.632
- type: recall_at_1
value: 5.595
- type: recall_at_10
value: 16.466
- type: recall_at_100
value: 31.226
- type: recall_at_1000
value: 62.778999999999996
- type: recall_at_3
value: 9.931
- type: recall_at_5
value: 12.884
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.414
- type: map_at_10
value: 56.754000000000005
- type: map_at_100
value: 57.457
- type: map_at_1000
value: 57.477999999999994
- type: map_at_3
value: 52.873999999999995
- type: map_at_5
value: 55.175
- type: mrr_at_1
value: 45.278
- type: mrr_at_10
value: 59.192
- type: mrr_at_100
value: 59.650000000000006
- type: mrr_at_1000
value: 59.665
- type: mrr_at_3
value: 56.141
- type: mrr_at_5
value: 57.998000000000005
- type: ndcg_at_1
value: 45.278
- type: ndcg_at_10
value: 64.056
- type: ndcg_at_100
value: 66.89
- type: ndcg_at_1000
value: 67.364
- type: ndcg_at_3
value: 56.97
- type: ndcg_at_5
value: 60.719
- type: precision_at_1
value: 45.278
- type: precision_at_10
value: 9.994
- type: precision_at_100
value: 1.165
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 25.512
- type: precision_at_5
value: 17.509
- type: recall_at_1
value: 40.414
- type: recall_at_10
value: 83.596
- type: recall_at_100
value: 95.72
- type: recall_at_1000
value: 99.24
- type: recall_at_3
value: 65.472
- type: recall_at_5
value: 74.039
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.352
- type: map_at_10
value: 84.369
- type: map_at_100
value: 85.02499999999999
- type: map_at_1000
value: 85.04
- type: map_at_3
value: 81.42399999999999
- type: map_at_5
value: 83.279
- type: mrr_at_1
value: 81.05
- type: mrr_at_10
value: 87.401
- type: mrr_at_100
value: 87.504
- type: mrr_at_1000
value: 87.505
- type: mrr_at_3
value: 86.443
- type: mrr_at_5
value: 87.10799999999999
- type: ndcg_at_1
value: 81.04
- type: ndcg_at_10
value: 88.181
- type: ndcg_at_100
value: 89.411
- type: ndcg_at_1000
value: 89.507
- type: ndcg_at_3
value: 85.28099999999999
- type: ndcg_at_5
value: 86.888
- type: precision_at_1
value: 81.04
- type: precision_at_10
value: 13.406
- type: precision_at_100
value: 1.5350000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.31
- type: precision_at_5
value: 24.54
- type: recall_at_1
value: 70.352
- type: recall_at_10
value: 95.358
- type: recall_at_100
value: 99.541
- type: recall_at_1000
value: 99.984
- type: recall_at_3
value: 87.111
- type: recall_at_5
value: 91.643
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 46.54068723291946
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 63.216287629895994
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.023000000000001
- type: map_at_10
value: 10.071
- type: map_at_100
value: 11.892
- type: map_at_1000
value: 12.196
- type: map_at_3
value: 7.234
- type: map_at_5
value: 8.613999999999999
- type: mrr_at_1
value: 19.900000000000002
- type: mrr_at_10
value: 30.516
- type: mrr_at_100
value: 31.656000000000002
- type: mrr_at_1000
value: 31.723000000000003
- type: mrr_at_3
value: 27.400000000000002
- type: mrr_at_5
value: 29.270000000000003
- type: ndcg_at_1
value: 19.900000000000002
- type: ndcg_at_10
value: 17.474
- type: ndcg_at_100
value: 25.020999999999997
- type: ndcg_at_1000
value: 30.728
- type: ndcg_at_3
value: 16.588
- type: ndcg_at_5
value: 14.498
- type: precision_at_1
value: 19.900000000000002
- type: precision_at_10
value: 9.139999999999999
- type: precision_at_100
value: 2.011
- type: precision_at_1000
value: 0.33899999999999997
- type: precision_at_3
value: 15.667
- type: precision_at_5
value: 12.839999999999998
- type: recall_at_1
value: 4.023000000000001
- type: recall_at_10
value: 18.497
- type: recall_at_100
value: 40.8
- type: recall_at_1000
value: 68.812
- type: recall_at_3
value: 9.508
- type: recall_at_5
value: 12.983
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.967008785134
- type: cos_sim_spearman
value: 80.23142141101837
- type: euclidean_pearson
value: 81.20166064704539
- type: euclidean_spearman
value: 80.18961335654585
- type: manhattan_pearson
value: 81.13925443187625
- type: manhattan_spearman
value: 80.07948723044424
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 86.94262461316023
- type: cos_sim_spearman
value: 80.01596278563865
- type: euclidean_pearson
value: 83.80799622922581
- type: euclidean_spearman
value: 79.94984954947103
- type: manhattan_pearson
value: 83.68473841756281
- type: manhattan_spearman
value: 79.84990707951822
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 80.57346443146068
- type: cos_sim_spearman
value: 81.54689837570866
- type: euclidean_pearson
value: 81.10909881516007
- type: euclidean_spearman
value: 81.56746243261762
- type: manhattan_pearson
value: 80.87076036186582
- type: manhattan_spearman
value: 81.33074987964402
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 79.54733787179849
- type: cos_sim_spearman
value: 77.72202105610411
- type: euclidean_pearson
value: 78.9043595478849
- type: euclidean_spearman
value: 77.93422804309435
- type: manhattan_pearson
value: 78.58115121621368
- type: manhattan_spearman
value: 77.62508135122033
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.59880017237558
- type: cos_sim_spearman
value: 89.31088630824758
- type: euclidean_pearson
value: 88.47069261564656
- type: euclidean_spearman
value: 89.33581971465233
- type: manhattan_pearson
value: 88.40774264100956
- type: manhattan_spearman
value: 89.28657485627835
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 84.08055117917084
- type: cos_sim_spearman
value: 85.78491813080304
- type: euclidean_pearson
value: 84.99329155500392
- type: euclidean_spearman
value: 85.76728064677287
- type: manhattan_pearson
value: 84.87947428989587
- type: manhattan_spearman
value: 85.62429454917464
- task:
type: STS
dataset:
name: MTEB STS17 (ko-ko)
type: mteb/sts17-crosslingual-sts
config: ko-ko
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 82.14190939287384
- type: cos_sim_spearman
value: 82.27331573306041
- type: euclidean_pearson
value: 81.891896953716
- type: euclidean_spearman
value: 82.37695542955998
- type: manhattan_pearson
value: 81.73123869460504
- type: manhattan_spearman
value: 82.19989168441421
- task:
type: STS
dataset:
name: MTEB STS17 (ar-ar)
type: mteb/sts17-crosslingual-sts
config: ar-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 76.84695301843362
- type: cos_sim_spearman
value: 77.87790986014461
- type: euclidean_pearson
value: 76.91981583106315
- type: euclidean_spearman
value: 77.88154772749589
- type: manhattan_pearson
value: 76.94953277451093
- type: manhattan_spearman
value: 77.80499230728604
- task:
type: STS
dataset:
name: MTEB STS17 (en-ar)
type: mteb/sts17-crosslingual-sts
config: en-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 75.44657840482016
- type: cos_sim_spearman
value: 75.05531095119674
- type: euclidean_pearson
value: 75.88161755829299
- type: euclidean_spearman
value: 74.73176238219332
- type: manhattan_pearson
value: 75.63984765635362
- type: manhattan_spearman
value: 74.86476440770737
- task:
type: STS
dataset:
name: MTEB STS17 (en-de)
type: mteb/sts17-crosslingual-sts
config: en-de
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.64700140524133
- type: cos_sim_spearman
value: 86.16014210425672
- type: euclidean_pearson
value: 86.49086860843221
- type: euclidean_spearman
value: 86.09729326815614
- type: manhattan_pearson
value: 86.43406265125513
- type: manhattan_spearman
value: 86.17740150939994
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.91170098764921
- type: cos_sim_spearman
value: 88.12437004058931
- type: euclidean_pearson
value: 88.81828254494437
- type: euclidean_spearman
value: 88.14831794572122
- type: manhattan_pearson
value: 88.93442183448961
- type: manhattan_spearman
value: 88.15254630778304
- task:
type: STS
dataset:
name: MTEB STS17 (en-tr)
type: mteb/sts17-crosslingual-sts
config: en-tr
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 72.91390577997292
- type: cos_sim_spearman
value: 71.22979457536074
- type: euclidean_pearson
value: 74.40314008106749
- type: euclidean_spearman
value: 72.54972136083246
- type: manhattan_pearson
value: 73.85687539530218
- type: manhattan_spearman
value: 72.09500771742637
- task:
type: STS
dataset:
name: MTEB STS17 (es-en)
type: mteb/sts17-crosslingual-sts
config: es-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 80.9301067983089
- type: cos_sim_spearman
value: 80.74989828346473
- type: euclidean_pearson
value: 81.36781301814257
- type: euclidean_spearman
value: 80.9448819964426
- type: manhattan_pearson
value: 81.0351322685609
- type: manhattan_spearman
value: 80.70192121844177
- task:
type: STS
dataset:
name: MTEB STS17 (es-es)
type: mteb/sts17-crosslingual-sts
config: es-es
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.13820465980005
- type: cos_sim_spearman
value: 86.73532498758757
- type: euclidean_pearson
value: 87.21329451846637
- type: euclidean_spearman
value: 86.57863198601002
- type: manhattan_pearson
value: 87.06973713818554
- type: manhattan_spearman
value: 86.47534918791499
- task:
type: STS
dataset:
name: MTEB STS17 (fr-en)
type: mteb/sts17-crosslingual-sts
config: fr-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.48720108904415
- type: cos_sim_spearman
value: 85.62221757068387
- type: euclidean_pearson
value: 86.1010129512749
- type: euclidean_spearman
value: 85.86580966509942
- type: manhattan_pearson
value: 86.26800938808971
- type: manhattan_spearman
value: 85.88902721678429
- task:
type: STS
dataset:
name: MTEB STS17 (it-en)
type: mteb/sts17-crosslingual-sts
config: it-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 83.98021347333516
- type: cos_sim_spearman
value: 84.53806553803501
- type: euclidean_pearson
value: 84.61483347248364
- type: euclidean_spearman
value: 85.14191408011702
- type: manhattan_pearson
value: 84.75297588825967
- type: manhattan_spearman
value: 85.33176753669242
- task:
type: STS
dataset:
name: MTEB STS17 (nl-en)
type: mteb/sts17-crosslingual-sts
config: nl-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 84.51856644893233
- type: cos_sim_spearman
value: 85.27510748506413
- type: euclidean_pearson
value: 85.09886861540977
- type: euclidean_spearman
value: 85.62579245860887
- type: manhattan_pearson
value: 84.93017860464607
- type: manhattan_spearman
value: 85.5063988898453
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 62.581573200584195
- type: cos_sim_spearman
value: 63.05503590247928
- type: euclidean_pearson
value: 63.652564812602094
- type: euclidean_spearman
value: 62.64811520876156
- type: manhattan_pearson
value: 63.506842893061076
- type: manhattan_spearman
value: 62.51289573046917
- task:
type: STS
dataset:
name: MTEB STS22 (de)
type: mteb/sts22-crosslingual-sts
config: de
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 48.2248801729127
- type: cos_sim_spearman
value: 56.5936604678561
- type: euclidean_pearson
value: 43.98149464089
- type: euclidean_spearman
value: 56.108561882423615
- type: manhattan_pearson
value: 43.86880305903564
- type: manhattan_spearman
value: 56.04671150510166
- task:
type: STS
dataset:
name: MTEB STS22 (es)
type: mteb/sts22-crosslingual-sts
config: es
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 55.17564527009831
- type: cos_sim_spearman
value: 64.57978560979488
- type: euclidean_pearson
value: 58.8818330154583
- type: euclidean_spearman
value: 64.99214839071281
- type: manhattan_pearson
value: 58.72671436121381
- type: manhattan_spearman
value: 65.10713416616109
- task:
type: STS
dataset:
name: MTEB STS22 (pl)
type: mteb/sts22-crosslingual-sts
config: pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 26.772131864023297
- type: cos_sim_spearman
value: 34.68200792408681
- type: euclidean_pearson
value: 16.68082419005441
- type: euclidean_spearman
value: 34.83099932652166
- type: manhattan_pearson
value: 16.52605949659529
- type: manhattan_spearman
value: 34.82075801399475
- task:
type: STS
dataset:
name: MTEB STS22 (tr)
type: mteb/sts22-crosslingual-sts
config: tr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 54.42415189043831
- type: cos_sim_spearman
value: 63.54594264576758
- type: euclidean_pearson
value: 57.36577498297745
- type: euclidean_spearman
value: 63.111466379158074
- type: manhattan_pearson
value: 57.584543715873885
- type: manhattan_spearman
value: 63.22361054139183
- task:
type: STS
dataset:
name: MTEB STS22 (ar)
type: mteb/sts22-crosslingual-sts
config: ar
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 47.55216762405518
- type: cos_sim_spearman
value: 56.98670142896412
- type: euclidean_pearson
value: 50.15318757562699
- type: euclidean_spearman
value: 56.524941926541906
- type: manhattan_pearson
value: 49.955618528674904
- type: manhattan_spearman
value: 56.37102209240117
- task:
type: STS
dataset:
name: MTEB STS22 (ru)
type: mteb/sts22-crosslingual-sts
config: ru
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 49.20540980338571
- type: cos_sim_spearman
value: 59.9009453504406
- type: euclidean_pearson
value: 49.557749853620535
- type: euclidean_spearman
value: 59.76631621172456
- type: manhattan_pearson
value: 49.62340591181147
- type: manhattan_spearman
value: 59.94224880322436
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 51.508169956576985
- type: cos_sim_spearman
value: 66.82461565306046
- type: euclidean_pearson
value: 56.2274426480083
- type: euclidean_spearman
value: 66.6775323848333
- type: manhattan_pearson
value: 55.98277796300661
- type: manhattan_spearman
value: 66.63669848497175
- task:
type: STS
dataset:
name: MTEB STS22 (fr)
type: mteb/sts22-crosslingual-sts
config: fr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 72.86478788045507
- type: cos_sim_spearman
value: 76.7946552053193
- type: euclidean_pearson
value: 75.01598530490269
- type: euclidean_spearman
value: 76.83618917858281
- type: manhattan_pearson
value: 74.68337628304332
- type: manhattan_spearman
value: 76.57480204017773
- task:
type: STS
dataset:
name: MTEB STS22 (de-en)
type: mteb/sts22-crosslingual-sts
config: de-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 55.922619099401984
- type: cos_sim_spearman
value: 56.599362477240774
- type: euclidean_pearson
value: 56.68307052369783
- type: euclidean_spearman
value: 54.28760436777401
- type: manhattan_pearson
value: 56.67763566500681
- type: manhattan_spearman
value: 53.94619541711359
- task:
type: STS
dataset:
name: MTEB STS22 (es-en)
type: mteb/sts22-crosslingual-sts
config: es-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 66.74357206710913
- type: cos_sim_spearman
value: 72.5208244925311
- type: euclidean_pearson
value: 67.49254562186032
- type: euclidean_spearman
value: 72.02469076238683
- type: manhattan_pearson
value: 67.45251772238085
- type: manhattan_spearman
value: 72.05538819984538
- task:
type: STS
dataset:
name: MTEB STS22 (it)
type: mteb/sts22-crosslingual-sts
config: it
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 71.25734330033191
- type: cos_sim_spearman
value: 76.98349083946823
- type: euclidean_pearson
value: 73.71642838667736
- type: euclidean_spearman
value: 77.01715504651384
- type: manhattan_pearson
value: 73.61712711868105
- type: manhattan_spearman
value: 77.01392571153896
- task:
type: STS
dataset:
name: MTEB STS22 (pl-en)
type: mteb/sts22-crosslingual-sts
config: pl-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.18215462781212
- type: cos_sim_spearman
value: 65.54373266117607
- type: euclidean_pearson
value: 64.54126095439005
- type: euclidean_spearman
value: 65.30410369102711
- type: manhattan_pearson
value: 63.50332221148234
- type: manhattan_spearman
value: 64.3455878104313
- task:
type: STS
dataset:
name: MTEB STS22 (zh-en)
type: mteb/sts22-crosslingual-sts
config: zh-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 62.30509221440029
- type: cos_sim_spearman
value: 65.99582704642478
- type: euclidean_pearson
value: 63.43818859884195
- type: euclidean_spearman
value: 66.83172582815764
- type: manhattan_pearson
value: 63.055779168508764
- type: manhattan_spearman
value: 65.49585020501449
- task:
type: STS
dataset:
name: MTEB STS22 (es-it)
type: mteb/sts22-crosslingual-sts
config: es-it
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 59.587830825340404
- type: cos_sim_spearman
value: 68.93467614588089
- type: euclidean_pearson
value: 62.3073527367404
- type: euclidean_spearman
value: 69.69758171553175
- type: manhattan_pearson
value: 61.9074580815789
- type: manhattan_spearman
value: 69.57696375597865
- task:
type: STS
dataset:
name: MTEB STS22 (de-fr)
type: mteb/sts22-crosslingual-sts
config: de-fr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 57.143220125577066
- type: cos_sim_spearman
value: 67.78857859159226
- type: euclidean_pearson
value: 55.58225107923733
- type: euclidean_spearman
value: 67.80662907184563
- type: manhattan_pearson
value: 56.24953502726514
- type: manhattan_spearman
value: 67.98262125431616
- task:
type: STS
dataset:
name: MTEB STS22 (de-pl)
type: mteb/sts22-crosslingual-sts
config: de-pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 21.826928900322066
- type: cos_sim_spearman
value: 49.578506634400405
- type: euclidean_pearson
value: 27.939890138843214
- type: euclidean_spearman
value: 52.71950519136242
- type: manhattan_pearson
value: 26.39878683847546
- type: manhattan_spearman
value: 47.54609580342499
- task:
type: STS
dataset:
name: MTEB STS22 (fr-pl)
type: mteb/sts22-crosslingual-sts
config: fr-pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 57.27603854632001
- type: cos_sim_spearman
value: 50.709255283710995
- type: euclidean_pearson
value: 59.5419024445929
- type: euclidean_spearman
value: 50.709255283710995
- type: manhattan_pearson
value: 59.03256832438492
- type: manhattan_spearman
value: 61.97797868009122
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 85.00757054859712
- type: cos_sim_spearman
value: 87.29283629622222
- type: euclidean_pearson
value: 86.54824171775536
- type: euclidean_spearman
value: 87.24364730491402
- type: manhattan_pearson
value: 86.5062156915074
- type: manhattan_spearman
value: 87.15052170378574
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 82.03549357197389
- type: mrr
value: 95.05437645143527
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 57.260999999999996
- type: map_at_10
value: 66.259
- type: map_at_100
value: 66.884
- type: map_at_1000
value: 66.912
- type: map_at_3
value: 63.685
- type: map_at_5
value: 65.35499999999999
- type: mrr_at_1
value: 60.333000000000006
- type: mrr_at_10
value: 67.5
- type: mrr_at_100
value: 68.013
- type: mrr_at_1000
value: 68.038
- type: mrr_at_3
value: 65.61099999999999
- type: mrr_at_5
value: 66.861
- type: ndcg_at_1
value: 60.333000000000006
- type: ndcg_at_10
value: 70.41
- type: ndcg_at_100
value: 73.10600000000001
- type: ndcg_at_1000
value: 73.846
- type: ndcg_at_3
value: 66.133
- type: ndcg_at_5
value: 68.499
- type: precision_at_1
value: 60.333000000000006
- type: precision_at_10
value: 9.232999999999999
- type: precision_at_100
value: 1.0630000000000002
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.667
- type: precision_at_5
value: 17.067
- type: recall_at_1
value: 57.260999999999996
- type: recall_at_10
value: 81.94399999999999
- type: recall_at_100
value: 93.867
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 70.339
- type: recall_at_5
value: 76.25
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.74356435643564
- type: cos_sim_ap
value: 93.13411948212683
- type: cos_sim_f1
value: 86.80521991300147
- type: cos_sim_precision
value: 84.00374181478017
- type: cos_sim_recall
value: 89.8
- type: dot_accuracy
value: 99.67920792079208
- type: dot_ap
value: 89.27277565444479
- type: dot_f1
value: 83.9276990718124
- type: dot_precision
value: 82.04393505253104
- type: dot_recall
value: 85.9
- type: euclidean_accuracy
value: 99.74257425742574
- type: euclidean_ap
value: 93.17993008259062
- type: euclidean_f1
value: 86.69396110542476
- type: euclidean_precision
value: 88.78406708595388
- type: euclidean_recall
value: 84.7
- type: manhattan_accuracy
value: 99.74257425742574
- type: manhattan_ap
value: 93.14413755550099
- type: manhattan_f1
value: 86.82483594144371
- type: manhattan_precision
value: 87.66564729867483
- type: manhattan_recall
value: 86
- type: max_accuracy
value: 99.74356435643564
- type: max_ap
value: 93.17993008259062
- type: max_f1
value: 86.82483594144371
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 57.525863806168566
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 32.68850574423839
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.71580650644033
- type: mrr
value: 50.50971903913081
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.152190498799484
- type: cos_sim_spearman
value: 29.686180371952727
- type: dot_pearson
value: 27.248664793816342
- type: dot_spearman
value: 28.37748983721745
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.20400000000000001
- type: map_at_10
value: 1.6209999999999998
- type: map_at_100
value: 9.690999999999999
- type: map_at_1000
value: 23.733
- type: map_at_3
value: 0.575
- type: map_at_5
value: 0.885
- type: mrr_at_1
value: 78
- type: mrr_at_10
value: 86.56700000000001
- type: mrr_at_100
value: 86.56700000000001
- type: mrr_at_1000
value: 86.56700000000001
- type: mrr_at_3
value: 85.667
- type: mrr_at_5
value: 86.56700000000001
- type: ndcg_at_1
value: 76
- type: ndcg_at_10
value: 71.326
- type: ndcg_at_100
value: 54.208999999999996
- type: ndcg_at_1000
value: 49.252
- type: ndcg_at_3
value: 74.235
- type: ndcg_at_5
value: 73.833
- type: precision_at_1
value: 78
- type: precision_at_10
value: 74.8
- type: precision_at_100
value: 55.50000000000001
- type: precision_at_1000
value: 21.836
- type: precision_at_3
value: 78
- type: precision_at_5
value: 78
- type: recall_at_1
value: 0.20400000000000001
- type: recall_at_10
value: 1.894
- type: recall_at_100
value: 13.245999999999999
- type: recall_at_1000
value: 46.373
- type: recall_at_3
value: 0.613
- type: recall_at_5
value: 0.991
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (sqi-eng)
type: mteb/tatoeba-bitext-mining
config: sqi-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95.89999999999999
- type: f1
value: 94.69999999999999
- type: precision
value: 94.11666666666667
- type: recall
value: 95.89999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (fry-eng)
type: mteb/tatoeba-bitext-mining
config: fry-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 68.20809248554913
- type: f1
value: 63.431048720066066
- type: precision
value: 61.69143958161298
- type: recall
value: 68.20809248554913
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kur-eng)
type: mteb/tatoeba-bitext-mining
config: kur-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 71.21951219512195
- type: f1
value: 66.82926829268293
- type: precision
value: 65.1260162601626
- type: recall
value: 71.21951219512195
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tur-eng)
type: mteb/tatoeba-bitext-mining
config: tur-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.2
- type: f1
value: 96.26666666666667
- type: precision
value: 95.8
- type: recall
value: 97.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (deu-eng)
type: mteb/tatoeba-bitext-mining
config: deu-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 99.3
- type: f1
value: 99.06666666666666
- type: precision
value: 98.95
- type: recall
value: 99.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (nld-eng)
type: mteb/tatoeba-bitext-mining
config: nld-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.39999999999999
- type: f1
value: 96.63333333333333
- type: precision
value: 96.26666666666668
- type: recall
value: 97.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ron-eng)
type: mteb/tatoeba-bitext-mining
config: ron-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96
- type: f1
value: 94.86666666666666
- type: precision
value: 94.31666666666668
- type: recall
value: 96
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ang-eng)
type: mteb/tatoeba-bitext-mining
config: ang-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 47.01492537313433
- type: f1
value: 40.178867566927266
- type: precision
value: 38.179295828549556
- type: recall
value: 47.01492537313433
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ido-eng)
type: mteb/tatoeba-bitext-mining
config: ido-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 86.5
- type: f1
value: 83.62537480063796
- type: precision
value: 82.44555555555554
- type: recall
value: 86.5
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (jav-eng)
type: mteb/tatoeba-bitext-mining
config: jav-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 80.48780487804879
- type: f1
value: 75.45644599303138
- type: precision
value: 73.37398373983739
- type: recall
value: 80.48780487804879
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (isl-eng)
type: mteb/tatoeba-bitext-mining
config: isl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.7
- type: f1
value: 91.95666666666666
- type: precision
value: 91.125
- type: recall
value: 93.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (slv-eng)
type: mteb/tatoeba-bitext-mining
config: slv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.73754556500607
- type: f1
value: 89.65168084244632
- type: precision
value: 88.73025516403402
- type: recall
value: 91.73754556500607
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cym-eng)
type: mteb/tatoeba-bitext-mining
config: cym-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 81.04347826086956
- type: f1
value: 76.2128364389234
- type: precision
value: 74.2
- type: recall
value: 81.04347826086956
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kaz-eng)
type: mteb/tatoeba-bitext-mining
config: kaz-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 83.65217391304348
- type: f1
value: 79.4376811594203
- type: precision
value: 77.65797101449274
- type: recall
value: 83.65217391304348
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (est-eng)
type: mteb/tatoeba-bitext-mining
config: est-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.5
- type: f1
value: 85.02690476190476
- type: precision
value: 83.96261904761904
- type: recall
value: 87.5
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (heb-eng)
type: mteb/tatoeba-bitext-mining
config: heb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 89.3
- type: f1
value: 86.52333333333333
- type: precision
value: 85.22833333333332
- type: recall
value: 89.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (gla-eng)
type: mteb/tatoeba-bitext-mining
config: gla-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 65.01809408926418
- type: f1
value: 59.00594446432805
- type: precision
value: 56.827215807915444
- type: recall
value: 65.01809408926418
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (mar-eng)
type: mteb/tatoeba-bitext-mining
config: mar-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.2
- type: f1
value: 88.58
- type: precision
value: 87.33333333333334
- type: recall
value: 91.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (lat-eng)
type: mteb/tatoeba-bitext-mining
config: lat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 59.199999999999996
- type: f1
value: 53.299166276284915
- type: precision
value: 51.3383908045977
- type: recall
value: 59.199999999999996
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (bel-eng)
type: mteb/tatoeba-bitext-mining
config: bel-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.2
- type: f1
value: 91.2
- type: precision
value: 90.25
- type: recall
value: 93.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (pms-eng)
type: mteb/tatoeba-bitext-mining
config: pms-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 64.76190476190476
- type: f1
value: 59.867110667110666
- type: precision
value: 58.07390192653351
- type: recall
value: 64.76190476190476
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (gle-eng)
type: mteb/tatoeba-bitext-mining
config: gle-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.2
- type: f1
value: 71.48147546897547
- type: precision
value: 69.65409090909091
- type: recall
value: 76.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (pes-eng)
type: mteb/tatoeba-bitext-mining
config: pes-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.8
- type: f1
value: 92.14
- type: precision
value: 91.35833333333333
- type: recall
value: 93.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (nob-eng)
type: mteb/tatoeba-bitext-mining
config: nob-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.89999999999999
- type: f1
value: 97.2
- type: precision
value: 96.85000000000001
- type: recall
value: 97.89999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (bul-eng)
type: mteb/tatoeba-bitext-mining
config: bul-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.6
- type: f1
value: 92.93333333333334
- type: precision
value: 92.13333333333333
- type: recall
value: 94.6
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cbk-eng)
type: mteb/tatoeba-bitext-mining
config: cbk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 74.1
- type: f1
value: 69.14817460317461
- type: precision
value: 67.2515873015873
- type: recall
value: 74.1
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (hun-eng)
type: mteb/tatoeba-bitext-mining
config: hun-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95.19999999999999
- type: f1
value: 94.01333333333335
- type: precision
value: 93.46666666666667
- type: recall
value: 95.19999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (uig-eng)
type: mteb/tatoeba-bitext-mining
config: uig-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.9
- type: f1
value: 72.07523809523809
- type: precision
value: 70.19777777777779
- type: recall
value: 76.9
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (rus-eng)
type: mteb/tatoeba-bitext-mining
config: rus-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.1
- type: f1
value: 92.31666666666666
- type: precision
value: 91.43333333333332
- type: recall
value: 94.1
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (spa-eng)
type: mteb/tatoeba-bitext-mining
config: spa-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.8
- type: f1
value: 97.1
- type: precision
value: 96.76666666666668
- type: recall
value: 97.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (hye-eng)
type: mteb/tatoeba-bitext-mining
config: hye-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.85714285714286
- type: f1
value: 90.92093441150045
- type: precision
value: 90.00449236298293
- type: recall
value: 92.85714285714286
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tel-eng)
type: mteb/tatoeba-bitext-mining
config: tel-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.16239316239316
- type: f1
value: 91.33903133903132
- type: precision
value: 90.56267806267806
- type: recall
value: 93.16239316239316
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (afr-eng)
type: mteb/tatoeba-bitext-mining
config: afr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.4
- type: f1
value: 90.25666666666666
- type: precision
value: 89.25833333333334
- type: recall
value: 92.4
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (mon-eng)
type: mteb/tatoeba-bitext-mining
config: mon-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.22727272727272
- type: f1
value: 87.53030303030303
- type: precision
value: 86.37121212121211
- type: recall
value: 90.22727272727272
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (arz-eng)
type: mteb/tatoeba-bitext-mining
config: arz-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 79.03563941299791
- type: f1
value: 74.7349505840072
- type: precision
value: 72.9035639412998
- type: recall
value: 79.03563941299791
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (hrv-eng)
type: mteb/tatoeba-bitext-mining
config: hrv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97
- type: f1
value: 96.15
- type: precision
value: 95.76666666666668
- type: recall
value: 97
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (nov-eng)
type: mteb/tatoeba-bitext-mining
config: nov-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.26459143968872
- type: f1
value: 71.55642023346303
- type: precision
value: 69.7544932369835
- type: recall
value: 76.26459143968872
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (gsw-eng)
type: mteb/tatoeba-bitext-mining
config: gsw-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 58.119658119658126
- type: f1
value: 51.65242165242165
- type: precision
value: 49.41768108434775
- type: recall
value: 58.119658119658126
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (nds-eng)
type: mteb/tatoeba-bitext-mining
config: nds-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 74.3
- type: f1
value: 69.52055555555555
- type: precision
value: 67.7574938949939
- type: recall
value: 74.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ukr-eng)
type: mteb/tatoeba-bitext-mining
config: ukr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.8
- type: f1
value: 93.31666666666666
- type: precision
value: 92.60000000000001
- type: recall
value: 94.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (uzb-eng)
type: mteb/tatoeba-bitext-mining
config: uzb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.63551401869158
- type: f1
value: 72.35202492211837
- type: precision
value: 70.60358255451713
- type: recall
value: 76.63551401869158
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (lit-eng)
type: mteb/tatoeba-bitext-mining
config: lit-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.4
- type: f1
value: 88.4811111111111
- type: precision
value: 87.7452380952381
- type: recall
value: 90.4
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ina-eng)
type: mteb/tatoeba-bitext-mining
config: ina-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95
- type: f1
value: 93.60666666666667
- type: precision
value: 92.975
- type: recall
value: 95
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (lfn-eng)
type: mteb/tatoeba-bitext-mining
config: lfn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 67.2
- type: f1
value: 63.01595782872099
- type: precision
value: 61.596587301587306
- type: recall
value: 67.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (zsm-eng)
type: mteb/tatoeba-bitext-mining
config: zsm-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95.7
- type: f1
value: 94.52999999999999
- type: precision
value: 94
- type: recall
value: 95.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ita-eng)
type: mteb/tatoeba-bitext-mining
config: ita-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.6
- type: f1
value: 93.28999999999999
- type: precision
value: 92.675
- type: recall
value: 94.6
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cmn-eng)
type: mteb/tatoeba-bitext-mining
config: cmn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.39999999999999
- type: f1
value: 95.28333333333333
- type: precision
value: 94.75
- type: recall
value: 96.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (lvs-eng)
type: mteb/tatoeba-bitext-mining
config: lvs-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.9
- type: f1
value: 89.83
- type: precision
value: 88.92
- type: recall
value: 91.9
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (glg-eng)
type: mteb/tatoeba-bitext-mining
config: glg-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.69999999999999
- type: f1
value: 93.34222222222223
- type: precision
value: 92.75416666666668
- type: recall
value: 94.69999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ceb-eng)
type: mteb/tatoeba-bitext-mining
config: ceb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 60.333333333333336
- type: f1
value: 55.31203703703703
- type: precision
value: 53.39971108326371
- type: recall
value: 60.333333333333336
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (bre-eng)
type: mteb/tatoeba-bitext-mining
config: bre-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 12.9
- type: f1
value: 11.099861903031458
- type: precision
value: 10.589187932631877
- type: recall
value: 12.9
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ben-eng)
type: mteb/tatoeba-bitext-mining
config: ben-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 86.7
- type: f1
value: 83.0152380952381
- type: precision
value: 81.37833333333333
- type: recall
value: 86.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (swg-eng)
type: mteb/tatoeba-bitext-mining
config: swg-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 63.39285714285714
- type: f1
value: 56.832482993197274
- type: precision
value: 54.56845238095237
- type: recall
value: 63.39285714285714
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (arq-eng)
type: mteb/tatoeba-bitext-mining
config: arq-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 48.73765093304062
- type: f1
value: 41.555736920720456
- type: precision
value: 39.06874531737319
- type: recall
value: 48.73765093304062
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kab-eng)
type: mteb/tatoeba-bitext-mining
config: kab-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 41.099999999999994
- type: f1
value: 36.540165945165946
- type: precision
value: 35.05175685425686
- type: recall
value: 41.099999999999994
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (fra-eng)
type: mteb/tatoeba-bitext-mining
config: fra-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.89999999999999
- type: f1
value: 93.42333333333333
- type: precision
value: 92.75833333333333
- type: recall
value: 94.89999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (por-eng)
type: mteb/tatoeba-bitext-mining
config: por-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.89999999999999
- type: f1
value: 93.63333333333334
- type: precision
value: 93.01666666666665
- type: recall
value: 94.89999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tat-eng)
type: mteb/tatoeba-bitext-mining
config: tat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.9
- type: f1
value: 73.64833333333334
- type: precision
value: 71.90282106782105
- type: recall
value: 77.9
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (oci-eng)
type: mteb/tatoeba-bitext-mining
config: oci-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 59.4
- type: f1
value: 54.90521367521367
- type: precision
value: 53.432840025471606
- type: recall
value: 59.4
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (pol-eng)
type: mteb/tatoeba-bitext-mining
config: pol-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.39999999999999
- type: f1
value: 96.6
- type: precision
value: 96.2
- type: recall
value: 97.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (war-eng)
type: mteb/tatoeba-bitext-mining
config: war-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 67.2
- type: f1
value: 62.25926129426129
- type: precision
value: 60.408376623376626
- type: recall
value: 67.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (aze-eng)
type: mteb/tatoeba-bitext-mining
config: aze-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.2
- type: f1
value: 87.60666666666667
- type: precision
value: 86.45277777777778
- type: recall
value: 90.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (vie-eng)
type: mteb/tatoeba-bitext-mining
config: vie-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 97.7
- type: f1
value: 97
- type: precision
value: 96.65
- type: recall
value: 97.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (nno-eng)
type: mteb/tatoeba-bitext-mining
config: nno-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.2
- type: f1
value: 91.39746031746031
- type: precision
value: 90.6125
- type: recall
value: 93.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cha-eng)
type: mteb/tatoeba-bitext-mining
config: cha-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 32.11678832116788
- type: f1
value: 27.210415386260234
- type: precision
value: 26.20408990846947
- type: recall
value: 32.11678832116788
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (mhr-eng)
type: mteb/tatoeba-bitext-mining
config: mhr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.5
- type: f1
value: 6.787319277832475
- type: precision
value: 6.3452094433344435
- type: recall
value: 8.5
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (dan-eng)
type: mteb/tatoeba-bitext-mining
config: dan-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.1
- type: f1
value: 95.08
- type: precision
value: 94.61666666666667
- type: recall
value: 96.1
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ell-eng)
type: mteb/tatoeba-bitext-mining
config: ell-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95.3
- type: f1
value: 93.88333333333333
- type: precision
value: 93.18333333333332
- type: recall
value: 95.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (amh-eng)
type: mteb/tatoeba-bitext-mining
config: amh-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 85.11904761904762
- type: f1
value: 80.69444444444444
- type: precision
value: 78.72023809523809
- type: recall
value: 85.11904761904762
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (pam-eng)
type: mteb/tatoeba-bitext-mining
config: pam-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 11.1
- type: f1
value: 9.276381801735853
- type: precision
value: 8.798174603174601
- type: recall
value: 11.1
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (hsb-eng)
type: mteb/tatoeba-bitext-mining
config: hsb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 63.56107660455487
- type: f1
value: 58.70433569191332
- type: precision
value: 56.896926581464015
- type: recall
value: 63.56107660455487
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (srp-eng)
type: mteb/tatoeba-bitext-mining
config: srp-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.69999999999999
- type: f1
value: 93.10000000000001
- type: precision
value: 92.35
- type: recall
value: 94.69999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (epo-eng)
type: mteb/tatoeba-bitext-mining
config: epo-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.8
- type: f1
value: 96.01222222222222
- type: precision
value: 95.67083333333332
- type: recall
value: 96.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kzj-eng)
type: mteb/tatoeba-bitext-mining
config: kzj-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 9.2
- type: f1
value: 7.911555250305249
- type: precision
value: 7.631246556216846
- type: recall
value: 9.2
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (awa-eng)
type: mteb/tatoeba-bitext-mining
config: awa-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.48917748917748
- type: f1
value: 72.27375798804371
- type: precision
value: 70.14430014430013
- type: recall
value: 77.48917748917748
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (fao-eng)
type: mteb/tatoeba-bitext-mining
config: fao-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.09923664122137
- type: f1
value: 72.61541257724463
- type: precision
value: 70.8998380754106
- type: recall
value: 77.09923664122137
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (mal-eng)
type: mteb/tatoeba-bitext-mining
config: mal-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 98.2532751091703
- type: f1
value: 97.69529354682193
- type: precision
value: 97.42843279961184
- type: recall
value: 98.2532751091703
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ile-eng)
type: mteb/tatoeba-bitext-mining
config: ile-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 82.8
- type: f1
value: 79.14672619047619
- type: precision
value: 77.59489247311828
- type: recall
value: 82.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (bos-eng)
type: mteb/tatoeba-bitext-mining
config: bos-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.35028248587571
- type: f1
value: 92.86252354048965
- type: precision
value: 92.2080979284369
- type: recall
value: 94.35028248587571
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cor-eng)
type: mteb/tatoeba-bitext-mining
config: cor-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.5
- type: f1
value: 6.282429263935621
- type: precision
value: 5.783274240739785
- type: recall
value: 8.5
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (cat-eng)
type: mteb/tatoeba-bitext-mining
config: cat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.7
- type: f1
value: 91.025
- type: precision
value: 90.30428571428571
- type: recall
value: 92.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (eus-eng)
type: mteb/tatoeba-bitext-mining
config: eus-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 81
- type: f1
value: 77.8232380952381
- type: precision
value: 76.60194444444444
- type: recall
value: 81
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (yue-eng)
type: mteb/tatoeba-bitext-mining
config: yue-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91
- type: f1
value: 88.70857142857142
- type: precision
value: 87.7
- type: recall
value: 91
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (swe-eng)
type: mteb/tatoeba-bitext-mining
config: swe-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.39999999999999
- type: f1
value: 95.3
- type: precision
value: 94.76666666666667
- type: recall
value: 96.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (dtp-eng)
type: mteb/tatoeba-bitext-mining
config: dtp-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.1
- type: f1
value: 7.001008218834307
- type: precision
value: 6.708329562594269
- type: recall
value: 8.1
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kat-eng)
type: mteb/tatoeba-bitext-mining
config: kat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.1313672922252
- type: f1
value: 84.09070598748882
- type: precision
value: 82.79171454104429
- type: recall
value: 87.1313672922252
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (jpn-eng)
type: mteb/tatoeba-bitext-mining
config: jpn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.39999999999999
- type: f1
value: 95.28333333333333
- type: precision
value: 94.73333333333332
- type: recall
value: 96.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (csb-eng)
type: mteb/tatoeba-bitext-mining
config: csb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 42.29249011857708
- type: f1
value: 36.981018542283365
- type: precision
value: 35.415877813576024
- type: recall
value: 42.29249011857708
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (xho-eng)
type: mteb/tatoeba-bitext-mining
config: xho-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 83.80281690140845
- type: f1
value: 80.86854460093896
- type: precision
value: 79.60093896713614
- type: recall
value: 83.80281690140845
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (orv-eng)
type: mteb/tatoeba-bitext-mining
config: orv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 45.26946107784431
- type: f1
value: 39.80235464678088
- type: precision
value: 38.14342660001342
- type: recall
value: 45.26946107784431
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ind-eng)
type: mteb/tatoeba-bitext-mining
config: ind-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.3
- type: f1
value: 92.9
- type: precision
value: 92.26666666666668
- type: recall
value: 94.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tuk-eng)
type: mteb/tatoeba-bitext-mining
config: tuk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 37.93103448275862
- type: f1
value: 33.15192743764172
- type: precision
value: 31.57456528146183
- type: recall
value: 37.93103448275862
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (max-eng)
type: mteb/tatoeba-bitext-mining
config: max-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 69.01408450704226
- type: f1
value: 63.41549295774648
- type: precision
value: 61.342778895595806
- type: recall
value: 69.01408450704226
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (swh-eng)
type: mteb/tatoeba-bitext-mining
config: swh-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.66666666666667
- type: f1
value: 71.60705960705961
- type: precision
value: 69.60683760683762
- type: recall
value: 76.66666666666667
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (hin-eng)
type: mteb/tatoeba-bitext-mining
config: hin-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 95.8
- type: f1
value: 94.48333333333333
- type: precision
value: 93.83333333333333
- type: recall
value: 95.8
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (dsb-eng)
type: mteb/tatoeba-bitext-mining
config: dsb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 52.81837160751566
- type: f1
value: 48.435977731384824
- type: precision
value: 47.11291973845539
- type: recall
value: 52.81837160751566
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ber-eng)
type: mteb/tatoeba-bitext-mining
config: ber-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 44.9
- type: f1
value: 38.88962621607783
- type: precision
value: 36.95936507936508
- type: recall
value: 44.9
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tam-eng)
type: mteb/tatoeba-bitext-mining
config: tam-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.55374592833876
- type: f1
value: 88.22553125484721
- type: precision
value: 87.26927252985884
- type: recall
value: 90.55374592833876
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (slk-eng)
type: mteb/tatoeba-bitext-mining
config: slk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.6
- type: f1
value: 93.13333333333333
- type: precision
value: 92.45333333333333
- type: recall
value: 94.6
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tgl-eng)
type: mteb/tatoeba-bitext-mining
config: tgl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 93.7
- type: f1
value: 91.99666666666667
- type: precision
value: 91.26666666666668
- type: recall
value: 93.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ast-eng)
type: mteb/tatoeba-bitext-mining
config: ast-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 85.03937007874016
- type: f1
value: 81.75853018372703
- type: precision
value: 80.34120734908137
- type: recall
value: 85.03937007874016
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (mkd-eng)
type: mteb/tatoeba-bitext-mining
config: mkd-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.3
- type: f1
value: 85.5
- type: precision
value: 84.25833333333334
- type: recall
value: 88.3
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (khm-eng)
type: mteb/tatoeba-bitext-mining
config: khm-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 65.51246537396122
- type: f1
value: 60.02297410192148
- type: precision
value: 58.133467727289236
- type: recall
value: 65.51246537396122
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ces-eng)
type: mteb/tatoeba-bitext-mining
config: ces-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96
- type: f1
value: 94.89
- type: precision
value: 94.39166666666667
- type: recall
value: 96
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tzl-eng)
type: mteb/tatoeba-bitext-mining
config: tzl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 57.692307692307686
- type: f1
value: 53.162393162393165
- type: precision
value: 51.70673076923077
- type: recall
value: 57.692307692307686
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (urd-eng)
type: mteb/tatoeba-bitext-mining
config: urd-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.60000000000001
- type: f1
value: 89.21190476190475
- type: precision
value: 88.08666666666667
- type: recall
value: 91.60000000000001
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (ara-eng)
type: mteb/tatoeba-bitext-mining
config: ara-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88
- type: f1
value: 85.47
- type: precision
value: 84.43266233766234
- type: recall
value: 88
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (kor-eng)
type: mteb/tatoeba-bitext-mining
config: kor-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.7
- type: f1
value: 90.64999999999999
- type: precision
value: 89.68333333333332
- type: recall
value: 92.7
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (yid-eng)
type: mteb/tatoeba-bitext-mining
config: yid-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 80.30660377358491
- type: f1
value: 76.33044137466307
- type: precision
value: 74.78970125786164
- type: recall
value: 80.30660377358491
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (fin-eng)
type: mteb/tatoeba-bitext-mining
config: fin-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.39999999999999
- type: f1
value: 95.44
- type: precision
value: 94.99166666666666
- type: recall
value: 96.39999999999999
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (tha-eng)
type: mteb/tatoeba-bitext-mining
config: tha-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.53284671532847
- type: f1
value: 95.37712895377129
- type: precision
value: 94.7992700729927
- type: recall
value: 96.53284671532847
- task:
type: BitextMining
dataset:
name: MTEB Tatoeba (wuu-eng)
type: mteb/tatoeba-bitext-mining
config: wuu-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 89
- type: f1
value: 86.23190476190476
- type: precision
value: 85.035
- type: recall
value: 89
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.585
- type: map_at_10
value: 9.012
- type: map_at_100
value: 14.027000000000001
- type: map_at_1000
value: 15.565000000000001
- type: map_at_3
value: 5.032
- type: map_at_5
value: 6.657
- type: mrr_at_1
value: 28.571
- type: mrr_at_10
value: 45.377
- type: mrr_at_100
value: 46.119
- type: mrr_at_1000
value: 46.127
- type: mrr_at_3
value: 41.156
- type: mrr_at_5
value: 42.585
- type: ndcg_at_1
value: 27.551
- type: ndcg_at_10
value: 23.395
- type: ndcg_at_100
value: 33.342
- type: ndcg_at_1000
value: 45.523
- type: ndcg_at_3
value: 25.158
- type: ndcg_at_5
value: 23.427
- type: precision_at_1
value: 28.571
- type: precision_at_10
value: 21.429000000000002
- type: precision_at_100
value: 6.714
- type: precision_at_1000
value: 1.473
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 2.585
- type: recall_at_10
value: 15.418999999999999
- type: recall_at_100
value: 42.485
- type: recall_at_1000
value: 79.536
- type: recall_at_3
value: 6.239999999999999
- type: recall_at_5
value: 8.996
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.3234
- type: ap
value: 14.361688653847423
- type: f1
value: 54.819068624319044
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.97792869269949
- type: f1
value: 62.28965628513728
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 38.90540145385218
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.53513739047506
- type: cos_sim_ap
value: 75.27741586677557
- type: cos_sim_f1
value: 69.18792902473774
- type: cos_sim_precision
value: 67.94708725515136
- type: cos_sim_recall
value: 70.47493403693932
- type: dot_accuracy
value: 84.7052512368123
- type: dot_ap
value: 69.36075482849378
- type: dot_f1
value: 64.44688376631296
- type: dot_precision
value: 59.92288500793831
- type: dot_recall
value: 69.70976253298153
- type: euclidean_accuracy
value: 86.60666388508076
- type: euclidean_ap
value: 75.47512772621097
- type: euclidean_f1
value: 69.413872536473
- type: euclidean_precision
value: 67.39562624254472
- type: euclidean_recall
value: 71.55672823218997
- type: manhattan_accuracy
value: 86.52917684925792
- type: manhattan_ap
value: 75.34000110496703
- type: manhattan_f1
value: 69.28489190226429
- type: manhattan_precision
value: 67.24608889992551
- type: manhattan_recall
value: 71.45118733509234
- type: max_accuracy
value: 86.60666388508076
- type: max_ap
value: 75.47512772621097
- type: max_f1
value: 69.413872536473
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.01695967710637
- type: cos_sim_ap
value: 85.8298270742901
- type: cos_sim_f1
value: 78.46988128389272
- type: cos_sim_precision
value: 74.86017897091722
- type: cos_sim_recall
value: 82.44533415460425
- type: dot_accuracy
value: 88.19420188613343
- type: dot_ap
value: 83.82679165901324
- type: dot_f1
value: 76.55833777304208
- type: dot_precision
value: 75.6884875846501
- type: dot_recall
value: 77.44841392054204
- type: euclidean_accuracy
value: 89.03054294252338
- type: euclidean_ap
value: 85.89089555185325
- type: euclidean_f1
value: 78.62997658079624
- type: euclidean_precision
value: 74.92329149232914
- type: euclidean_recall
value: 82.72251308900523
- type: manhattan_accuracy
value: 89.0266620095471
- type: manhattan_ap
value: 85.86458997929147
- type: manhattan_f1
value: 78.50685331000291
- type: manhattan_precision
value: 74.5499861534201
- type: manhattan_recall
value: 82.90729904527257
- type: max_accuracy
value: 89.03054294252338
- type: max_ap
value: 85.89089555185325
- type: max_f1
value: 78.62997658079624
---
# multilingual-e5-large-mlx
This model was converted to MLX format from [`intfloat/multilingual-e5-large`](https://huggingface.co/intfloat/multilingual-e5-large).
Refer to the [original model card](https://huggingface.co/intfloat/multilingual-e5-large) for more details on the model.
## Use with mlx
```bash
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model mlx-community/multilingual-e5-large-mlx --prompt "My name is"
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
m42-health/Llama3-Med42-8B | m42-health | text-generation | [
"transformers",
"safetensors",
"llama",
"text-generation",
"m42",
"health",
"healthcare",
"clinical-llm",
"conversational",
"en",
"arxiv:2408.06142",
"license:llama3",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] | 2024-07-02T10:14:40 | 2024-08-20T05:12:05 | 1,966 | 62 | ---
language:
- en
license: llama3
license_name: llama3
pipeline_tag: text-generation
tags:
- m42
- health
- healthcare
- clinical-llm
inference: false
---
# **Med42-v2 - A Suite of Clinically-aligned Large Language Models**
Med42-v2 is a suite of open-access clinical large language models (LLM) instruct and preference-tuned by M42 to expand access to medical knowledge. Built off LLaMA-3 and comprising either 8 or 70 billion parameters, these generative AI systems provide high-quality answers to medical questions.
## Key performance metrics:
- Med42-v2-70B outperforms GPT-4.0 in most of the MCQA tasks.
- Med42-v2-70B achieves a MedQA zero-shot performance of 79.10, surpassing the prior state-of-the-art among all openly available medical LLMs.
- Med42-v2-70B sits at the top of the Clinical Elo Rating Leaderboard.
|Models|Elo Score|
|:---:|:---:|
|**Med42-v2-70B**| 1764 |
|Llama3-70B-Instruct| 1643 |
|GPT4-o| 1426 |
|Llama3-8B-Instruct| 1352 |
|Mixtral-8x7b-Instruct| 970 |
|**Med42-v2-8B**| 924 |
|OpenBioLLM-70B| 657 |
|JSL-MedLlama-3-8B-v2.0| 447 |
## Limitations & Safe Use
- The Med42-v2 suite of models is not ready for real clinical use. Extensive human evaluation is still underway, as it is required to ensure safety.
- Potential for generating incorrect or harmful information.
- Risk of perpetuating biases in training data.
Use this suite of models responsibly! Do not rely on them for medical usage without rigorous safety testing.
## Model Details
*Disclaimer: This large language model is not yet ready for clinical use without further testing and validation. It should not be relied upon for making medical decisions or providing patient care.*
Starting from the Llama3 models, the Med42-v2 suite was instruction-tuned on a dataset of ~1B tokens compiled from different open-access, high-quality sources, including medical flashcards, exam questions, and open-domain dialogues.
**Model Developers:** M42 Health AI Team
**Finetuned from model:** Llama3 - 8B & 70B Instruct
**Context length:** 8k tokens
**Input:** Text only data
**Output:** Model generates text only
**Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
**License:** Llama 3 Community License Agreement
**Research Paper:** [Med42-v2: A Suite of Clinical LLMs](https://huggingface.co/papers/2408.06142)
## Intended Use
The Med42-v2 suite of models is being made available for further testing and assessment as AI assistants to enhance clinical decision-making and access to LLMs for healthcare use. Potential use cases include:
- Medical question answering
- Patient record summarization
- Aiding medical diagnosis
- General health Q&A
**Run the model**
You can use the 🤗 Transformers library `text-generation` pipeline to do inference.
```python
import transformers
import torch
model_name_or_path = "m42-health/Llama3-Med42-8B"
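# Load the instruction-tuned checkpoint in bfloat16 and shard it across available GPUs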
pipeline = transformers.pipeline(
"text-generation",
model=model_name_or_path,
torch_dtype=torch.bfloat16,
device_map="auto",
)
messages = [
{
"role": "system",
"content": (
"You are a helpful, respectful and honest medical assistant. You are a second version of Med42 developed by the AI team at M42, UAE. "
"Always answer as helpfully as possible, while being safe. "
"Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. "
"Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. "
"If you don’t know the answer to a question, please don’t share false information."
),
},
{"role": "user", "content": "What are the symptoms of diabetes?"},
]
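# Render the conversation into a single prompt string using the model's chat template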
prompt = pipeline.tokenizer.apply_chat_template(
messages, tokenize=False, add_generation_prompt=False
)
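# Stop generation at either the end-of-sequence token or Llama-3's end-of-turn token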
stop_tokens = [
pipeline.tokenizer.eos_token_id,
pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]
outputs = pipeline(
prompt,
max_new_tokens=512,
eos_token_id=stop_tokens,
do_sample=True,
temperature=0.4,
top_k=150,
top_p=0.75,
)
print(outputs[0]["generated_text"][len(prompt) :])
```
## Hardware and Software
The training was conducted on the NVIDIA DGX cluster with H100 GPUs, utilizing PyTorch's Fully Sharded Data Parallel (FSDP) framework.
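For context, the sketch below shows a minimal FSDP setup for a causal LM (launched with `torchrun`). The base checkpoint name, learning rate, and wrapping policy are illustrative placeholders only and do not reproduce the actual Med42-v2 training configuration.
```python
import os

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from transformers import AutoModelForCausalLM

# Launch with e.g.: torchrun --nproc_per_node=8 train_fsdp.py
dist.init_process_group("nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

# Illustrative base checkpoint; Med42-v2 is tuned from Llama3 Instruct models.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B-Instruct", torch_dtype=torch.bfloat16
)
# FSDP shards parameters, gradients, and optimizer state across the GPUs.
model = FSDP(model, device_id=local_rank)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
# ... standard causal-LM training loop over the instruction-tuning data ...
```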
## Evaluation Results
### Open-ended question generation
To ensure a robust evaluation of our model's output quality, we employ the LLM-as-a-Judge approach using Prometheus-8x7b-v2.0. Our assessment uses 4,000 carefully curated publicly accessible healthcare-related questions, generating responses from various models. We then use Prometheus to conduct pairwise comparisons of the answers. Drawing inspiration from the LMSYS Chatbot-Arena methodology, we present the results as Elo ratings for each model.
To maintain fairness and eliminate potential bias from prompt engineering, we used the same simple system prompt for every model throughout the evaluation process.
Below is the scoring rubric we used to prompt Prometheus to select the best answer:
```
### Score Rubric:
Which response is of higher overall quality in a medical context? Consider:
* Relevance: Does it directly address the question?
* Completeness: Does it cover all important aspects, details and subpoints?
* Safety: Does it avoid unsafe practices and address potential risks?
* Ethics: Does it maintain confidentiality and avoid biases?
* Clarity: Is it professional, clear and easy to understand?
```
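As a rough illustration of how such pairwise verdicts translate into ratings, the sketch below applies the standard online Elo update to a handful of judge decisions. The `comparisons` triples and the K-factor are placeholders for illustration only; they are not the exact data or fitting procedure behind the leaderboard.
```python
from collections import defaultdict

def update_elo(ratings, model_a, model_b, winner, k=32.0):
    """Standard online Elo update after one pairwise comparison."""
    ra, rb = ratings[model_a], ratings[model_b]
    expected_a = 1.0 / (1.0 + 10 ** ((rb - ra) / 400.0))
    score_a = 1.0 if winner == model_a else (0.5 if winner == "tie" else 0.0)
    ratings[model_a] = ra + k * (score_a - expected_a)
    ratings[model_b] = rb + k * ((1.0 - score_a) - (1.0 - expected_a))

# Each triple is (model_a, model_b, winner) as decided by the judge model.
comparisons = [
    ("Med42-v2-70B", "Llama3-70B-Instruct", "Med42-v2-70B"),
    ("Med42-v2-8B", "Mixtral-8x7b-Instruct", "Med42-v2-8B"),
]
ratings = defaultdict(lambda: 1000.0)  # every model starts from the same baseline
for a, b, w in comparisons:
    update_elo(ratings, a, b, w)
print({m: round(r, 1) for m, r in ratings.items()})
```
Arena-style leaderboards typically average over many shuffled orderings (or fit a Bradley–Terry model) rather than relying on a single pass, so treat this purely as an illustration of the rating mechanics.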
#### Elo Ratings
|Models|Elo Score|
|:---:|:---:|
|**Med42-v2-70B**| 1764 |
|Llama3-70B-Instruct| 1643 |
|GPT4-o| 1426 |
|Llama3-8B-Instruct| 1352 |
|Mixtral-8x7b-Instruct| 970 |
|**Med42-v2-8B**| 924 |
|OpenBioLLM-70B| 657 |
|JSL-MedLlama-3-8B-v2.0| 447 |
#### Win-rate

### MCQA Evaluation
Med42-v2 improves performance on every clinical benchmark compared to our previous version, including MedQA, MedMCQA, USMLE, MMLU clinical topics, and the MMLU Pro clinical subset. For all evaluations reported so far, we use [EleutherAI's evaluation harness library](https://github.com/EleutherAI/lm-evaluation-harness) and report zero-shot accuracies (unless otherwise stated). We integrated chat templates into the harness and computed the likelihood of the full answer text instead of only the option tokens "a.", "b.", "c.", or "d." (a minimal sketch of this scoring appears below the results table).
|Model|MMLU Pro|MMLU|MedMCQA|MedQA|USMLE|
|---:|:---:|:---:|:---:|:---:|:---:|
|**Med42v2-70B**|64.36|87.12|73.20|79.10|83.80|
|**Med42v2-8B**|54.30|75.76|61.34|62.84|67.04|
|OpenBioLLM-70B|64.24|90.40|73.18|76.90|79.01|
|GPT-4.0<sup>†</sup>|-|87.00|69.50|78.90|84.05|
|MedGemini*|-|-|-|84.00|-|
|Med-PaLM-2 (5-shot)*|-|87.77|71.30|79.70|-|
|Med42|-|76.72|60.90|61.50|71.85|
|ClinicalCamel-70B|-|69.75|47.00|53.40|54.30|
|GPT-3.5<sup>†</sup>|-|66.63|50.10|50.80|53.00|
|Llama3-8B-Instruct|48.24|72.89|59.65|61.64|60.38|
|Llama3-70B-Instruct|64.24|85.99|72.03|78.88|83.57|
**For MedGemini, results are reported for MedQA without self-training and without search. We note that 0-shot performance is not reported for Med-PaLM 2. Further details can be found at [https://github.com/m42health/med42](https://github.com/m42health/med42)*.
<sup>†</sup> *Results as reported in the paper [Capabilities of GPT-4 on Medical Challenge Problems](https://www.microsoft.com/en-us/research/uploads/prod/2023/03/GPT-4_medical_benchmarks.pdf)*.
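To make the scoring above concrete, here is a minimal sketch of full-answer likelihood scoring with 🤗 Transformers: each option's complete answer text is scored by the sum of its token log-probabilities conditioned on the question, and the highest-scoring option is selected. The prompt formatting, helper names, and toy question are ours for illustration; they do not reproduce the harness integration (which, for example, also applies the chat template).
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "m42-health/Llama3-Med42-8B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto"
)

def answer_loglikelihood(question: str, answer: str) -> float:
    """Summed log-probability of the answer tokens, conditioned on the question."""
    prompt_ids = tokenizer(question, return_tensors="pt").input_ids
    full_ids = tokenizer(question + " " + answer, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits[:, :-1, :].float(), dim=-1)
    targets = full_ids[:, 1:]
    token_lp = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    # Score only the answer continuation, not the question tokens themselves.
    return token_lp[0, prompt_ids.shape[1] - 1 :].sum().item()

# Toy example for illustration.
question = "Which vitamin deficiency causes scurvy?"
options = {"a": "Vitamin A", "b": "Vitamin B12", "c": "Vitamin C", "d": "Vitamin D"}
scores = {key: answer_loglikelihood(question, f"{key}. {text}") for key, text in options.items()}
print(max(scores, key=scores.get))
```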
## Accessing Med42 and Reporting Issues
Please report any software bugs or other problems through one of the following means:
- Reporting issues with the model: [https://github.com/m42health/med42](https://github.com/m42health/med42)
- Reporting risky content generated by the model, bugs and/or any security concerns: [https://forms.office.com/r/fPY4Ksecgf](https://forms.office.com/r/fPY4Ksecgf)
- M42’s privacy policy available at [https://m42.ae/privacy-policy/](https://m42.ae/privacy-policy/)
- Reporting violations of the Acceptable Use Policy or unlicensed uses of Med42: <[email protected]>
## Acknowledgements
We thank the Torch FSDP team for their robust distributed training framework, the EleutherAI harness team for their valuable evaluation tools, and the Hugging Face Alignment team for their contributions to responsible AI development.
## Citation
```
@misc{med42v2,
Author = {Cl{\'e}ment Christophe and Praveen K Kanithi and Tathagata Raha and Shadab Khan and Marco AF Pimentel},
Title = {Med42-v2: A Suite of Clinical LLMs},
Year = {2024},
Eprint = {arXiv:2408.06142},
url={https://arxiv.org/abs/2408.06142},
}
```
| [
"QUESTION_ANSWERING",
"SUMMARIZATION"
] | [
"MEDQA"
] |
GritLM/GritLM-8x7B | GritLM | text-generation | [
"transformers",
"pytorch",
"safetensors",
"mixtral",
"text-generation",
"mteb",
"conversational",
"custom_code",
"dataset:GritLM/tulu2",
"arxiv:2402.09906",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-02-11T16:02:26 | 2024-02-16T10:14:34 | 1,829 | 35 | ---
datasets:
- GritLM/tulu2
license: apache-2.0
pipeline_tag: text-generation
tags:
- mteb
inference: true
model-index:
- name: GritLM-8x7B
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 80.47761194029852
- type: ap
value: 44.38751347932197
- type: f1
value: 74.33580162208256
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 96.32155000000002
- type: ap
value: 94.8026654593679
- type: f1
value: 96.3209869463974
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 57.18400000000001
- type: f1
value: 55.945160479400954
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 34.353
- type: map_at_10
value: 50.773
- type: map_at_100
value: 51.515
- type: map_at_1000
value: 51.517
- type: map_at_3
value: 46.29
- type: map_at_5
value: 48.914
- type: mrr_at_1
value: 35.135
- type: mrr_at_10
value: 51.036
- type: mrr_at_100
value: 51.785000000000004
- type: mrr_at_1000
value: 51.787000000000006
- type: mrr_at_3
value: 46.562
- type: mrr_at_5
value: 49.183
- type: ndcg_at_1
value: 34.353
- type: ndcg_at_10
value: 59.492
- type: ndcg_at_100
value: 62.395999999999994
- type: ndcg_at_1000
value: 62.44499999999999
- type: ndcg_at_3
value: 50.217
- type: ndcg_at_5
value: 54.98499999999999
- type: precision_at_1
value: 34.353
- type: precision_at_10
value: 8.72
- type: precision_at_100
value: 0.993
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 20.531
- type: precision_at_5
value: 14.651
- type: recall_at_1
value: 34.353
- type: recall_at_10
value: 87.198
- type: recall_at_100
value: 99.289
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 61.592999999999996
- type: recall_at_5
value: 73.257
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 50.720077577006286
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 48.01021098734129
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 65.59672236627206
- type: mrr
value: 78.01191575429802
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 89.52452252271826
- type: cos_sim_spearman
value: 87.34415887061094
- type: euclidean_pearson
value: 87.46187616533932
- type: euclidean_spearman
value: 85.44712769366146
- type: manhattan_pearson
value: 87.56696679505373
- type: manhattan_spearman
value: 86.01581535039067
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 87.4577922077922
- type: f1
value: 87.38432712848123
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 41.41290357360428
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 38.67213605633667
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 37.545
- type: map_at_10
value: 50.015
- type: map_at_100
value: 51.763999999999996
- type: map_at_1000
value: 51.870000000000005
- type: map_at_3
value: 46.129999999999995
- type: map_at_5
value: 48.473
- type: mrr_at_1
value: 47.638999999999996
- type: mrr_at_10
value: 56.913000000000004
- type: mrr_at_100
value: 57.619
- type: mrr_at_1000
value: 57.648999999999994
- type: mrr_at_3
value: 54.435
- type: mrr_at_5
value: 56.059000000000005
- type: ndcg_at_1
value: 47.638999999999996
- type: ndcg_at_10
value: 56.664
- type: ndcg_at_100
value: 62.089000000000006
- type: ndcg_at_1000
value: 63.415
- type: ndcg_at_3
value: 51.842999999999996
- type: ndcg_at_5
value: 54.30199999999999
- type: precision_at_1
value: 47.638999999999996
- type: precision_at_10
value: 10.886999999999999
- type: precision_at_100
value: 1.722
- type: precision_at_1000
value: 0.212
- type: precision_at_3
value: 25.179000000000002
- type: precision_at_5
value: 18.226
- type: recall_at_1
value: 37.545
- type: recall_at_10
value: 68.118
- type: recall_at_100
value: 90.381
- type: recall_at_1000
value: 98.556
- type: recall_at_3
value: 53.319
- type: recall_at_5
value: 60.574
- type: map_at_1
value: 37.066
- type: map_at_10
value: 49.464000000000006
- type: map_at_100
value: 50.79900000000001
- type: map_at_1000
value: 50.928
- type: map_at_3
value: 46.133
- type: map_at_5
value: 47.941
- type: mrr_at_1
value: 48.025
- type: mrr_at_10
value: 56.16100000000001
- type: mrr_at_100
value: 56.725
- type: mrr_at_1000
value: 56.757000000000005
- type: mrr_at_3
value: 54.31
- type: mrr_at_5
value: 55.285
- type: ndcg_at_1
value: 48.025
- type: ndcg_at_10
value: 55.467
- type: ndcg_at_100
value: 59.391000000000005
- type: ndcg_at_1000
value: 61.086
- type: ndcg_at_3
value: 51.733
- type: ndcg_at_5
value: 53.223
- type: precision_at_1
value: 48.025
- type: precision_at_10
value: 10.656
- type: precision_at_100
value: 1.6070000000000002
- type: precision_at_1000
value: 0.20600000000000002
- type: precision_at_3
value: 25.499
- type: precision_at_5
value: 17.771
- type: recall_at_1
value: 37.066
- type: recall_at_10
value: 65.062
- type: recall_at_100
value: 81.662
- type: recall_at_1000
value: 91.913
- type: recall_at_3
value: 52.734
- type: recall_at_5
value: 57.696999999999996
- type: map_at_1
value: 46.099000000000004
- type: map_at_10
value: 59.721999999999994
- type: map_at_100
value: 60.675000000000004
- type: map_at_1000
value: 60.708
- type: map_at_3
value: 55.852000000000004
- type: map_at_5
value: 58.426
- type: mrr_at_1
value: 53.417
- type: mrr_at_10
value: 63.597
- type: mrr_at_100
value: 64.12299999999999
- type: mrr_at_1000
value: 64.13799999999999
- type: mrr_at_3
value: 61.149
- type: mrr_at_5
value: 62.800999999999995
- type: ndcg_at_1
value: 53.417
- type: ndcg_at_10
value: 65.90899999999999
- type: ndcg_at_100
value: 69.312
- type: ndcg_at_1000
value: 69.89
- type: ndcg_at_3
value: 60.089999999999996
- type: ndcg_at_5
value: 63.575
- type: precision_at_1
value: 53.417
- type: precision_at_10
value: 10.533
- type: precision_at_100
value: 1.313
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 26.667
- type: precision_at_5
value: 18.671
- type: recall_at_1
value: 46.099000000000004
- type: recall_at_10
value: 80.134
- type: recall_at_100
value: 94.536
- type: recall_at_1000
value: 98.543
- type: recall_at_3
value: 65.026
- type: recall_at_5
value: 73.462
- type: map_at_1
value: 28.261999999999997
- type: map_at_10
value: 38.012
- type: map_at_100
value: 39.104
- type: map_at_1000
value: 39.177
- type: map_at_3
value: 35.068
- type: map_at_5
value: 36.620000000000005
- type: mrr_at_1
value: 30.847
- type: mrr_at_10
value: 40.251999999999995
- type: mrr_at_100
value: 41.174
- type: mrr_at_1000
value: 41.227999999999994
- type: mrr_at_3
value: 37.74
- type: mrr_at_5
value: 38.972
- type: ndcg_at_1
value: 30.847
- type: ndcg_at_10
value: 43.513000000000005
- type: ndcg_at_100
value: 48.771
- type: ndcg_at_1000
value: 50.501
- type: ndcg_at_3
value: 37.861
- type: ndcg_at_5
value: 40.366
- type: precision_at_1
value: 30.847
- type: precision_at_10
value: 6.7909999999999995
- type: precision_at_100
value: 0.992
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 16.234
- type: precision_at_5
value: 11.254
- type: recall_at_1
value: 28.261999999999997
- type: recall_at_10
value: 58.292
- type: recall_at_100
value: 82.24000000000001
- type: recall_at_1000
value: 95.042
- type: recall_at_3
value: 42.955
- type: recall_at_5
value: 48.973
- type: map_at_1
value: 18.281
- type: map_at_10
value: 27.687
- type: map_at_100
value: 28.9
- type: map_at_1000
value: 29.019000000000002
- type: map_at_3
value: 24.773
- type: map_at_5
value: 26.180999999999997
- type: mrr_at_1
value: 23.01
- type: mrr_at_10
value: 32.225
- type: mrr_at_100
value: 33.054
- type: mrr_at_1000
value: 33.119
- type: mrr_at_3
value: 29.353
- type: mrr_at_5
value: 30.846
- type: ndcg_at_1
value: 23.01
- type: ndcg_at_10
value: 33.422000000000004
- type: ndcg_at_100
value: 39.108
- type: ndcg_at_1000
value: 41.699999999999996
- type: ndcg_at_3
value: 28.083999999999996
- type: ndcg_at_5
value: 30.164
- type: precision_at_1
value: 23.01
- type: precision_at_10
value: 6.493
- type: precision_at_100
value: 1.077
- type: precision_at_1000
value: 0.14100000000000001
- type: precision_at_3
value: 13.930000000000001
- type: precision_at_5
value: 10.075000000000001
- type: recall_at_1
value: 18.281
- type: recall_at_10
value: 46.318
- type: recall_at_100
value: 71.327
- type: recall_at_1000
value: 89.716
- type: recall_at_3
value: 31.517
- type: recall_at_5
value: 36.821
- type: map_at_1
value: 36.575
- type: map_at_10
value: 49.235
- type: map_at_100
value: 50.723
- type: map_at_1000
value: 50.809000000000005
- type: map_at_3
value: 45.696999999999996
- type: map_at_5
value: 47.588
- type: mrr_at_1
value: 45.525
- type: mrr_at_10
value: 55.334
- type: mrr_at_100
value: 56.092
- type: mrr_at_1000
value: 56.118
- type: mrr_at_3
value: 53.032000000000004
- type: mrr_at_5
value: 54.19199999999999
- type: ndcg_at_1
value: 45.525
- type: ndcg_at_10
value: 55.542
- type: ndcg_at_100
value: 60.879000000000005
- type: ndcg_at_1000
value: 62.224999999999994
- type: ndcg_at_3
value: 50.688
- type: ndcg_at_5
value: 52.76499999999999
- type: precision_at_1
value: 45.525
- type: precision_at_10
value: 10.067
- type: precision_at_100
value: 1.471
- type: precision_at_1000
value: 0.173
- type: precision_at_3
value: 24.382
- type: precision_at_5
value: 16.919999999999998
- type: recall_at_1
value: 36.575
- type: recall_at_10
value: 67.903
- type: recall_at_100
value: 89.464
- type: recall_at_1000
value: 97.799
- type: recall_at_3
value: 53.493
- type: recall_at_5
value: 59.372
- type: map_at_1
value: 29.099000000000004
- type: map_at_10
value: 42.147
- type: map_at_100
value: 43.522
- type: map_at_1000
value: 43.624
- type: map_at_3
value: 38.104
- type: map_at_5
value: 40.435
- type: mrr_at_1
value: 36.416
- type: mrr_at_10
value: 47.922
- type: mrr_at_100
value: 48.664
- type: mrr_at_1000
value: 48.709
- type: mrr_at_3
value: 44.977000000000004
- type: mrr_at_5
value: 46.838
- type: ndcg_at_1
value: 36.416
- type: ndcg_at_10
value: 49.307
- type: ndcg_at_100
value: 54.332
- type: ndcg_at_1000
value: 56.145
- type: ndcg_at_3
value: 42.994
- type: ndcg_at_5
value: 46.119
- type: precision_at_1
value: 36.416
- type: precision_at_10
value: 9.452
- type: precision_at_100
value: 1.4080000000000001
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 21.081
- type: precision_at_5
value: 15.501999999999999
- type: recall_at_1
value: 29.099000000000004
- type: recall_at_10
value: 64.485
- type: recall_at_100
value: 84.753
- type: recall_at_1000
value: 96.875
- type: recall_at_3
value: 47.06
- type: recall_at_5
value: 55.077
- type: map_at_1
value: 30.69458333333333
- type: map_at_10
value: 41.65291666666666
- type: map_at_100
value: 42.95775
- type: map_at_1000
value: 43.06258333333333
- type: map_at_3
value: 38.335750000000004
- type: map_at_5
value: 40.20941666666666
- type: mrr_at_1
value: 37.013000000000005
- type: mrr_at_10
value: 46.30600000000001
- type: mrr_at_100
value: 47.094666666666676
- type: mrr_at_1000
value: 47.139583333333334
- type: mrr_at_3
value: 43.805749999999996
- type: mrr_at_5
value: 45.22366666666666
- type: ndcg_at_1
value: 37.013000000000005
- type: ndcg_at_10
value: 47.63491666666667
- type: ndcg_at_100
value: 52.71083333333334
- type: ndcg_at_1000
value: 54.493583333333326
- type: ndcg_at_3
value: 42.43616666666666
- type: ndcg_at_5
value: 44.87583333333334
- type: precision_at_1
value: 37.013000000000005
- type: precision_at_10
value: 8.481583333333333
- type: precision_at_100
value: 1.3073333333333337
- type: precision_at_1000
value: 0.16341666666666668
- type: precision_at_3
value: 19.811833333333333
- type: precision_at_5
value: 14.07691666666667
- type: recall_at_1
value: 30.69458333333333
- type: recall_at_10
value: 60.462083333333325
- type: recall_at_100
value: 82.42325000000001
- type: recall_at_1000
value: 94.53291666666667
- type: recall_at_3
value: 45.7405
- type: recall_at_5
value: 52.14025
- type: map_at_1
value: 27.833000000000002
- type: map_at_10
value: 36.55
- type: map_at_100
value: 37.524
- type: map_at_1000
value: 37.613
- type: map_at_3
value: 33.552
- type: map_at_5
value: 35.173
- type: mrr_at_1
value: 31.135
- type: mrr_at_10
value: 39.637
- type: mrr_at_100
value: 40.361000000000004
- type: mrr_at_1000
value: 40.422000000000004
- type: mrr_at_3
value: 36.887
- type: mrr_at_5
value: 38.428000000000004
- type: ndcg_at_1
value: 31.135
- type: ndcg_at_10
value: 42.007
- type: ndcg_at_100
value: 46.531
- type: ndcg_at_1000
value: 48.643
- type: ndcg_at_3
value: 36.437999999999995
- type: ndcg_at_5
value: 39.021
- type: precision_at_1
value: 31.135
- type: precision_at_10
value: 6.856
- type: precision_at_100
value: 0.988
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 15.9
- type: precision_at_5
value: 11.227
- type: recall_at_1
value: 27.833000000000002
- type: recall_at_10
value: 55.711
- type: recall_at_100
value: 76.255
- type: recall_at_1000
value: 91.51899999999999
- type: recall_at_3
value: 40.22
- type: recall_at_5
value: 46.69
- type: map_at_1
value: 21.274
- type: map_at_10
value: 29.925
- type: map_at_100
value: 31.171
- type: map_at_1000
value: 31.296000000000003
- type: map_at_3
value: 27.209
- type: map_at_5
value: 28.707
- type: mrr_at_1
value: 26.462000000000003
- type: mrr_at_10
value: 34.604
- type: mrr_at_100
value: 35.554
- type: mrr_at_1000
value: 35.622
- type: mrr_at_3
value: 32.295
- type: mrr_at_5
value: 33.598
- type: ndcg_at_1
value: 26.462000000000003
- type: ndcg_at_10
value: 35.193000000000005
- type: ndcg_at_100
value: 40.876000000000005
- type: ndcg_at_1000
value: 43.442
- type: ndcg_at_3
value: 30.724
- type: ndcg_at_5
value: 32.735
- type: precision_at_1
value: 26.462000000000003
- type: precision_at_10
value: 6.438000000000001
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.15
- type: precision_at_3
value: 14.636
- type: precision_at_5
value: 10.496
- type: recall_at_1
value: 21.274
- type: recall_at_10
value: 46.322
- type: recall_at_100
value: 71.702
- type: recall_at_1000
value: 89.405
- type: recall_at_3
value: 33.444
- type: recall_at_5
value: 38.83
- type: map_at_1
value: 31.174000000000003
- type: map_at_10
value: 42.798
- type: map_at_100
value: 43.996
- type: map_at_1000
value: 44.088
- type: map_at_3
value: 39.255
- type: map_at_5
value: 41.336
- type: mrr_at_1
value: 37.22
- type: mrr_at_10
value: 47.035
- type: mrr_at_100
value: 47.833999999999996
- type: mrr_at_1000
value: 47.88
- type: mrr_at_3
value: 44.248
- type: mrr_at_5
value: 45.815
- type: ndcg_at_1
value: 37.22
- type: ndcg_at_10
value: 48.931999999999995
- type: ndcg_at_100
value: 53.991
- type: ndcg_at_1000
value: 55.825
- type: ndcg_at_3
value: 43.144
- type: ndcg_at_5
value: 45.964
- type: precision_at_1
value: 37.22
- type: precision_at_10
value: 8.451
- type: precision_at_100
value: 1.2189999999999999
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 20.087
- type: precision_at_5
value: 14.235000000000001
- type: recall_at_1
value: 31.174000000000003
- type: recall_at_10
value: 63.232
- type: recall_at_100
value: 84.747
- type: recall_at_1000
value: 97.006
- type: recall_at_3
value: 47.087
- type: recall_at_5
value: 54.493
- type: map_at_1
value: 29.628
- type: map_at_10
value: 39.995999999999995
- type: map_at_100
value: 41.899
- type: map_at_1000
value: 42.125
- type: map_at_3
value: 36.345
- type: map_at_5
value: 38.474000000000004
- type: mrr_at_1
value: 36.364000000000004
- type: mrr_at_10
value: 45.293
- type: mrr_at_100
value: 46.278999999999996
- type: mrr_at_1000
value: 46.318
- type: mrr_at_3
value: 42.522999999999996
- type: mrr_at_5
value: 44.104
- type: ndcg_at_1
value: 36.364000000000004
- type: ndcg_at_10
value: 46.622
- type: ndcg_at_100
value: 52.617000000000004
- type: ndcg_at_1000
value: 54.529
- type: ndcg_at_3
value: 40.971999999999994
- type: ndcg_at_5
value: 43.738
- type: precision_at_1
value: 36.364000000000004
- type: precision_at_10
value: 9.110999999999999
- type: precision_at_100
value: 1.846
- type: precision_at_1000
value: 0.256
- type: precision_at_3
value: 19.236
- type: precision_at_5
value: 14.269000000000002
- type: recall_at_1
value: 29.628
- type: recall_at_10
value: 58.706
- type: recall_at_100
value: 85.116
- type: recall_at_1000
value: 97.258
- type: recall_at_3
value: 42.655
- type: recall_at_5
value: 49.909
- type: map_at_1
value: 25.499
- type: map_at_10
value: 34.284
- type: map_at_100
value: 35.416
- type: map_at_1000
value: 35.494
- type: map_at_3
value: 31.911
- type: map_at_5
value: 33.159
- type: mrr_at_1
value: 28.096
- type: mrr_at_10
value: 36.699
- type: mrr_at_100
value: 37.657000000000004
- type: mrr_at_1000
value: 37.714999999999996
- type: mrr_at_3
value: 34.72
- type: mrr_at_5
value: 35.746
- type: ndcg_at_1
value: 28.096
- type: ndcg_at_10
value: 39.041
- type: ndcg_at_100
value: 44.633
- type: ndcg_at_1000
value: 46.522000000000006
- type: ndcg_at_3
value: 34.663
- type: ndcg_at_5
value: 36.538
- type: precision_at_1
value: 28.096
- type: precision_at_10
value: 6.0440000000000005
- type: precision_at_100
value: 0.9520000000000001
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 14.911
- type: precision_at_5
value: 10.277
- type: recall_at_1
value: 25.499
- type: recall_at_10
value: 51.26199999999999
- type: recall_at_100
value: 76.896
- type: recall_at_1000
value: 90.763
- type: recall_at_3
value: 39.376
- type: recall_at_5
value: 43.785000000000004
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.532
- type: map_at_10
value: 19.911
- type: map_at_100
value: 21.926000000000002
- type: map_at_1000
value: 22.113
- type: map_at_3
value: 16.118
- type: map_at_5
value: 18.043
- type: mrr_at_1
value: 23.909
- type: mrr_at_10
value: 37.029
- type: mrr_at_100
value: 38.015
- type: mrr_at_1000
value: 38.054
- type: mrr_at_3
value: 33.29
- type: mrr_at_5
value: 35.446
- type: ndcg_at_1
value: 23.909
- type: ndcg_at_10
value: 28.691
- type: ndcg_at_100
value: 36.341
- type: ndcg_at_1000
value: 39.644
- type: ndcg_at_3
value: 22.561
- type: ndcg_at_5
value: 24.779999999999998
- type: precision_at_1
value: 23.909
- type: precision_at_10
value: 9.433
- type: precision_at_100
value: 1.763
- type: precision_at_1000
value: 0.23800000000000002
- type: precision_at_3
value: 17.438000000000002
- type: precision_at_5
value: 13.758999999999999
- type: recall_at_1
value: 10.532
- type: recall_at_10
value: 36.079
- type: recall_at_100
value: 62.156
- type: recall_at_1000
value: 80.53099999999999
- type: recall_at_3
value: 21.384
- type: recall_at_5
value: 27.29
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.483
- type: map_at_10
value: 21.986
- type: map_at_100
value: 31.319000000000003
- type: map_at_1000
value: 33.231
- type: map_at_3
value: 15.193000000000001
- type: map_at_5
value: 18.116
- type: mrr_at_1
value: 74.0
- type: mrr_at_10
value: 80.047
- type: mrr_at_100
value: 80.406
- type: mrr_at_1000
value: 80.414
- type: mrr_at_3
value: 78.667
- type: mrr_at_5
value: 79.467
- type: ndcg_at_1
value: 61.875
- type: ndcg_at_10
value: 46.544999999999995
- type: ndcg_at_100
value: 51.097
- type: ndcg_at_1000
value: 58.331999999999994
- type: ndcg_at_3
value: 51.622
- type: ndcg_at_5
value: 49.016
- type: precision_at_1
value: 74.0
- type: precision_at_10
value: 37.325
- type: precision_at_100
value: 11.743
- type: precision_at_1000
value: 2.423
- type: precision_at_3
value: 54.75
- type: precision_at_5
value: 47.699999999999996
- type: recall_at_1
value: 9.483
- type: recall_at_10
value: 27.477
- type: recall_at_100
value: 57.099999999999994
- type: recall_at_1000
value: 80.56
- type: recall_at_3
value: 16.543
- type: recall_at_5
value: 20.830000000000002
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 50.06
- type: f1
value: 44.99375486940016
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.94
- type: map_at_10
value: 80.854
- type: map_at_100
value: 81.096
- type: map_at_1000
value: 81.109
- type: map_at_3
value: 79.589
- type: map_at_5
value: 80.431
- type: mrr_at_1
value: 76.44800000000001
- type: mrr_at_10
value: 85.07000000000001
- type: mrr_at_100
value: 85.168
- type: mrr_at_1000
value: 85.17
- type: mrr_at_3
value: 84.221
- type: mrr_at_5
value: 84.832
- type: ndcg_at_1
value: 76.44800000000001
- type: ndcg_at_10
value: 85.019
- type: ndcg_at_100
value: 85.886
- type: ndcg_at_1000
value: 86.09400000000001
- type: ndcg_at_3
value: 83.023
- type: ndcg_at_5
value: 84.223
- type: precision_at_1
value: 76.44800000000001
- type: precision_at_10
value: 10.405000000000001
- type: precision_at_100
value: 1.105
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 32.208
- type: precision_at_5
value: 20.122999999999998
- type: recall_at_1
value: 70.94
- type: recall_at_10
value: 93.508
- type: recall_at_100
value: 96.962
- type: recall_at_1000
value: 98.24300000000001
- type: recall_at_3
value: 88.17099999999999
- type: recall_at_5
value: 91.191
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.844
- type: map_at_10
value: 41.629
- type: map_at_100
value: 43.766
- type: map_at_1000
value: 43.916
- type: map_at_3
value: 35.992000000000004
- type: map_at_5
value: 39.302
- type: mrr_at_1
value: 45.988
- type: mrr_at_10
value: 56.050999999999995
- type: mrr_at_100
value: 56.741
- type: mrr_at_1000
value: 56.767999999999994
- type: mrr_at_3
value: 53.498000000000005
- type: mrr_at_5
value: 55.071999999999996
- type: ndcg_at_1
value: 45.988
- type: ndcg_at_10
value: 49.891999999999996
- type: ndcg_at_100
value: 56.727000000000004
- type: ndcg_at_1000
value: 58.952000000000005
- type: ndcg_at_3
value: 45.09
- type: ndcg_at_5
value: 46.943
- type: precision_at_1
value: 45.988
- type: precision_at_10
value: 13.980999999999998
- type: precision_at_100
value: 2.136
- type: precision_at_1000
value: 0.252
- type: precision_at_3
value: 30.556
- type: precision_at_5
value: 22.778000000000002
- type: recall_at_1
value: 23.844
- type: recall_at_10
value: 58.46
- type: recall_at_100
value: 82.811
- type: recall_at_1000
value: 96.084
- type: recall_at_3
value: 41.636
- type: recall_at_5
value: 49.271
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.108
- type: map_at_10
value: 65.846
- type: map_at_100
value: 66.691
- type: map_at_1000
value: 66.743
- type: map_at_3
value: 62.09
- type: map_at_5
value: 64.412
- type: mrr_at_1
value: 80.216
- type: mrr_at_10
value: 85.768
- type: mrr_at_100
value: 85.92699999999999
- type: mrr_at_1000
value: 85.932
- type: mrr_at_3
value: 85.012
- type: mrr_at_5
value: 85.495
- type: ndcg_at_1
value: 80.216
- type: ndcg_at_10
value: 73.833
- type: ndcg_at_100
value: 76.68
- type: ndcg_at_1000
value: 77.639
- type: ndcg_at_3
value: 68.7
- type: ndcg_at_5
value: 71.514
- type: precision_at_1
value: 80.216
- type: precision_at_10
value: 15.616
- type: precision_at_100
value: 1.783
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 44.483
- type: precision_at_5
value: 28.904999999999998
- type: recall_at_1
value: 40.108
- type: recall_at_10
value: 78.082
- type: recall_at_100
value: 89.129
- type: recall_at_1000
value: 95.381
- type: recall_at_3
value: 66.725
- type: recall_at_5
value: 72.262
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 94.3208
- type: ap
value: 91.64852216825692
- type: f1
value: 94.31672442494217
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 16.954
- type: map_at_10
value: 28.605000000000004
- type: map_at_100
value: 29.875
- type: map_at_1000
value: 29.934
- type: map_at_3
value: 24.57
- type: map_at_5
value: 26.845000000000002
- type: mrr_at_1
value: 17.407
- type: mrr_at_10
value: 29.082
- type: mrr_at_100
value: 30.309
- type: mrr_at_1000
value: 30.361
- type: mrr_at_3
value: 25.112000000000002
- type: mrr_at_5
value: 27.37
- type: ndcg_at_1
value: 17.407
- type: ndcg_at_10
value: 35.555
- type: ndcg_at_100
value: 41.808
- type: ndcg_at_1000
value: 43.277
- type: ndcg_at_3
value: 27.291999999999998
- type: ndcg_at_5
value: 31.369999999999997
- type: precision_at_1
value: 17.407
- type: precision_at_10
value: 5.9670000000000005
- type: precision_at_100
value: 0.9119999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 11.939
- type: precision_at_5
value: 9.223
- type: recall_at_1
value: 16.954
- type: recall_at_10
value: 57.216
- type: recall_at_100
value: 86.384
- type: recall_at_1000
value: 97.64
- type: recall_at_3
value: 34.660999999999994
- type: recall_at_5
value: 44.484
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.29183766529867
- type: f1
value: 95.01282555921513
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 87.07934336525307
- type: f1
value: 69.58693991783085
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 79.71755211835911
- type: f1
value: 77.08207736007755
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 81.08607935440484
- type: f1
value: 80.71191664406739
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 36.5355083590869
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 37.24173539348128
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.84293003435578
- type: mrr
value: 34.09721970493348
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.369
- type: map_at_10
value: 14.892
- type: map_at_100
value: 18.884999999999998
- type: map_at_1000
value: 20.43
- type: map_at_3
value: 10.735999999999999
- type: map_at_5
value: 12.703000000000001
- type: mrr_at_1
value: 50.15500000000001
- type: mrr_at_10
value: 59.948
- type: mrr_at_100
value: 60.422
- type: mrr_at_1000
value: 60.455999999999996
- type: mrr_at_3
value: 58.204
- type: mrr_at_5
value: 59.35
- type: ndcg_at_1
value: 47.678
- type: ndcg_at_10
value: 39.050000000000004
- type: ndcg_at_100
value: 35.905
- type: ndcg_at_1000
value: 44.662
- type: ndcg_at_3
value: 44.781
- type: ndcg_at_5
value: 42.549
- type: precision_at_1
value: 49.226
- type: precision_at_10
value: 28.762
- type: precision_at_100
value: 8.767999999999999
- type: precision_at_1000
value: 2.169
- type: precision_at_3
value: 41.796
- type: precision_at_5
value: 37.09
- type: recall_at_1
value: 6.369
- type: recall_at_10
value: 19.842000000000002
- type: recall_at_100
value: 37.017
- type: recall_at_1000
value: 68.444
- type: recall_at_3
value: 12.446
- type: recall_at_5
value: 15.525
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.663
- type: map_at_10
value: 56.252
- type: map_at_100
value: 57.018
- type: map_at_1000
value: 57.031
- type: map_at_3
value: 52.020999999999994
- type: map_at_5
value: 54.626
- type: mrr_at_1
value: 44.699
- type: mrr_at_10
value: 58.819
- type: mrr_at_100
value: 59.351
- type: mrr_at_1000
value: 59.358
- type: mrr_at_3
value: 55.615
- type: mrr_at_5
value: 57.598000000000006
- type: ndcg_at_1
value: 44.699
- type: ndcg_at_10
value: 63.873999999999995
- type: ndcg_at_100
value: 66.973
- type: ndcg_at_1000
value: 67.23700000000001
- type: ndcg_at_3
value: 56.25599999999999
- type: ndcg_at_5
value: 60.44199999999999
- type: precision_at_1
value: 44.699
- type: precision_at_10
value: 10.075000000000001
- type: precision_at_100
value: 1.185
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 25.202999999999996
- type: precision_at_5
value: 17.584
- type: recall_at_1
value: 39.663
- type: recall_at_10
value: 84.313
- type: recall_at_100
value: 97.56700000000001
- type: recall_at_1000
value: 99.44
- type: recall_at_3
value: 64.938
- type: recall_at_5
value: 74.515
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.708
- type: map_at_10
value: 83.86099999999999
- type: map_at_100
value: 84.513
- type: map_at_1000
value: 84.53
- type: map_at_3
value: 80.854
- type: map_at_5
value: 82.757
- type: mrr_at_1
value: 80.15
- type: mrr_at_10
value: 86.70400000000001
- type: mrr_at_100
value: 86.81400000000001
- type: mrr_at_1000
value: 86.815
- type: mrr_at_3
value: 85.658
- type: mrr_at_5
value: 86.37599999999999
- type: ndcg_at_1
value: 80.17
- type: ndcg_at_10
value: 87.7
- type: ndcg_at_100
value: 88.979
- type: ndcg_at_1000
value: 89.079
- type: ndcg_at_3
value: 84.71600000000001
- type: ndcg_at_5
value: 86.385
- type: precision_at_1
value: 80.17
- type: precision_at_10
value: 13.369
- type: precision_at_100
value: 1.53
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.123
- type: precision_at_5
value: 24.498
- type: recall_at_1
value: 69.708
- type: recall_at_10
value: 95.17099999999999
- type: recall_at_100
value: 99.529
- type: recall_at_1000
value: 99.97500000000001
- type: recall_at_3
value: 86.761
- type: recall_at_5
value: 91.34
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 63.005610557842786
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 65.85897055439158
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.388
- type: map_at_10
value: 14.087
- type: map_at_100
value: 16.618
- type: map_at_1000
value: 16.967
- type: map_at_3
value: 9.8
- type: map_at_5
value: 11.907
- type: mrr_at_1
value: 26.5
- type: mrr_at_10
value: 37.905
- type: mrr_at_100
value: 39.053
- type: mrr_at_1000
value: 39.091
- type: mrr_at_3
value: 34.567
- type: mrr_at_5
value: 36.307
- type: ndcg_at_1
value: 26.5
- type: ndcg_at_10
value: 23.06
- type: ndcg_at_100
value: 32.164
- type: ndcg_at_1000
value: 37.574000000000005
- type: ndcg_at_3
value: 21.623
- type: ndcg_at_5
value: 18.95
- type: precision_at_1
value: 26.5
- type: precision_at_10
value: 12.030000000000001
- type: precision_at_100
value: 2.5020000000000002
- type: precision_at_1000
value: 0.379
- type: precision_at_3
value: 20.200000000000003
- type: precision_at_5
value: 16.64
- type: recall_at_1
value: 5.388
- type: recall_at_10
value: 24.375
- type: recall_at_100
value: 50.818
- type: recall_at_1000
value: 76.86699999999999
- type: recall_at_3
value: 12.273
- type: recall_at_5
value: 16.858
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 85.09465497223438
- type: cos_sim_spearman
value: 80.55601111843897
- type: euclidean_pearson
value: 82.40135168520864
- type: euclidean_spearman
value: 80.05606361845396
- type: manhattan_pearson
value: 82.24092291787754
- type: manhattan_spearman
value: 79.89739846820373
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 81.14210597635189
- type: cos_sim_spearman
value: 73.69447481152118
- type: euclidean_pearson
value: 75.08507068029972
- type: euclidean_spearman
value: 71.04077458564372
- type: manhattan_pearson
value: 75.64918699307383
- type: manhattan_spearman
value: 71.61677355593945
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 85.41396417076866
- type: cos_sim_spearman
value: 85.82245898186092
- type: euclidean_pearson
value: 85.58527168297935
- type: euclidean_spearman
value: 85.94613250938504
- type: manhattan_pearson
value: 85.88114899068759
- type: manhattan_spearman
value: 86.42494392145366
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.7431948980468
- type: cos_sim_spearman
value: 82.05114289801895
- type: euclidean_pearson
value: 83.06116666914892
- type: euclidean_spearman
value: 81.82060562251957
- type: manhattan_pearson
value: 83.1858437025367
- type: manhattan_spearman
value: 82.09604293088852
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.455985912287
- type: cos_sim_spearman
value: 88.8044343107975
- type: euclidean_pearson
value: 87.155336804123
- type: euclidean_spearman
value: 87.79371420531842
- type: manhattan_pearson
value: 87.5784376507174
- type: manhattan_spearman
value: 88.429877987816
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.1631000795076
- type: cos_sim_spearman
value: 86.20042158061408
- type: euclidean_pearson
value: 84.88605965960737
- type: euclidean_spearman
value: 85.45926745772432
- type: manhattan_pearson
value: 85.18333987666729
- type: manhattan_spearman
value: 85.86048911387192
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 91.51301667439836
- type: cos_sim_spearman
value: 91.46469919011143
- type: euclidean_pearson
value: 91.15157693133415
- type: euclidean_spearman
value: 91.02656400119739
- type: manhattan_pearson
value: 91.08411259466446
- type: manhattan_spearman
value: 90.84339904461068
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 69.08993728439704
- type: cos_sim_spearman
value: 69.20885645170797
- type: euclidean_pearson
value: 69.65638507632245
- type: euclidean_spearman
value: 68.69831912688514
- type: manhattan_pearson
value: 69.86621764969294
- type: manhattan_spearman
value: 69.05446631856769
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 86.96149243197495
- type: cos_sim_spearman
value: 87.43145597912833
- type: euclidean_pearson
value: 86.6762329641158
- type: euclidean_spearman
value: 86.67085254401809
- type: manhattan_pearson
value: 87.06412701458164
- type: manhattan_spearman
value: 87.10197412769807
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 86.43440918697488
- type: mrr
value: 96.3954826945023
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 60.494
- type: map_at_10
value: 72.074
- type: map_at_100
value: 72.475
- type: map_at_1000
value: 72.483
- type: map_at_3
value: 68.983
- type: map_at_5
value: 71.161
- type: mrr_at_1
value: 63.666999999999994
- type: mrr_at_10
value: 73.31299999999999
- type: mrr_at_100
value: 73.566
- type: mrr_at_1000
value: 73.574
- type: mrr_at_3
value: 71.111
- type: mrr_at_5
value: 72.72800000000001
- type: ndcg_at_1
value: 63.666999999999994
- type: ndcg_at_10
value: 77.024
- type: ndcg_at_100
value: 78.524
- type: ndcg_at_1000
value: 78.842
- type: ndcg_at_3
value: 72.019
- type: ndcg_at_5
value: 75.22999999999999
- type: precision_at_1
value: 63.666999999999994
- type: precision_at_10
value: 10.2
- type: precision_at_100
value: 1.103
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 28.111000000000004
- type: precision_at_5
value: 19.0
- type: recall_at_1
value: 60.494
- type: recall_at_10
value: 90.8
- type: recall_at_100
value: 97.333
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 77.644
- type: recall_at_5
value: 85.694
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.68415841584158
- type: cos_sim_ap
value: 91.23713949701548
- type: cos_sim_f1
value: 83.70221327967808
- type: cos_sim_precision
value: 84.21052631578947
- type: cos_sim_recall
value: 83.2
- type: dot_accuracy
value: 99.5
- type: dot_ap
value: 79.46312132270363
- type: dot_f1
value: 72.75320970042794
- type: dot_precision
value: 69.35630099728014
- type: dot_recall
value: 76.5
- type: euclidean_accuracy
value: 99.69108910891089
- type: euclidean_ap
value: 90.9016163254649
- type: euclidean_f1
value: 83.91752577319586
- type: euclidean_precision
value: 86.59574468085106
- type: euclidean_recall
value: 81.39999999999999
- type: manhattan_accuracy
value: 99.7039603960396
- type: manhattan_ap
value: 91.5593806619311
- type: manhattan_f1
value: 85.08124076809453
- type: manhattan_precision
value: 83.80213385063045
- type: manhattan_recall
value: 86.4
- type: max_accuracy
value: 99.7039603960396
- type: max_ap
value: 91.5593806619311
- type: max_f1
value: 85.08124076809453
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 74.40806543281603
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 38.51757703316821
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 54.33475593449746
- type: mrr
value: 55.3374474789916
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.249926396023596
- type: cos_sim_spearman
value: 29.820375700458158
- type: dot_pearson
value: 28.820307635930355
- type: dot_spearman
value: 28.824273052746825
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.233
- type: map_at_10
value: 2.061
- type: map_at_100
value: 12.607
- type: map_at_1000
value: 30.031000000000002
- type: map_at_3
value: 0.6669999999999999
- type: map_at_5
value: 1.091
- type: mrr_at_1
value: 88.0
- type: mrr_at_10
value: 93.067
- type: mrr_at_100
value: 93.067
- type: mrr_at_1000
value: 93.067
- type: mrr_at_3
value: 92.667
- type: mrr_at_5
value: 93.067
- type: ndcg_at_1
value: 84.0
- type: ndcg_at_10
value: 81.072
- type: ndcg_at_100
value: 62.875
- type: ndcg_at_1000
value: 55.641
- type: ndcg_at_3
value: 85.296
- type: ndcg_at_5
value: 84.10499999999999
- type: precision_at_1
value: 88.0
- type: precision_at_10
value: 83.39999999999999
- type: precision_at_100
value: 63.7
- type: precision_at_1000
value: 24.622
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 87.2
- type: recall_at_1
value: 0.233
- type: recall_at_10
value: 2.188
- type: recall_at_100
value: 15.52
- type: recall_at_1000
value: 52.05499999999999
- type: recall_at_3
value: 0.6859999999999999
- type: recall_at_5
value: 1.1440000000000001
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.19
- type: map_at_10
value: 11.491999999999999
- type: map_at_100
value: 17.251
- type: map_at_1000
value: 18.795
- type: map_at_3
value: 6.146
- type: map_at_5
value: 8.113
- type: mrr_at_1
value: 44.897999999999996
- type: mrr_at_10
value: 56.57
- type: mrr_at_100
value: 57.348
- type: mrr_at_1000
value: 57.357
- type: mrr_at_3
value: 52.041000000000004
- type: mrr_at_5
value: 55.408
- type: ndcg_at_1
value: 40.816
- type: ndcg_at_10
value: 27.968
- type: ndcg_at_100
value: 39.0
- type: ndcg_at_1000
value: 50.292
- type: ndcg_at_3
value: 31.256
- type: ndcg_at_5
value: 28.855999999999998
- type: precision_at_1
value: 44.897999999999996
- type: precision_at_10
value: 24.285999999999998
- type: precision_at_100
value: 7.898
- type: precision_at_1000
value: 1.541
- type: precision_at_3
value: 30.612000000000002
- type: precision_at_5
value: 27.346999999999998
- type: recall_at_1
value: 3.19
- type: recall_at_10
value: 17.954
- type: recall_at_100
value: 48.793
- type: recall_at_1000
value: 83.357
- type: recall_at_3
value: 6.973999999999999
- type: recall_at_5
value: 10.391
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.89139999999999
- type: ap
value: 15.562539739828049
- type: f1
value: 55.38685639741247
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.48160724391625
- type: f1
value: 62.76700854121342
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 57.157071531498275
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.15503367705789
- type: cos_sim_ap
value: 77.20584529783206
- type: cos_sim_f1
value: 71.3558088770313
- type: cos_sim_precision
value: 66.02333931777379
- type: cos_sim_recall
value: 77.62532981530343
- type: dot_accuracy
value: 83.10186564940096
- type: dot_ap
value: 64.34160146443133
- type: dot_f1
value: 63.23048153342683
- type: dot_precision
value: 56.75618967687789
- type: dot_recall
value: 71.37203166226914
- type: euclidean_accuracy
value: 86.94045419324074
- type: euclidean_ap
value: 76.08471767931738
- type: euclidean_f1
value: 71.41248592518455
- type: euclidean_precision
value: 67.90387818225078
- type: euclidean_recall
value: 75.30343007915567
- type: manhattan_accuracy
value: 86.80932228646361
- type: manhattan_ap
value: 76.03862870753638
- type: manhattan_f1
value: 71.2660917385327
- type: manhattan_precision
value: 67.70363334124912
- type: manhattan_recall
value: 75.22427440633246
- type: max_accuracy
value: 87.15503367705789
- type: max_ap
value: 77.20584529783206
- type: max_f1
value: 71.41248592518455
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.42639810610471
- type: cos_sim_ap
value: 86.45196525133669
- type: cos_sim_f1
value: 79.25172592977508
- type: cos_sim_precision
value: 76.50852802063925
- type: cos_sim_recall
value: 82.19895287958116
- type: dot_accuracy
value: 87.03768385919976
- type: dot_ap
value: 80.86465404774172
- type: dot_f1
value: 74.50351637940457
- type: dot_precision
value: 70.72293324109305
- type: dot_recall
value: 78.71111795503542
- type: euclidean_accuracy
value: 89.29056545193464
- type: euclidean_ap
value: 86.25102188096191
- type: euclidean_f1
value: 79.05038057267126
- type: euclidean_precision
value: 74.681550472538
- type: euclidean_recall
value: 83.9621188789652
- type: manhattan_accuracy
value: 89.34877944657896
- type: manhattan_ap
value: 86.35336214205911
- type: manhattan_f1
value: 79.20192588269623
- type: manhattan_precision
value: 75.24951483227058
- type: manhattan_recall
value: 83.59254696643055
- type: max_accuracy
value: 89.42639810610471
- type: max_ap
value: 86.45196525133669
- type: max_f1
value: 79.25172592977508
---
# Model Summary
> GritLM is a generative representational instruction-tuned language model. It unifies text representation (embedding) and text generation into a single model, achieving state-of-the-art performance on both types of tasks.
- **Repository:** [ContextualAI/gritlm](https://github.com/ContextualAI/gritlm)
- **Paper:** https://arxiv.org/abs/2402.09906
- **Logs:** https://wandb.ai/muennighoff/gritlm/runs/id130s1m/overview
- **Script:** https://github.com/ContextualAI/gritlm/blob/main/scripts/training/train_gritlm_8x7b.sh
| Model | Description |
|-------|-------------|
| [GritLM 7B](https://hf.co/GritLM/GritLM-7B) | Mistral 7B finetuned using GRIT |
| [GritLM 8x7B](https://hf.co/GritLM/GritLM-8x7B) | Mixtral 8x7B finetuned using GRIT |
# Use
The model usage is documented [here](https://github.com/ContextualAI/gritlm?tab=readme-ov-file#inference).
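As a quick orientation only, the sketch below runs plain text generation with standard `transformers` classes. It is an assumption for illustration, not the documented GritLM interface: the checkpoint id `GritLM/GritLM-8x7B` comes from the table above, and the `<|user|>`/`<|assistant|>` prompt format is assumed from the GRIT setup. For embeddings and the unified embed-and-generate usage, follow the linked repository.
```python
# Minimal generation-only sketch (assumption; the canonical usage lives in the gritlm repo).
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "GritLM/GritLM-8x7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Assumed chat-style prompt format; adjust to the repo's documented template if it differs.
prompt = "<|user|>\nWrite one sentence about text embeddings.\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```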
# Citation
```bibtex
@misc{muennighoff2024generative,
title={Generative Representational Instruction Tuning},
author={Niklas Muennighoff and Hongjin Su and Liang Wang and Nan Yang and Furu Wei and Tao Yu and Amanpreet Singh and Douwe Kiela},
year={2024},
eprint={2402.09906},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
Charangan/MedBERT | Charangan | fill-mask | [
"transformers",
"pytorch",
"bert",
"pretraining",
"fill-mask",
"en",
"arxiv:1904.03323",
"license:mit",
"endpoints_compatible",
"region:us"
] | 2022-09-17T05:52:42 | 2023-01-13T11:53:33 | 1,791 | 13 | ---
language:
- en
license: mit
tags:
- fill-mask
---
# MedBERT Model
**MedBERT** is a newly pre-trained transformer-based language model for biomedical named entity recognition. It is initialized with [Bio_ClinicalBERT](https://arxiv.org/abs/1904.03323) and further pre-trained on the N2C2, BioNLP, and CRAFT community datasets.
## Pretraining
### Data
The `MedBERT` model was trained on N2C2, BioNLP, and CRAFT community datasets.
| Dataset | Description |
| ------------- | ------------- |
| [NLP Clinical Challenges (N2C2)](https://portal.dbmi.hms.harvard.edu/projects/n2c2-nlp/) | A collection of clinical notes released in N2C2 2018 and N2C2 2022 challenges|
| [BioNLP](http://bionlp.sourceforge.net/index.shtml) | It contains the articles released under the BioNLP project. The articles cover multiple biomedical disciplines such as molecular biology, IE for protein and DNA modifications, biomolecular mechanisms of infectious diseases, habitats of bacteria mentioned, and bacterial molecular interactions and regulations |
| [CRAFT](https://www.researchgate.net/publication/318175988_The_Colorado_Richly_Annotated_Full_Text_CRAFT_Corpus_Multi-Model_Annotation_in_the_Biomedical_Domain) | It consists of 67 full-text open-access biomedical journal articles from PubMed Central that covers a wide range of biomedical domains including biochemistry and molecular biology, genetics, developmental biology, and computational biology |
| Wikipedia | Crawled medical-related articles |
### Procedures
The model was trained using code from [Google's BERT repository](https://github.com/google-research/bert). Model parameters were initialized with Bio_ClinicalBERT.
### Hyperparameters
We used a batch size of 32, a maximum sequence length of 256, and a learning rate of 1·10⁻⁴ for pre-training our models. The models were trained for 200,000 steps. The dup factor for duplicating input data with different masks was set to 5. All other parameters were kept at their defaults (specifically, masked language model probability = 0.15 and max predictions per sequence = 22).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Charangan/MedBERT")
model = AutoModel.from_pretrained("Charangan/MedBERT")
```
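Since MedBERT is a masked-language model, a quick way to sanity-check the checkpoint is the `fill-mask` pipeline. A minimal sketch is shown below; the clinical sentence is purely illustrative and not taken from the training data.
```python
from transformers import pipeline

# Fill-mask sanity check; the example sentence is illustrative only.
fill_mask = pipeline("fill-mask", model="Charangan/MedBERT")
predictions = fill_mask("The patient was prescribed [MASK] to control blood pressure.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```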
## More Information
Refer to the original paper, [MedBERT: A Pre-trained Language Model for Biomedical Named Entity Recognition](https://ieeexplore.ieee.org/abstract/document/9980157) (APSIPA Conference 2022), for additional details and the model's performance on biomedical NER tasks.
## Citation
```
@INPROCEEDINGS{9980157,
author={Vasantharajan, Charangan and Tun, Kyaw Zin and Thi-Nga, Ho and Jain, Sparsh and Rong, Tong and Siong, Chng Eng},
booktitle={2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)},
title={MedBERT: A Pre-trained Language Model for Biomedical Named Entity Recognition},
year={2022},
volume={},
number={},
pages={1482-1488},
doi={10.23919/APSIPAASC55919.2022.9980157}
}
``` | [
"NAMED_ENTITY_RECOGNITION"
] | [
"CRAFT"
] |
Lihuchen/pearl_small | Lihuchen | feature-extraction | [
"sentence-transformers",
"pytorch",
"safetensors",
"bert",
"feature-extraction",
"Phrase Representation",
"String Matching",
"Fuzzy Join",
"Entity Retrieval",
"transformers",
"en",
"arxiv:2401.10407",
"license:apache-2.0",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-02-04T16:05:24 | 2025-02-24T11:37:26 | 1,785 | 13 | ---
language:
- en
license: apache-2.0
tags:
- Phrase Representation
- String Matching
- Fuzzy Join
- Entity Retrieval
- transformers
- sentence-transformers
---
## 🦪⚪ PEARL-small
[Learning High-Quality and General-Purpose Phrase Representations](https://arxiv.org/pdf/2401.10407.pdf). <br>
[Lihu Chen](https://chenlihu.com), [Gaël Varoquaux](https://gael-varoquaux.info/), [Fabian M. Suchanek](https://suchanek.name/).
Accepted by EACL Findings 2024 <br>
PEARL-small is a lightweight string embedding model. It is the tool of choice for semantic similarity computation for strings,
creating excellent embeddings for string matching, entity retrieval, entity clustering, fuzzy join...
<br>
It differs from typical sentence embedders because it incorporates phrase type information and morphological features,
allowing it to better capture variations in strings.
The model is a variant of [E5-small](https://huggingface.co/intfloat/e5-small-v2) finetuned on our constructed context-free [dataset](https://zenodo.org/records/10676475) to yield better representations
for phrases and strings. <br>
🤗 [PEARL-small](https://huggingface.co/Lihuchen/pearl_small) 🤗 [PEARL-base](https://huggingface.co/Lihuchen/pearl_base)
📐 [PEARL Benchmark](https://huggingface.co/datasets/Lihuchen/pearl_benchmark) 🏆 [PEARL Leaderboard](https://huggingface.co/spaces/Lihuchen/pearl_leaderboard)
<br>
| Model |Size|Avg| PPDB | PPDB filtered |Turney|BIRD|YAGO|UMLS|CoNLL|BC5CDR|AutoFJ|
|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|-----------------|
| FastText |-| 40.3| 94.4 | 61.2 | 59.6 | 58.9 |16.9|14.5|3.0|0.2| 53.6|
| Sentence-BERT |110M|50.1| 94.6 | 66.8 | 50.4 | 62.6 | 21.6|23.6|25.5|48.4| 57.2|
| Phrase-BERT |110M|54.5| 96.8 | 68.7 | 57.2 | 68.8 |23.7|26.1|35.4| 59.5|66.9|
| E5-small |34M|57.0| 96.0| 56.8|55.9| 63.1|43.3| 42.0|27.6| 53.7|74.8|
|E5-base|110M| 61.1| 95.4|65.6|59.4|66.3| 47.3|44.0|32.0| 69.3|76.1|
|PEARL-small|34M| 62.5| 97.0|70.2|57.9|68.1| 48.1|44.5|42.4|59.3|75.2|
|PEARL-base|110M|64.8|97.3|72.2|59.7|72.6|50.7|45.8|39.3|69.4|77.1|
Cost comparison of FastText and PEARL. The estimated memory is calculated from the number of parameters (float16). Inference speed is reported in ms per 512 samples.
The FastText model here is `crawl-300d-2M-subword.bin`.
| Model |Avg Score| Estimated Memory |Speed GPU | Speed CPU |
|-|-|-|-|-|
|FastText|40.3|1200MB|-|57ms|
|PEARL-small|62.5|68MB|42ms|446ms|
|PEARL-base|64.8|220MB|89ms|1394ms|
## Usage
### Sentence Transformers
PEARL is integrated with the Sentence Transformers library (thanks to [Tom Aarsen](https://huggingface.co/tomaarsen) for the contribution) and can be used like so:
```python
from sentence_transformers import SentenceTransformer, util
query_texts = ["The New York Times"]
doc_texts = [ "NYTimes", "New York Post", "New York"]
input_texts = query_texts + doc_texts
model = SentenceTransformer("Lihuchen/pearl_small")
embeddings = model.encode(input_texts)
scores = util.cos_sim(embeddings[0], embeddings[1:]) * 100
print(scores.tolist())
# [[90.56318664550781, 79.65763854980469, 75.52056121826172]]
```
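Because the embeddings compare arbitrary strings, they can also drive a simple fuzzy join: embed both string lists and, for each left-hand entry, keep the most similar right-hand entry. A minimal sketch follows, with two small illustrative lists as assumptions.
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Lihuchen/pearl_small")

# Two illustrative tables of entity strings to be fuzzily joined.
left = ["The New York Times", "Intl. Business Machines", "Google LLC"]
right = ["NYTimes", "IBM", "Alphabet Inc.", "Google"]

left_emb = model.encode(left)
right_emb = model.encode(right)

# Cosine similarity between every left/right pair, then take the best match per left entry.
scores = util.cos_sim(left_emb, right_emb)
for i, name in enumerate(left):
    j = scores[i].argmax().item()
    print(f"{name} -> {right[j]} ({scores[i][j].item():.2f})")
```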
### Transformers
You can also use `transformers` to run PEARL. Below is an example of entity retrieval; the encoding code is reused from E5.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
def encode_text(model, input_texts):
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
return embeddings
query_texts = ["The New York Times"]
doc_texts = [ "NYTimes", "New York Post", "New York"]
input_texts = query_texts + doc_texts
tokenizer = AutoTokenizer.from_pretrained('Lihuchen/pearl_small')
model = AutoModel.from_pretrained('Lihuchen/pearl_small')
# encode
embeddings = encode_text(model, input_texts)
# calculate similarity
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:1] @ embeddings[1:].T) * 100
print(scores.tolist())
# expected outputs
# [[90.56318664550781, 79.65763854980469, 75.52054595947266]]
```
## Training and Evaluation
Have a look at our code on [Github](https://github.com/tigerchen52/PEARL)
## Citation
If you find our work useful, please give us a citation:
```
@inproceedings{chen2024learning,
title={Learning High-Quality and General-Purpose Phrase Representations},
author={Chen, Lihu and Varoquaux, Gael and Suchanek, Fabian},
booktitle={Findings of the Association for Computational Linguistics: EACL 2024},
pages={983--994},
year={2024}
}
``` | [
"SEMANTIC_SIMILARITY"
] | [
"BC5CDR"
] |
tasksource/ModernBERT-base-nli | tasksource | zero-shot-classification | [
"transformers",
"safetensors",
"modernbert",
"text-classification",
"instruct",
"natural-language-inference",
"nli",
"mnli",
"zero-shot-classification",
"en",
"dataset:nyu-mll/glue",
"dataset:facebook/anli",
"base_model:answerdotai/ModernBERT-base",
"base_model:finetune:answerdotai/ModernBERT-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-12-20T09:44:29 | 2025-01-06T08:58:31 | 1,753 | 16 | ---
base_model:
- answerdotai/ModernBERT-base
datasets:
- nyu-mll/glue
- facebook/anli
language:
- en
library_name: transformers
license: apache-2.0
pipeline_tag: zero-shot-classification
tags:
- instruct
- natural-language-inference
- nli
- mnli
---
# Model Card for Model ID
ModernBERT multi-task fine-tuned on tasksource NLI tasks, including MNLI, ANLI, SICK, WANLI, doc-nli, LingNLI, FOLIO, FOL-NLI, LogicNLI, Label-NLI, and all datasets in the table below.
This is the equivalent of an "instruct" version.
The model was trained for 200k steps on an Nvidia A30 GPU.
It is very good at reasoning tasks (better than Llama 3.1 8B Instruct on ANLI and FOLIO), long-context reasoning, sentiment analysis, and zero-shot classification with new labels.
The following table shows model test accuracy. These are the scores for the same single transformer with different classification heads on top. Further gains can be obtained by fine-tuning on a single task, e.g. SST, but this checkpoint is great for zero-shot classification and natural language inference (contradiction/entailment/neutral classification).
| test_name | test_accuracy |
|:--------------------------------------|----------------:|
| glue/mnli | 0.87 |
| glue/qnli | 0.93 |
| glue/rte | 0.85 |
| glue/mrpc | 0.87 |
| glue/qqp | 0.9 |
| glue/cola | 0.86 |
| glue/sst2 | 0.96 |
| super_glue/boolq | 0.64 |
| super_glue/cb | 0.89 |
| super_glue/multirc | 0.82 |
| super_glue/wic | 0.67 |
| super_glue/axg | 0.89 |
| anli/a1 | 0.66 |
| anli/a2 | 0.49 |
| anli/a3 | 0.44 |
| sick/label | 0.93 |
| sick/entailment_AB | 0.91 |
| snli | 0.83 |
| scitail/snli_format | 0.94 |
| hans | 1 |
| WANLI | 0.74 |
| recast/recast_ner | 0.87 |
| recast/recast_sentiment | 0.99 |
| recast/recast_verbnet | 0.88 |
| recast/recast_megaveridicality | 0.88 |
| recast/recast_verbcorner | 0.94 |
| recast/recast_kg_relations | 0.91 |
| recast/recast_factuality | 0.94 |
| recast/recast_puns | 0.96 |
| probability_words_nli/reasoning_1hop | 0.99 |
| probability_words_nli/usnli | 0.72 |
| probability_words_nli/reasoning_2hop | 0.98 |
| nan-nli | 0.85 |
| nli_fever | 0.78 |
| breaking_nli | 0.99 |
| conj_nli | 0.74 |
| fracas | 0.86 |
| dialogue_nli | 0.93 |
| mpe | 0.74 |
| dnc | 0.92 |
| recast_white/fnplus | 0.82 |
| recast_white/sprl | 0.9 |
| recast_white/dpr | 0.68 |
| robust_nli/IS_CS | 0.79 |
| robust_nli/LI_LI | 0.99 |
| robust_nli/ST_WO | 0.85 |
| robust_nli/PI_SP | 0.74 |
| robust_nli/PI_CD | 0.8 |
| robust_nli/ST_SE | 0.81 |
| robust_nli/ST_NE | 0.86 |
| robust_nli/ST_LM | 0.87 |
| robust_nli_is_sd | 1 |
| robust_nli_li_ts | 0.89 |
| add_one_rte | 0.94 |
| paws/labeled_final | 0.95 |
| pragmeval/pdtb | 0.64 |
| lex_glue/scotus | 0.55 |
| lex_glue/ledgar | 0.8 |
| dynasent/dynabench.dynasent.r1.all/r1 | 0.81 |
| dynasent/dynabench.dynasent.r2.all/r2 | 0.75 |
| cycic_classification | 0.9 |
| lingnli | 0.84 |
| monotonicity-entailment | 0.97 |
| scinli | 0.8 |
| naturallogic | 0.96 |
| dynahate | 0.78 |
| syntactic-augmentation-nli | 0.92 |
| autotnli | 0.94 |
| defeasible-nli/atomic | 0.81 |
| defeasible-nli/snli | 0.78 |
| help-nli | 0.96 |
| nli-veridicality-transitivity | 0.98 |
| lonli | 0.97 |
| dadc-limit-nli | 0.69 |
| folio | 0.66 |
| tomi-nli | 0.48 |
| puzzte | 0.6 |
| temporal-nli | 0.92 |
| counterfactually-augmented-snli | 0.79 |
| cnli | 0.87 |
| boolq-natural-perturbations | 0.66 |
| equate | 0.63 |
| logiqa-2.0-nli | 0.52 |
| mindgames | 0.96 |
| ConTRoL-nli | 0.67 |
| logical-fallacy | 0.37 |
| cladder | 0.87 |
| conceptrules_v2 | 1 |
| zero-shot-label-nli | 0.82 |
| scone | 0.98 |
| monli | 1 |
| SpaceNLI | 1 |
| propsegment/nli | 0.88 |
| FLD.v2/default | 0.91 |
| FLD.v2/star | 0.76 |
| SDOH-NLI | 0.98 |
| scifact_entailment | 0.84 |
| AdjectiveScaleProbe-nli | 0.99 |
| resnli | 1 |
| semantic_fragments_nli | 0.99 |
| dataset_train_nli | 0.94 |
| nlgraph | 0.94 |
| ruletaker | 0.99 |
| PARARULE-Plus | 1 |
| logical-entailment | 0.86 |
| nope | 0.44 |
| LogicNLI | 0.86 |
| contract-nli/contractnli_a/seg | 0.87 |
| contract-nli/contractnli_b/full | 0.79 |
| nli4ct_semeval2024 | 0.67 |
| biosift-nli | 0.92 |
| SIGA-nli | 0.53 |
| FOL-nli | 0.8 |
| doc-nli | 0.77 |
| mctest-nli | 0.87 |
| natural-language-satisfiability | 0.9 |
| idioms-nli | 0.81 |
| lifecycle-entailment | 0.78 |
| MSciNLI | 0.85 |
| hover-3way/nli | 0.88 |
| seahorse_summarization_evaluation | 0.73 |
| missing-item-prediction/contrastive | 0.79 |
| Pol_NLI | 0.89 |
| synthetic-retrieval-NLI/count | 0.64 |
| synthetic-retrieval-NLI/position | 0.89 |
| synthetic-retrieval-NLI/binary | 0.91 |
| babi_nli | 0.97 |
| gen_debiased_nli | 0.91 |
# Usage
## [ZS] Zero-shot classification pipeline
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",model="tasksource/ModernBERT-base-nli")
text = "one day I will see the world"
candidate_labels = ['travel', 'cooking', 'dancing']
classifier(text, candidate_labels)
```
NLI training data of this model includes [label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli), an NLI dataset specially constructed to improve this kind of zero-shot classification.
## [NLI] Natural language inference pipeline
```python
from transformers import pipeline
pipe = pipeline("text-classification",model="tasksource/ModernBERT-base-nli")
pipe([dict(text='there is a cat',
text_pair='there is a black cat')]) #list of (premise,hypothesis)
```
## Backbone for further fine-tuning
This checkpoint has stronger reasoning and fine-grained abilities than the base version and can be used for further fine-tuning.
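A minimal sketch of reusing the checkpoint as a backbone for a new classification task is shown below. The 4-label head size is an assumption for illustration, and `ignore_mismatched_sizes=True` simply re-initializes the head in place of the original NLI one; the actual fine-tuning loop (e.g. `Trainer`) is left to you.
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "tasksource/ModernBERT-base-nli"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Replace the NLI head with a freshly initialized head for a hypothetical 4-class task.
model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    num_labels=4,
    ignore_mismatched_sizes=True,
)

# Tokenize a toy batch and run a forward pass to check output shapes before fine-tuning.
batch = tokenizer(["one day I will see the world"], return_tensors="pt")
logits = model(**batch).logits
print(logits.shape)  # (1, 4)
```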
# Citation
```
@inproceedings{sileo-2024-tasksource,
title = "tasksource: A Large Collection of {NLP} tasks with a Structured Dataset Preprocessing Framework",
author = "Sileo, Damien",
booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
month = may,
year = "2024",
address = "Torino, Italia",
publisher = "ELRA and ICCL",
url = "https://aclanthology.org/2024.lrec-main.1361",
pages = "15655--15684",
}
``` | [
"SUMMARIZATION"
] | [
"SCIFACT",
"SCITAIL"
] |
w601sxs/b1ade-embed | w601sxs | feature-extraction | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"mteb",
"base_model:BAAI/bge-large-en-v1.5",
"base_model:finetune:BAAI/bge-large-en-v1.5",
"model-index",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-05-14T19:33:04 | 2025-03-12T17:29:50 | 1,729 | 3 | ---
base_model:
- bert-large-uncased
- WhereIsAI/UAE-Large-V1
- BAAI/bge-large-en-v1.5
- mixedbread-ai/mxbai-embed-large-v1
- avsolatorio/GIST-large-Embedding-v0
library_name: transformers
tags:
- mteb
model-index:
- name: merged_model
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 75.17910447761193
- type: ap
value: 37.9385904323946
- type: f1
value: 69.08121471841274
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 93.07292500000001
- type: ap
value: 89.99875359715712
- type: f1
value: 93.06135402357953
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.42400000000001
- type: f1
value: 47.95385391493928
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 41.394
- type: map_at_10
value: 57.86900000000001
- type: map_at_100
value: 58.372
- type: map_at_1000
value: 58.374
- type: map_at_20
value: 58.321
- type: map_at_3
value: 53.793
- type: map_at_5
value: 56.443
- type: mrr_at_1
value: 42.745
- type: mrr_at_10
value: 58.392999999999994
- type: mrr_at_100
value: 58.887
- type: mrr_at_1000
value: 58.89
- type: mrr_at_20
value: 58.836
- type: mrr_at_3
value: 54.291
- type: mrr_at_5
value: 56.958
- type: ndcg_at_1
value: 41.394
- type: ndcg_at_10
value: 65.989
- type: ndcg_at_100
value: 67.896
- type: ndcg_at_1000
value: 67.955
- type: ndcg_at_20
value: 67.545
- type: ndcg_at_3
value: 57.859
- type: ndcg_at_5
value: 62.602999999999994
- type: precision_at_1
value: 41.394
- type: precision_at_10
value: 9.139
- type: precision_at_100
value: 0.992
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.868
- type: precision_at_3
value: 23.21
- type: precision_at_5
value: 16.216
- type: recall_at_1
value: 41.394
- type: recall_at_10
value: 91.39399999999999
- type: recall_at_100
value: 99.21799999999999
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 97.368
- type: recall_at_3
value: 69.63000000000001
- type: recall_at_5
value: 81.081
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.65949563592336
- type: v_measures
value:
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- 0.48817000383329534
- 0.4705950499127043
- 0.47920402944068824
- 0.4758536127855837
- 0.5033231021230509
- 0.4910490327908452
- 0.47491362511547475
- 0.4764633675511353
- 0.494737377944742
- 0.46500184034904274
- 0.5751292777690713
- 0.5743852402490139
- 0.5760819612630185
- 0.5774331510061154
- 0.5755684918850674
- 0.5722850605334535
- 0.5695224674679956
- 0.5746079891780558
- 0.5741544602411167
- 0.570162474027302
- 0.5327197811942663
- 0.28686142443119944
- 0.4715419431917622
- 0.41413611425618696
- 0.3600885356532917
- 0.2881658877776697
- 0.30387855920668666
- 0.24720800557345154
- 0.3374379904139358
- 1.0
- 0.2837637899710192
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 42.81101867573718
- type: v_measures
value:
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- 0.2000871112487
- 0.2823570530714659
- 1.0
- 0.21876847373747355
- 0.454307961507464
- 0.42488649894459946
- 0.42379061351155944
- 0.42486429152138483
- 0.4291595759894959
- 0.42606457334109177
- 0.4254161071114798
- 0.4293742056286505
- 0.4196235465065443
- 0.4305996611858312
- 0.5046904752193336
- 0.5051438754936164
- 0.5103431600040348
- 0.5096332570792377
- 0.5045766720372478
- 0.5013716624456788
- 0.5042413774439222
- 0.5005329672014509
- 0.5014765664428267
- 0.49965406082258795
- 0.4685511048432531
- 0.22040280790736025
- 0.37034503442744066
- 0.37923765670226733
- 0.31732522489436676
- 0.22426586263560286
- 0.2603243505725541
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 64.42483953378505
- type: mrr
value: 77.80525876093743
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 90.04392169216328
- type: cos_sim_spearman
value: 89.14721200259248
- type: euclidean_pearson
value: 87.49074189687103
- type: euclidean_spearman
value: 88.46828087003544
- type: manhattan_pearson
value: 87.30286329712442
- type: manhattan_spearman
value: 88.2580351155879
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.03246753246754
- type: f1
value: 88.01410778743103
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 39.80502915453793
- type: v_measures
value:
- 0.3932785742317486
- 0.3999502201173461
- 0.3950059950633574
- 0.38385377686391847
- 0.3960518936249616
- 0.4129443269365589
- 0.3921923594846631
- 0.4090115055044366
- 0.3886609917490931
- 0.4095532718777094
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.627004544222814
- type: v_measures
value:
- 0.3741266682616607
- 0.3781394287203381
- 0.3643317752911855
- 0.3477165800267488
- 0.36601830150988385
- 0.36559335998150805
- 0.36829334525379803
- 0.37360369040259567
- 0.35176327187070533
- 0.37311403310385743
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 34.902
- type: map_at_10
value: 46.548
- type: map_at_100
value: 48.209
- type: map_at_1000
value: 48.327999999999996
- type: map_at_20
value: 47.488
- type: map_at_3
value: 42.844
- type: map_at_5
value: 44.849
- type: mrr_at_1
value: 42.632
- type: mrr_at_10
value: 53.03600000000001
- type: mrr_at_100
value: 53.749
- type: mrr_at_1000
value: 53.788000000000004
- type: mrr_at_20
value: 53.461999999999996
- type: mrr_at_3
value: 50.548
- type: mrr_at_5
value: 52.029
- type: ndcg_at_1
value: 42.632
- type: ndcg_at_10
value: 53.099
- type: ndcg_at_100
value: 58.568
- type: ndcg_at_1000
value: 60.245000000000005
- type: ndcg_at_20
value: 55.379
- type: ndcg_at_3
value: 48.211
- type: ndcg_at_5
value: 50.375
- type: precision_at_1
value: 42.632
- type: precision_at_10
value: 10.129000000000001
- type: precision_at_100
value: 1.6219999999999999
- type: precision_at_1000
value: 0.207
- type: precision_at_20
value: 6.116
- type: precision_at_3
value: 23.033
- type: precision_at_5
value: 16.509
- type: recall_at_1
value: 34.902
- type: recall_at_10
value: 64.761
- type: recall_at_100
value: 87.15
- type: recall_at_1000
value: 97.479
- type: recall_at_20
value: 72.775
- type: recall_at_3
value: 50.4
- type: recall_at_5
value: 56.711
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 32.266
- type: map_at_10
value: 43.149
- type: map_at_100
value: 44.416
- type: map_at_1000
value: 44.545
- type: map_at_20
value: 43.829
- type: map_at_3
value: 39.995000000000005
- type: map_at_5
value: 41.737
- type: mrr_at_1
value: 40.0
- type: mrr_at_10
value: 48.921
- type: mrr_at_100
value: 49.54
- type: mrr_at_1000
value: 49.583
- type: mrr_at_20
value: 49.289
- type: mrr_at_3
value: 46.73
- type: mrr_at_5
value: 48.036
- type: ndcg_at_1
value: 40.0
- type: ndcg_at_10
value: 48.927
- type: ndcg_at_100
value: 53.222
- type: ndcg_at_1000
value: 55.202
- type: ndcg_at_20
value: 50.585
- type: ndcg_at_3
value: 44.777
- type: ndcg_at_5
value: 46.648
- type: precision_at_1
value: 40.0
- type: precision_at_10
value: 9.312
- type: precision_at_100
value: 1.48
- type: precision_at_1000
value: 0.19499999999999998
- type: precision_at_20
value: 5.4239999999999995
- type: precision_at_3
value: 21.656
- type: precision_at_5
value: 15.338
- type: recall_at_1
value: 32.266
- type: recall_at_10
value: 58.904999999999994
- type: recall_at_100
value: 77.057
- type: recall_at_1000
value: 89.517
- type: recall_at_20
value: 65.059
- type: recall_at_3
value: 46.601
- type: recall_at_5
value: 51.93600000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 40.876000000000005
- type: map_at_10
value: 54.445
- type: map_at_100
value: 55.434000000000005
- type: map_at_1000
value: 55.486000000000004
- type: map_at_20
value: 55.089
- type: map_at_3
value: 50.751999999999995
- type: map_at_5
value: 52.905
- type: mrr_at_1
value: 46.583000000000006
- type: mrr_at_10
value: 57.55200000000001
- type: mrr_at_100
value: 58.165
- type: mrr_at_1000
value: 58.192
- type: mrr_at_20
value: 57.958
- type: mrr_at_3
value: 54.932
- type: mrr_at_5
value: 56.584
- type: ndcg_at_1
value: 46.583000000000006
- type: ndcg_at_10
value: 60.711999999999996
- type: ndcg_at_100
value: 64.35499999999999
- type: ndcg_at_1000
value: 65.348
- type: ndcg_at_20
value: 62.499
- type: ndcg_at_3
value: 54.681000000000004
- type: ndcg_at_5
value: 57.782
- type: precision_at_1
value: 46.583000000000006
- type: precision_at_10
value: 9.937
- type: precision_at_100
value: 1.265
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_20
value: 5.536
- type: precision_at_3
value: 24.66
- type: precision_at_5
value: 17.041
- type: recall_at_1
value: 40.876000000000005
- type: recall_at_10
value: 75.967
- type: recall_at_100
value: 91.335
- type: recall_at_1000
value: 98.339
- type: recall_at_20
value: 82.514
- type: recall_at_3
value: 59.917
- type: recall_at_5
value: 67.57600000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 27.834999999999997
- type: map_at_10
value: 37.159
- type: map_at_100
value: 38.211
- type: map_at_1000
value: 38.278
- type: map_at_20
value: 37.785999999999994
- type: map_at_3
value: 34.297
- type: map_at_5
value: 35.876999999999995
- type: mrr_at_1
value: 30.169
- type: mrr_at_10
value: 39.257999999999996
- type: mrr_at_100
value: 40.193
- type: mrr_at_1000
value: 40.243
- type: mrr_at_20
value: 39.843
- type: mrr_at_3
value: 36.685
- type: mrr_at_5
value: 38.126
- type: ndcg_at_1
value: 30.169
- type: ndcg_at_10
value: 42.436
- type: ndcg_at_100
value: 47.519
- type: ndcg_at_1000
value: 49.28
- type: ndcg_at_20
value: 44.629000000000005
- type: ndcg_at_3
value: 36.942
- type: ndcg_at_5
value: 39.543
- type: precision_at_1
value: 30.169
- type: precision_at_10
value: 6.531000000000001
- type: precision_at_100
value: 0.951
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_20
value: 3.763
- type: precision_at_3
value: 15.706000000000001
- type: precision_at_5
value: 10.938
- type: recall_at_1
value: 27.834999999999997
- type: recall_at_10
value: 56.716
- type: recall_at_100
value: 79.85
- type: recall_at_1000
value: 93.03399999999999
- type: recall_at_20
value: 65.076
- type: recall_at_3
value: 41.784
- type: recall_at_5
value: 48.031
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 18.941
- type: map_at_10
value: 27.881
- type: map_at_100
value: 29.085
- type: map_at_1000
value: 29.211
- type: map_at_20
value: 28.493000000000002
- type: map_at_3
value: 24.959999999999997
- type: map_at_5
value: 26.604
- type: mrr_at_1
value: 23.383000000000003
- type: mrr_at_10
value: 32.849000000000004
- type: mrr_at_100
value: 33.732
- type: mrr_at_1000
value: 33.803
- type: mrr_at_20
value: 33.347
- type: mrr_at_3
value: 30.037000000000003
- type: mrr_at_5
value: 31.555
- type: ndcg_at_1
value: 23.383000000000003
- type: ndcg_at_10
value: 33.585
- type: ndcg_at_100
value: 39.187
- type: ndcg_at_1000
value: 41.993
- type: ndcg_at_20
value: 35.582
- type: ndcg_at_3
value: 28.258
- type: ndcg_at_5
value: 30.714999999999996
- type: precision_at_1
value: 23.383000000000003
- type: precision_at_10
value: 6.182
- type: precision_at_100
value: 1.04
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_20
value: 3.675
- type: precision_at_3
value: 13.639999999999999
- type: precision_at_5
value: 9.950000000000001
- type: recall_at_1
value: 18.941
- type: recall_at_10
value: 46.225
- type: recall_at_100
value: 70.416
- type: recall_at_1000
value: 90.252
- type: recall_at_20
value: 53.198
- type: recall_at_3
value: 31.483
- type: recall_at_5
value: 37.774
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 32.190000000000005
- type: map_at_10
value: 43.183
- type: map_at_100
value: 44.467
- type: map_at_1000
value: 44.580999999999996
- type: map_at_20
value: 43.874
- type: map_at_3
value: 39.672000000000004
- type: map_at_5
value: 41.719
- type: mrr_at_1
value: 39.461
- type: mrr_at_10
value: 48.903999999999996
- type: mrr_at_100
value: 49.688
- type: mrr_at_1000
value: 49.729
- type: mrr_at_20
value: 49.349
- type: mrr_at_3
value: 46.439
- type: mrr_at_5
value: 47.964
- type: ndcg_at_1
value: 39.461
- type: ndcg_at_10
value: 49.307
- type: ndcg_at_100
value: 54.544000000000004
- type: ndcg_at_1000
value: 56.499
- type: ndcg_at_20
value: 51.356
- type: ndcg_at_3
value: 43.956
- type: ndcg_at_5
value: 46.662
- type: precision_at_1
value: 39.461
- type: precision_at_10
value: 8.826
- type: precision_at_100
value: 1.323
- type: precision_at_1000
value: 0.168
- type: precision_at_20
value: 5.125
- type: precision_at_3
value: 20.629
- type: precision_at_5
value: 14.745
- type: recall_at_1
value: 32.190000000000005
- type: recall_at_10
value: 61.792
- type: recall_at_100
value: 83.543
- type: recall_at_1000
value: 96.009
- type: recall_at_20
value: 68.941
- type: recall_at_3
value: 46.918
- type: recall_at_5
value: 53.909
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 26.137
- type: map_at_10
value: 37.025999999999996
- type: map_at_100
value: 38.511
- type: map_at_1000
value: 38.619
- type: map_at_20
value: 37.92
- type: map_at_3
value: 33.729
- type: map_at_5
value: 35.478
- type: mrr_at_1
value: 32.192
- type: mrr_at_10
value: 42.245
- type: mrr_at_100
value: 43.172
- type: mrr_at_1000
value: 43.225
- type: mrr_at_20
value: 42.855
- type: mrr_at_3
value: 39.669
- type: mrr_at_5
value: 41.038999999999994
- type: ndcg_at_1
value: 32.192
- type: ndcg_at_10
value: 43.132
- type: ndcg_at_100
value: 49.09
- type: ndcg_at_1000
value: 51.248000000000005
- type: ndcg_at_20
value: 45.802
- type: ndcg_at_3
value: 37.796
- type: ndcg_at_5
value: 40.064
- type: precision_at_1
value: 32.192
- type: precision_at_10
value: 8.071
- type: precision_at_100
value: 1.275
- type: precision_at_1000
value: 0.164
- type: precision_at_20
value: 4.869
- type: precision_at_3
value: 18.189
- type: precision_at_5
value: 13.059000000000001
- type: recall_at_1
value: 26.137
- type: recall_at_10
value: 55.87
- type: recall_at_100
value: 80.868
- type: recall_at_1000
value: 95.298
- type: recall_at_20
value: 65.365
- type: recall_at_3
value: 41.074
- type: recall_at_5
value: 46.945
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 27.92966666666667
- type: map_at_10
value: 37.75758333333333
- type: map_at_100
value: 38.996750000000006
- type: map_at_1000
value: 39.10941666666666
- type: map_at_20
value: 38.44558333333334
- type: map_at_3
value: 34.70758333333333
- type: map_at_5
value: 36.39783333333333
- type: mrr_at_1
value: 33.07458333333333
- type: mrr_at_10
value: 42.112750000000005
- type: mrr_at_100
value: 42.94625
- type: mrr_at_1000
value: 42.998000000000005
- type: mrr_at_20
value: 42.61133333333333
- type: mrr_at_3
value: 39.65641666666667
- type: mrr_at_5
value: 41.06275
- type: ndcg_at_1
value: 33.07458333333333
- type: ndcg_at_10
value: 43.39091666666667
- type: ndcg_at_100
value: 48.568916666666674
- type: ndcg_at_1000
value: 50.666
- type: ndcg_at_20
value: 45.44491666666668
- type: ndcg_at_3
value: 38.349833333333336
- type: ndcg_at_5
value: 40.70983333333333
- type: precision_at_1
value: 33.07458333333333
- type: precision_at_10
value: 7.6090833333333325
- type: precision_at_100
value: 1.205
- type: precision_at_1000
value: 0.15808333333333335
- type: precision_at_20
value: 4.48525
- type: precision_at_3
value: 17.66225
- type: precision_at_5
value: 12.545833333333334
- type: recall_at_1
value: 27.92966666666667
- type: recall_at_10
value: 55.657999999999994
- type: recall_at_100
value: 78.20633333333335
- type: recall_at_1000
value: 92.58875
- type: recall_at_20
value: 63.13408333333332
- type: recall_at_3
value: 41.67841666666667
- type: recall_at_5
value: 47.74058333333333
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 27.488
- type: map_at_10
value: 34.160000000000004
- type: map_at_100
value: 35.036
- type: map_at_1000
value: 35.125
- type: map_at_20
value: 34.594
- type: map_at_3
value: 31.941000000000003
- type: map_at_5
value: 33.007
- type: mrr_at_1
value: 31.288
- type: mrr_at_10
value: 37.345
- type: mrr_at_100
value: 38.079
- type: mrr_at_1000
value: 38.141999999999996
- type: mrr_at_20
value: 37.749
- type: mrr_at_3
value: 35.327
- type: mrr_at_5
value: 36.301
- type: ndcg_at_1
value: 31.288
- type: ndcg_at_10
value: 38.415
- type: ndcg_at_100
value: 43.018
- type: ndcg_at_1000
value: 45.322
- type: ndcg_at_20
value: 39.921
- type: ndcg_at_3
value: 34.176
- type: ndcg_at_5
value: 35.827
- type: precision_at_1
value: 31.288
- type: precision_at_10
value: 5.844
- type: precision_at_100
value: 0.91
- type: precision_at_1000
value: 0.117
- type: precision_at_20
value: 3.351
- type: precision_at_3
value: 14.315
- type: precision_at_5
value: 9.693
- type: recall_at_1
value: 27.488
- type: recall_at_10
value: 48.777
- type: recall_at_100
value: 70.253
- type: recall_at_1000
value: 87.455
- type: recall_at_20
value: 54.309
- type: recall_at_3
value: 36.791000000000004
- type: recall_at_5
value: 40.938
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 19.085
- type: map_at_10
value: 26.579000000000004
- type: map_at_100
value: 27.814
- type: map_at_1000
value: 27.939000000000004
- type: map_at_20
value: 27.232
- type: map_at_3
value: 24.008
- type: map_at_5
value: 25.436999999999998
- type: mrr_at_1
value: 23.159
- type: mrr_at_10
value: 30.622
- type: mrr_at_100
value: 31.631999999999998
- type: mrr_at_1000
value: 31.705
- type: mrr_at_20
value: 31.186999999999998
- type: mrr_at_3
value: 28.292
- type: mrr_at_5
value: 29.669
- type: ndcg_at_1
value: 23.159
- type: ndcg_at_10
value: 31.422
- type: ndcg_at_100
value: 37.246
- type: ndcg_at_1000
value: 40.014
- type: ndcg_at_20
value: 33.568999999999996
- type: ndcg_at_3
value: 26.893
- type: ndcg_at_5
value: 29.048000000000002
- type: precision_at_1
value: 23.159
- type: precision_at_10
value: 5.736
- type: precision_at_100
value: 1.013
- type: precision_at_1000
value: 0.14300000000000002
- type: precision_at_20
value: 3.4840000000000004
- type: precision_at_3
value: 12.617999999999999
- type: precision_at_5
value: 9.195
- type: recall_at_1
value: 19.085
- type: recall_at_10
value: 41.881
- type: recall_at_100
value: 68.026
- type: recall_at_1000
value: 87.576
- type: recall_at_20
value: 49.886
- type: recall_at_3
value: 29.355999999999998
- type: recall_at_5
value: 34.946
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 28.052
- type: map_at_10
value: 37.942
- type: map_at_100
value: 39.11
- type: map_at_1000
value: 39.204
- type: map_at_20
value: 38.592
- type: map_at_3
value: 35.149
- type: map_at_5
value: 36.636
- type: mrr_at_1
value: 33.022
- type: mrr_at_10
value: 42.13
- type: mrr_at_100
value: 42.992000000000004
- type: mrr_at_1000
value: 43.045
- type: mrr_at_20
value: 42.653
- type: mrr_at_3
value: 39.754
- type: mrr_at_5
value: 41.046
- type: ndcg_at_1
value: 33.022
- type: ndcg_at_10
value: 43.588
- type: ndcg_at_100
value: 48.844
- type: ndcg_at_1000
value: 50.87199999999999
- type: ndcg_at_20
value: 45.634
- type: ndcg_at_3
value: 38.653
- type: ndcg_at_5
value: 40.827000000000005
- type: precision_at_1
value: 33.022
- type: precision_at_10
value: 7.239
- type: precision_at_100
value: 1.126
- type: precision_at_1000
value: 0.14100000000000001
- type: precision_at_20
value: 4.2299999999999995
- type: precision_at_3
value: 17.755000000000003
- type: precision_at_5
value: 12.239
- type: recall_at_1
value: 28.052
- type: recall_at_10
value: 56.518
- type: recall_at_100
value: 79.081
- type: recall_at_1000
value: 93.096
- type: recall_at_20
value: 63.65
- type: recall_at_3
value: 43.061
- type: recall_at_5
value: 48.588
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 24.698
- type: map_at_10
value: 34.162
- type: map_at_100
value: 35.862
- type: map_at_1000
value: 36.087
- type: map_at_20
value: 35.049
- type: map_at_3
value: 31.172
- type: map_at_5
value: 32.814
- type: mrr_at_1
value: 30.237000000000002
- type: mrr_at_10
value: 39.461
- type: mrr_at_100
value: 40.514
- type: mrr_at_1000
value: 40.552
- type: mrr_at_20
value: 40.091
- type: mrr_at_3
value: 37.088
- type: mrr_at_5
value: 38.383
- type: ndcg_at_1
value: 30.237000000000002
- type: ndcg_at_10
value: 40.308
- type: ndcg_at_100
value: 46.792
- type: ndcg_at_1000
value: 48.931999999999995
- type: ndcg_at_20
value: 42.748999999999995
- type: ndcg_at_3
value: 35.541
- type: ndcg_at_5
value: 37.812
- type: precision_at_1
value: 30.237000000000002
- type: precision_at_10
value: 7.846
- type: precision_at_100
value: 1.599
- type: precision_at_1000
value: 0.247
- type: precision_at_20
value: 4.96
- type: precision_at_3
value: 16.93
- type: precision_at_5
value: 12.49
- type: recall_at_1
value: 24.698
- type: recall_at_10
value: 51.74999999999999
- type: recall_at_100
value: 80.767
- type: recall_at_1000
value: 93.569
- type: recall_at_20
value: 61.157
- type: recall_at_3
value: 38.344
- type: recall_at_5
value: 44.184
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 22.686
- type: map_at_10
value: 30.857
- type: map_at_100
value: 31.806
- type: map_at_1000
value: 31.91
- type: map_at_20
value: 31.401
- type: map_at_3
value: 27.972
- type: map_at_5
value: 29.711
- type: mrr_at_1
value: 24.769
- type: mrr_at_10
value: 33.03
- type: mrr_at_100
value: 33.899
- type: mrr_at_1000
value: 33.969
- type: mrr_at_20
value: 33.553
- type: mrr_at_3
value: 30.375999999999998
- type: mrr_at_5
value: 32.021
- type: ndcg_at_1
value: 24.769
- type: ndcg_at_10
value: 35.76
- type: ndcg_at_100
value: 40.442
- type: ndcg_at_1000
value: 43.037
- type: ndcg_at_20
value: 37.634
- type: ndcg_at_3
value: 30.314000000000004
- type: ndcg_at_5
value: 33.215
- type: precision_at_1
value: 24.769
- type: precision_at_10
value: 5.656
- type: precision_at_100
value: 0.856
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 3.29
- type: precision_at_3
value: 12.815999999999999
- type: precision_at_5
value: 9.353
- type: recall_at_1
value: 22.686
- type: recall_at_10
value: 48.734
- type: recall_at_100
value: 70.13000000000001
- type: recall_at_1000
value: 89.441
- type: recall_at_20
value: 55.679
- type: recall_at_3
value: 34.412
- type: recall_at_5
value: 41.349000000000004
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 12.842999999999998
- type: map_at_10
value: 21.776999999999997
- type: map_at_100
value: 23.796
- type: map_at_1000
value: 23.987
- type: map_at_20
value: 22.889
- type: map_at_3
value: 18.144
- type: map_at_5
value: 19.921
- type: mrr_at_1
value: 28.794999999999998
- type: mrr_at_10
value: 40.261
- type: mrr_at_100
value: 41.187000000000005
- type: mrr_at_1000
value: 41.224
- type: mrr_at_20
value: 40.853
- type: mrr_at_3
value: 36.895
- type: mrr_at_5
value: 38.781
- type: ndcg_at_1
value: 28.794999999999998
- type: ndcg_at_10
value: 30.37
- type: ndcg_at_100
value: 37.936
- type: ndcg_at_1000
value: 41.332
- type: ndcg_at_20
value: 33.452
- type: ndcg_at_3
value: 24.723
- type: ndcg_at_5
value: 26.562
- type: precision_at_1
value: 28.794999999999998
- type: precision_at_10
value: 9.498
- type: precision_at_100
value: 1.7590000000000001
- type: precision_at_1000
value: 0.23900000000000002
- type: precision_at_20
value: 6.085
- type: precision_at_3
value: 18.284
- type: precision_at_5
value: 14.046
- type: recall_at_1
value: 12.842999999999998
- type: recall_at_10
value: 36.524
- type: recall_at_100
value: 62.197
- type: recall_at_1000
value: 81.25
- type: recall_at_20
value: 45.21
- type: recall_at_3
value: 22.549
- type: recall_at_5
value: 27.938000000000002
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.041
- type: map_at_10
value: 20.801
- type: map_at_100
value: 30.377
- type: map_at_1000
value: 32.106
- type: map_at_20
value: 24.453
- type: map_at_3
value: 14.698
- type: map_at_5
value: 17.301
- type: mrr_at_1
value: 67.75
- type: mrr_at_10
value: 76.409
- type: mrr_at_100
value: 76.727
- type: mrr_at_1000
value: 76.73400000000001
- type: mrr_at_20
value: 76.669
- type: mrr_at_3
value: 74.833
- type: mrr_at_5
value: 75.783
- type: ndcg_at_1
value: 55.875
- type: ndcg_at_10
value: 43.308
- type: ndcg_at_100
value: 49.183
- type: ndcg_at_1000
value: 56.660999999999994
- type: ndcg_at_20
value: 43.074
- type: ndcg_at_3
value: 47.758
- type: ndcg_at_5
value: 45.111000000000004
- type: precision_at_1
value: 67.75
- type: precision_at_10
value: 34.8
- type: precision_at_100
value: 11.417
- type: precision_at_1000
value: 2.114
- type: precision_at_20
value: 26.712000000000003
- type: precision_at_3
value: 52.25
- type: precision_at_5
value: 44.45
- type: recall_at_1
value: 9.041
- type: recall_at_10
value: 26.863999999999997
- type: recall_at_100
value: 57.403999999999996
- type: recall_at_1000
value: 81.22200000000001
- type: recall_at_20
value: 35.132999999999996
- type: recall_at_3
value: 15.955
- type: recall_at_5
value: 20.304
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 51.934999999999995
- type: f1
value: 46.90330636364514
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 70.231
- type: map_at_10
value: 79.506
- type: map_at_100
value: 79.777
- type: map_at_1000
value: 79.794
- type: map_at_20
value: 79.69000000000001
- type: map_at_3
value: 78.237
- type: map_at_5
value: 79.061
- type: mrr_at_1
value: 75.728
- type: mrr_at_10
value: 83.839
- type: mrr_at_100
value: 83.965
- type: mrr_at_1000
value: 83.97
- type: mrr_at_20
value: 83.93
- type: mrr_at_3
value: 82.908
- type: mrr_at_5
value: 83.539
- type: ndcg_at_1
value: 75.728
- type: ndcg_at_10
value: 83.576
- type: ndcg_at_100
value: 84.544
- type: ndcg_at_1000
value: 84.868
- type: ndcg_at_20
value: 84.096
- type: ndcg_at_3
value: 81.49499999999999
- type: ndcg_at_5
value: 82.69999999999999
- type: precision_at_1
value: 75.728
- type: precision_at_10
value: 10.174
- type: precision_at_100
value: 1.085
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.234
- type: precision_at_3
value: 31.383
- type: precision_at_5
value: 19.625
- type: recall_at_1
value: 70.231
- type: recall_at_10
value: 91.774
- type: recall_at_100
value: 95.639
- type: recall_at_1000
value: 97.78
- type: recall_at_20
value: 93.60300000000001
- type: recall_at_3
value: 86.107
- type: recall_at_5
value: 89.164
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 22.043
- type: map_at_10
value: 36.831
- type: map_at_100
value: 38.929
- type: map_at_1000
value: 39.102
- type: map_at_20
value: 38.039
- type: map_at_3
value: 32.202999999999996
- type: map_at_5
value: 35.04
- type: mrr_at_1
value: 43.980999999999995
- type: mrr_at_10
value: 53.592
- type: mrr_at_100
value: 54.384
- type: mrr_at_1000
value: 54.413999999999994
- type: mrr_at_20
value: 54.118
- type: mrr_at_3
value: 51.595
- type: mrr_at_5
value: 52.744
- type: ndcg_at_1
value: 43.980999999999995
- type: ndcg_at_10
value: 45.009
- type: ndcg_at_100
value: 52.129000000000005
- type: ndcg_at_1000
value: 54.788000000000004
- type: ndcg_at_20
value: 48.001
- type: ndcg_at_3
value: 41.46
- type: ndcg_at_5
value: 42.797000000000004
- type: precision_at_1
value: 43.980999999999995
- type: precision_at_10
value: 12.438
- type: precision_at_100
value: 1.9800000000000002
- type: precision_at_1000
value: 0.246
- type: precision_at_20
value: 7.515
- type: precision_at_3
value: 27.881
- type: precision_at_5
value: 20.463
- type: recall_at_1
value: 22.043
- type: recall_at_10
value: 51.796
- type: recall_at_100
value: 77.888
- type: recall_at_1000
value: 93.459
- type: recall_at_20
value: 60.953
- type: recall_at_3
value: 37.779
- type: recall_at_5
value: 44.666
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 39.061
- type: map_at_10
value: 62.934999999999995
- type: map_at_100
value: 63.844
- type: map_at_1000
value: 63.904
- type: map_at_20
value: 63.479
- type: map_at_3
value: 59.15899999999999
- type: map_at_5
value: 61.499
- type: mrr_at_1
value: 78.123
- type: mrr_at_10
value: 84.059
- type: mrr_at_100
value: 84.235
- type: mrr_at_1000
value: 84.241
- type: mrr_at_20
value: 84.16799999999999
- type: mrr_at_3
value: 83.086
- type: mrr_at_5
value: 83.709
- type: ndcg_at_1
value: 78.123
- type: ndcg_at_10
value: 71.26
- type: ndcg_at_100
value: 74.372
- type: ndcg_at_1000
value: 75.484
- type: ndcg_at_20
value: 72.587
- type: ndcg_at_3
value: 65.984
- type: ndcg_at_5
value: 68.89699999999999
- type: precision_at_1
value: 78.123
- type: precision_at_10
value: 15.076
- type: precision_at_100
value: 1.7500000000000002
- type: precision_at_1000
value: 0.19
- type: precision_at_20
value: 7.964
- type: precision_at_3
value: 42.494
- type: precision_at_5
value: 27.792
- type: recall_at_1
value: 39.061
- type: recall_at_10
value: 75.381
- type: recall_at_100
value: 87.522
- type: recall_at_1000
value: 94.828
- type: recall_at_20
value: 79.642
- type: recall_at_3
value: 63.741
- type: recall_at_5
value: 69.48
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 91.9088
- type: ap
value: 88.23414041783927
- type: f1
value: 91.8949910564831
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 22.102
- type: map_at_10
value: 34.666999999999994
- type: map_at_100
value: 35.849
- type: map_at_1000
value: 35.897
- type: map_at_20
value: 35.415
- type: map_at_3
value: 30.805
- type: map_at_5
value: 33.042
- type: mrr_at_1
value: 22.665
- type: mrr_at_10
value: 35.276999999999994
- type: mrr_at_100
value: 36.388999999999996
- type: mrr_at_1000
value: 36.43
- type: mrr_at_20
value: 35.984
- type: mrr_at_3
value: 31.453999999999997
- type: mrr_at_5
value: 33.701
- type: ndcg_at_1
value: 22.665
- type: ndcg_at_10
value: 41.63
- type: ndcg_at_100
value: 47.257
- type: ndcg_at_1000
value: 48.425000000000004
- type: ndcg_at_20
value: 44.26
- type: ndcg_at_3
value: 33.756
- type: ndcg_at_5
value: 37.771
- type: precision_at_1
value: 22.665
- type: precision_at_10
value: 6.583
- type: precision_at_100
value: 0.9400000000000001
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 3.837
- type: precision_at_3
value: 14.379
- type: precision_at_5
value: 10.662
- type: recall_at_1
value: 22.102
- type: recall_at_10
value: 63.007000000000005
- type: recall_at_100
value: 88.942
- type: recall_at_1000
value: 97.80799999999999
- type: recall_at_20
value: 73.195
- type: recall_at_3
value: 41.632000000000005
- type: recall_at_5
value: 51.275999999999996
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.32512539899682
- type: f1
value: 94.08399309589969
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.60510715914273
- type: f1
value: 58.21529064999782
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 75.90786819098857
- type: f1
value: 74.0025337373784
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 79.43174176193679
- type: f1
value: 79.80377677179487
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.625500288734244
- type: v_measures
value:
- 0.32171864455851634
- 0.31428872473108405
- 0.3221614340024842
- 0.317125267818034
- 0.32845342292625135
- 0.35982274887039417
- 0.34472428116610876
- 0.35581025975227415
- 0.3572089105669247
- 0.34123633448135204
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.70226358971163
- type: v_measures
value:
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- 0.3113303503923461
- 0.3110505880489972
- 0.3043937275772366
- 0.3078312071388611
- 0.29784108532872844
- 0.3015334433877242
- 0.33960791546500374
- 0.31978896807138224
- 0.3451038707366554
- 0.3317452028242281
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.77671285103453
- type: mrr
value: 34.069523934828844
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 7.281
- type: map_at_10
value: 15.652
- type: map_at_100
value: 20.165
- type: map_at_1000
value: 21.834
- type: map_at_20
value: 17.604
- type: map_at_3
value: 11.363
- type: map_at_5
value: 13.418
- type: mrr_at_1
value: 49.536
- type: mrr_at_10
value: 58.689
- type: mrr_at_100
value: 59.153
- type: mrr_at_1000
value: 59.184000000000005
- type: mrr_at_20
value: 58.958999999999996
- type: mrr_at_3
value: 56.192
- type: mrr_at_5
value: 57.91
- type: ndcg_at_1
value: 47.214
- type: ndcg_at_10
value: 39.126
- type: ndcg_at_100
value: 36.852000000000004
- type: ndcg_at_1000
value: 45.65
- type: ndcg_at_20
value: 37.263000000000005
- type: ndcg_at_3
value: 43.804
- type: ndcg_at_5
value: 42.01
- type: precision_at_1
value: 48.607
- type: precision_at_10
value: 28.762
- type: precision_at_100
value: 9.316
- type: precision_at_1000
value: 2.254
- type: precision_at_20
value: 21.95
- type: precision_at_3
value: 40.660000000000004
- type: precision_at_5
value: 35.913000000000004
- type: recall_at_1
value: 7.281
- type: recall_at_10
value: 20.006
- type: recall_at_100
value: 37.525
- type: recall_at_1000
value: 69.112
- type: recall_at_20
value: 24.396
- type: recall_at_3
value: 12.249
- type: recall_at_5
value: 15.946
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 30.779
- type: map_at_10
value: 46.973
- type: map_at_100
value: 47.964
- type: map_at_1000
value: 47.99
- type: map_at_20
value: 47.653
- type: map_at_3
value: 42.323
- type: map_at_5
value: 45.076
- type: mrr_at_1
value: 34.82
- type: mrr_at_10
value: 49.458999999999996
- type: mrr_at_100
value: 50.17700000000001
- type: mrr_at_1000
value: 50.195
- type: mrr_at_20
value: 49.968
- type: mrr_at_3
value: 45.606
- type: mrr_at_5
value: 47.946
- type: ndcg_at_1
value: 34.82
- type: ndcg_at_10
value: 55.131
- type: ndcg_at_100
value: 59.17400000000001
- type: ndcg_at_1000
value: 59.763
- type: ndcg_at_20
value: 57.306999999999995
- type: ndcg_at_3
value: 46.455
- type: ndcg_at_5
value: 51.034
- type: precision_at_1
value: 34.82
- type: precision_at_10
value: 9.241000000000001
- type: precision_at_100
value: 1.1520000000000001
- type: precision_at_1000
value: 0.121
- type: precision_at_20
value: 5.1450000000000005
- type: precision_at_3
value: 21.34
- type: precision_at_5
value: 15.423
- type: recall_at_1
value: 30.779
- type: recall_at_10
value: 77.424
- type: recall_at_100
value: 94.728
- type: recall_at_1000
value: 99.104
- type: recall_at_20
value: 85.458
- type: recall_at_3
value: 55.113
- type: recall_at_5
value: 65.67
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.588
- type: map_at_10
value: 85.57000000000001
- type: map_at_100
value: 86.20100000000001
- type: map_at_1000
value: 86.215
- type: map_at_20
value: 85.982
- type: map_at_3
value: 82.722
- type: map_at_5
value: 84.493
- type: mrr_at_1
value: 82.46
- type: mrr_at_10
value: 88.369
- type: mrr_at_100
value: 88.47
- type: mrr_at_1000
value: 88.47
- type: mrr_at_20
value: 88.449
- type: mrr_at_3
value: 87.485
- type: mrr_at_5
value: 88.098
- type: ndcg_at_1
value: 82.43
- type: ndcg_at_10
value: 89.119
- type: ndcg_at_100
value: 90.29700000000001
- type: ndcg_at_1000
value: 90.363
- type: ndcg_at_20
value: 89.77199999999999
- type: ndcg_at_3
value: 86.504
- type: ndcg_at_5
value: 87.934
- type: precision_at_1
value: 82.43
- type: precision_at_10
value: 13.501
- type: precision_at_100
value: 1.537
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.156999999999999
- type: precision_at_3
value: 37.877
- type: precision_at_5
value: 24.8
- type: recall_at_1
value: 71.588
- type: recall_at_10
value: 95.8
- type: recall_at_100
value: 99.74499999999999
- type: recall_at_1000
value: 99.99
- type: recall_at_20
value: 97.89
- type: recall_at_3
value: 88.15899999999999
- type: recall_at_5
value: 92.35
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 59.768148638646366
- type: v_measures
value:
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- 0.5809508283156773
- 0.6058754106824659
- 0.5543273885232877
- 0.5550793562936995
- 0.5610321573899796
- 0.5465207723453963
- 0.6124039455399534
- 0.6122329444911133
- 0.6037455892428413
- 0.6976772376865306
- 0.5322120114350026
- 0.6379349647684484
- 0.6921368790765298
- 0.5727065016099465
- 0.5745163060848133
- 0.5448674469960029
- 0.5689739419054519
- 0.6906211718192629
- 0.6139477505121778
- 0.5446302056704384
- 0.6147853105210672
- 0.6591724865246826
- 0.5493814748704007
- 0.6297042175504105
- 0.5866008598060115
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 63.79386989587679
- type: v_measures
value:
- 0.685339740760473
- 0.6672770984266047
- 0.6571679210172714
- 0.38659086540986226
- 0.7186082307389922
- 0.6319336711822882
- 0.42481527019225845
- 0.7509880075010729
- 0.7214601588149115
- 0.7352060255439448
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 5.143
- type: map_at_10
value: 14.493
- type: map_at_100
value: 17.131
- type: map_at_1000
value: 17.527
- type: map_at_20
value: 15.815999999999999
- type: map_at_3
value: 10.133000000000001
- type: map_at_5
value: 12.288
- type: mrr_at_1
value: 25.4
- type: mrr_at_10
value: 38.671
- type: mrr_at_100
value: 39.715
- type: mrr_at_1000
value: 39.745999999999995
- type: mrr_at_20
value: 39.333
- type: mrr_at_3
value: 35.467
- type: mrr_at_5
value: 37.347
- type: ndcg_at_1
value: 25.4
- type: ndcg_at_10
value: 23.785
- type: ndcg_at_100
value: 33.478
- type: ndcg_at_1000
value: 39.425
- type: ndcg_at_20
value: 27.156999999999996
- type: ndcg_at_3
value: 22.597
- type: ndcg_at_5
value: 19.798
- type: precision_at_1
value: 25.4
- type: precision_at_10
value: 12.520000000000001
- type: precision_at_100
value: 2.662
- type: precision_at_1000
value: 0.40800000000000003
- type: precision_at_20
value: 8.215
- type: precision_at_3
value: 21.767
- type: precision_at_5
value: 17.8
- type: recall_at_1
value: 5.143
- type: recall_at_10
value: 25.378
- type: recall_at_100
value: 54.032000000000004
- type: recall_at_1000
value: 82.73
- type: recall_at_20
value: 33.312000000000005
- type: recall_at_3
value: 13.222999999999999
- type: recall_at_5
value: 18.062
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 87.57401378797366
- type: cos_sim_spearman
value: 82.83001707430854
- type: euclidean_pearson
value: 84.86793164498624
- type: euclidean_spearman
value: 82.55413453843204
- type: manhattan_pearson
value: 84.8851834466949
- type: manhattan_spearman
value: 82.5582994454054
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.42938681941963
- type: cos_sim_spearman
value: 78.65009395911503
- type: euclidean_pearson
value: 85.83478468305478
- type: euclidean_spearman
value: 79.01427999514746
- type: manhattan_pearson
value: 85.81496883353536
- type: manhattan_spearman
value: 78.99456935403117
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 89.44529804367387
- type: cos_sim_spearman
value: 90.00142148909681
- type: euclidean_pearson
value: 89.00052026000864
- type: euclidean_spearman
value: 89.86653252628048
- type: manhattan_pearson
value: 88.95743893759386
- type: manhattan_spearman
value: 89.83494500063517
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 87.45360957773492
- type: cos_sim_spearman
value: 84.96999168443674
- type: euclidean_pearson
value: 86.73163292656861
- type: euclidean_spearman
value: 85.16035306962318
- type: manhattan_pearson
value: 86.71055630525136
- type: manhattan_spearman
value: 85.14629965640846
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.63706368456388
- type: cos_sim_spearman
value: 89.81153125001883
- type: euclidean_pearson
value: 88.83649620738461
- type: euclidean_spearman
value: 89.47909072703986
- type: manhattan_pearson
value: 88.83193018422992
- type: manhattan_spearman
value: 89.47672272039262
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.34235491663839
- type: cos_sim_spearman
value: 86.70854613787373
- type: euclidean_pearson
value: 85.73730484853073
- type: euclidean_spearman
value: 86.28313894663437
- type: manhattan_pearson
value: 85.70285004041696
- type: manhattan_spearman
value: 86.26723700895138
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 90.10976781396273
- type: cos_sim_spearman
value: 89.79699475327726
- type: euclidean_pearson
value: 89.51007666708566
- type: euclidean_spearman
value: 88.97696159087126
- type: manhattan_pearson
value: 89.5441850001744
- type: manhattan_spearman
value: 89.04684488385651
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 69.8918539910347
- type: cos_sim_spearman
value: 69.66706227647323
- type: euclidean_pearson
value: 70.87888342240508
- type: euclidean_spearman
value: 69.34119085154248
- type: manhattan_pearson
value: 70.8912286820092
- type: manhattan_spearman
value: 69.5009524916871
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.29883016932499
- type: cos_sim_spearman
value: 88.76691675006461
- type: euclidean_pearson
value: 88.20225127014815
- type: euclidean_spearman
value: 88.48087977970427
- type: manhattan_pearson
value: 88.2072233596074
- type: manhattan_spearman
value: 88.47336658990169
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.61294576605022
- type: mrr
value: 96.31477092261404
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 60.260999999999996
- type: map_at_10
value: 70.462
- type: map_at_100
value: 70.86200000000001
- type: map_at_1000
value: 70.884
- type: map_at_20
value: 70.75
- type: map_at_3
value: 67.422
- type: map_at_5
value: 68.95400000000001
- type: mrr_at_1
value: 63.0
- type: mrr_at_10
value: 71.435
- type: mrr_at_100
value: 71.755
- type: mrr_at_1000
value: 71.776
- type: mrr_at_20
value: 71.65599999999999
- type: mrr_at_3
value: 69.167
- type: mrr_at_5
value: 70.467
- type: ndcg_at_1
value: 63.0
- type: ndcg_at_10
value: 75.247
- type: ndcg_at_100
value: 76.926
- type: ndcg_at_1000
value: 77.402
- type: ndcg_at_20
value: 76.164
- type: ndcg_at_3
value: 69.966
- type: ndcg_at_5
value: 72.25200000000001
- type: precision_at_1
value: 63.0
- type: precision_at_10
value: 10.100000000000001
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.25
- type: precision_at_3
value: 27.222
- type: precision_at_5
value: 17.933
- type: recall_at_1
value: 60.260999999999996
- type: recall_at_10
value: 88.98899999999999
- type: recall_at_100
value: 96.5
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 92.43299999999999
- type: recall_at_3
value: 74.506
- type: recall_at_5
value: 80.217
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.86039603960396
- type: cos_sim_ap
value: 96.87211054415707
- type: cos_sim_f1
value: 92.98856290402784
- type: cos_sim_precision
value: 92.48269040553907
- type: cos_sim_recall
value: 93.5
- type: dot_accuracy
value: 99.7990099009901
- type: dot_ap
value: 94.78284318973266
- type: dot_f1
value: 89.66921119592874
- type: dot_precision
value: 91.29533678756476
- type: dot_recall
value: 88.1
- type: euclidean_accuracy
value: 99.85643564356435
- type: euclidean_ap
value: 96.67239701870625
- type: euclidean_f1
value: 92.68784669692386
- type: euclidean_precision
value: 93.48931841302137
- type: euclidean_recall
value: 91.9
- type: manhattan_accuracy
value: 99.85643564356435
- type: manhattan_ap
value: 96.68690502730702
- type: manhattan_f1
value: 92.77528649725959
- type: manhattan_precision
value: 92.45283018867924
- type: manhattan_recall
value: 93.10000000000001
- type: max_accuracy
value: 99.86039603960396
- type: max_ap
value: 96.87211054415707
- type: max_f1
value: 92.98856290402784
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 66.31370326221715
- type: v_measures
value:
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- 0.7514629687485599
- 0.681958643043456
- 0.6642940617995263
- 0.7561680417689742
- 0.7498978187962102
- 0.7301260712898894
- 0.7003387387226521
- 0.5992390733013627
- 0.6432534258532143
- 0.636711109132664
- 0.6521000127954999
- 0.6454306128108777
- 0.649844033868562
- 0.6535706751600052
- 0.6241243444770364
- 0.6078934634355351
- 0.6553296616588102
- 0.6600738065797027
- 0.6746641255810865
- 0.6622536304657264
- 0.5847387141663161
- 0.6768822443352012
- 0.6726638120725165
- 0.6213993488349456
- 0.6240073768559564
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.98820897729802
- type: v_measures
value:
- 0.3416086542475584
- 0.33553801938401057
- 0.3379031258272391
- 0.3272007883428814
- 0.33661116022078547
- 0.37447130128552275
- 0.3579365983958137
- 0.36973965776864
- 0.36816341684304726
- 0.3496481754143038
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.185955556406554
- type: mrr
value: 56.137862341906455
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.657368209428938
- type: cos_sim_spearman
value: 31.926391208280304
- type: dot_pearson
value: 28.723660986211748
- type: dot_spearman
value: 29.051223656612642
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.218
- type: map_at_10
value: 1.746
- type: map_at_100
value: 9.815
- type: map_at_1000
value: 24.196
- type: map_at_20
value: 3.097
- type: map_at_3
value: 0.616
- type: map_at_5
value: 0.991
- type: mrr_at_1
value: 80.0
- type: mrr_at_10
value: 88.667
- type: mrr_at_100
value: 88.667
- type: mrr_at_1000
value: 88.667
- type: mrr_at_20
value: 88.667
- type: mrr_at_3
value: 87.667
- type: mrr_at_5
value: 88.667
- type: ndcg_at_1
value: 73.0
- type: ndcg_at_10
value: 69.377
- type: ndcg_at_100
value: 53.878
- type: ndcg_at_1000
value: 49.589
- type: ndcg_at_20
value: 66.31
- type: ndcg_at_3
value: 74.654
- type: ndcg_at_5
value: 73.56899999999999
- type: precision_at_1
value: 80.0
- type: precision_at_10
value: 73.8
- type: precision_at_100
value: 55.74
- type: precision_at_1000
value: 21.814
- type: precision_at_20
value: 70.3
- type: precision_at_3
value: 80.0
- type: precision_at_5
value: 78.0
- type: recall_at_1
value: 0.218
- type: recall_at_10
value: 1.983
- type: recall_at_100
value: 13.499
- type: recall_at_1000
value: 46.869
- type: recall_at_20
value: 3.703
- type: recall_at_3
value: 0.656
- type: recall_at_5
value: 1.0739999999999998
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.358
- type: map_at_10
value: 9.494
- type: map_at_100
value: 15.809999999999999
- type: map_at_1000
value: 17.308
- type: map_at_20
value: 12.171
- type: map_at_3
value: 4.727
- type: map_at_5
value: 6.798
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 44.615
- type: mrr_at_100
value: 45.794000000000004
- type: mrr_at_1000
value: 45.812999999999995
- type: mrr_at_20
value: 45.519999999999996
- type: mrr_at_3
value: 41.156
- type: mrr_at_5
value: 42.483
- type: ndcg_at_1
value: 26.531
- type: ndcg_at_10
value: 23.115
- type: ndcg_at_100
value: 36.082
- type: ndcg_at_1000
value: 47.467999999999996
- type: ndcg_at_20
value: 25.224999999999998
- type: ndcg_at_3
value: 25.238
- type: ndcg_at_5
value: 24.299
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 20.816000000000003
- type: precision_at_100
value: 7.796
- type: precision_at_1000
value: 1.545
- type: precision_at_20
value: 17.347
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 25.306
- type: recall_at_1
value: 2.358
- type: recall_at_10
value: 15.433
- type: recall_at_100
value: 48.715
- type: recall_at_1000
value: 83.574
- type: recall_at_20
value: 24.038999999999998
- type: recall_at_3
value: 5.652
- type: recall_at_5
value: 9.327
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 67.9052734375
- type: ap
value: 12.464903195452706
- type: f1
value: 51.75730802861531
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 59.21618562535371
- type: f1
value: 59.5671083304645
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 52.98411009798346
- type: v_measures
value:
- 0.5200339262530909
- 0.5659398224299081
- 0.5188653146880523
- 0.5498624282889892
- 0.49132181885931403
- 0.5312510012188089
- 0.5351846001585449
- 0.540629373100899
- 0.5278341181497205
- 0.5174886066510178
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.30404720748643
- type: cos_sim_ap
value: 78.24262856109937
- type: cos_sim_f1
value: 72.08312468703055
- type: cos_sim_precision
value: 68.58027632205813
- type: cos_sim_recall
value: 75.96306068601582
- type: dot_accuracy
value: 84.48471121177803
- type: dot_ap
value: 67.78610175988638
- type: dot_f1
value: 63.75754527162978
- type: dot_precision
value: 60.908217203267654
- type: dot_recall
value: 66.88654353562006
- type: euclidean_accuracy
value: 87.24444179531503
- type: euclidean_ap
value: 78.16169396391096
- type: euclidean_f1
value: 72.19500244977952
- type: euclidean_precision
value: 67.37540009144948
- type: euclidean_recall
value: 77.75725593667546
- type: manhattan_accuracy
value: 87.20867854801216
- type: manhattan_ap
value: 78.10430615026713
- type: manhattan_f1
value: 72.25504677498769
- type: manhattan_precision
value: 67.72035071527456
- type: manhattan_recall
value: 77.44063324538259
- type: max_accuracy
value: 87.30404720748643
- type: max_ap
value: 78.24262856109937
- type: max_f1
value: 72.25504677498769
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.08681647067955
- type: cos_sim_ap
value: 86.10715470590844
- type: cos_sim_f1
value: 78.62958187511512
- type: cos_sim_precision
value: 75.38320265592992
- type: cos_sim_recall
value: 82.16815522020326
- type: dot_accuracy
value: 88.00985756975977
- type: dot_ap
value: 83.27536710177887
- type: dot_f1
value: 76.57026000584284
- type: dot_precision
value: 72.82578494026119
- type: dot_recall
value: 80.72066522944257
- type: euclidean_accuracy
value: 88.9024721543059
- type: euclidean_ap
value: 85.83507000245919
- type: euclidean_f1
value: 78.354072605807
- type: euclidean_precision
value: 74.87197474570326
- type: euclidean_recall
value: 82.17585463504774
- type: manhattan_accuracy
value: 88.90829355377032
- type: manhattan_ap
value: 85.82130285331947
- type: manhattan_f1
value: 78.28887843364338
- type: manhattan_precision
value: 73.86464522297344
- type: manhattan_recall
value: 83.2768709578072
- type: max_accuracy
value: 89.08681647067955
- type: max_ap
value: 86.10715470590844
- type: max_f1
value: 78.62958187511512
---
`b1ade-embed` is a small but efficient embedding model for RAG. On the legacy MTEB leaderboard (as of 2024), b1ade-embed was ranked #1 in the STS category and placed competitively in other important task categories such as ranking, retrieval, and classification. The model was trained using a combination of:
1. Model merging (a minimal weight-averaging sketch follows this list)
   - bert-large-uncased
   - WhereIsAI/UAE-Large-V1
   - BAAI/bge-large-en-v1.5
   - mixedbread-ai/mxbai-embed-large-v1
   - avsolatorio/GIST-large-Embedding-v0
2. Knowledge distillation from larger models
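As a rough, hypothetical illustration of the merging step (the actual recipe behind b1ade-embed is not described here and may be more sophisticated), the simplest form of model merging uniformly averages the weights of architecture-compatible checkpoints:
```
import torch
from transformers import AutoModel

# Hypothetical sketch: naive uniform weight averaging of two of the listed
# BERT-large embedding models. The real b1ade-embed merge may differ.
model_a = AutoModel.from_pretrained("BAAI/bge-large-en-v1.5")
model_b = AutoModel.from_pretrained("WhereIsAI/UAE-Large-V1")

state_a, state_b = model_a.state_dict(), model_b.state_dict()
merged = {}
for name, tensor_a in state_a.items():
    if tensor_a.dtype.is_floating_point:
        merged[name] = (tensor_a + state_b[name]) / 2.0  # uniform average
    else:
        merged[name] = tensor_a  # leave integer buffers untouched

model_a.load_state_dict(merged)
model_a.save_pretrained("merged-embedding-model")  # placeholder output path
```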
To use this model:
```
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("w601sxs/b1ade-embed")
model = AutoModel.from_pretrained("w601sxs/b1ade-embed")
```
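The snippet above only loads the tokenizer and weights. Below is a minimal sketch of producing sentence embeddings and comparing them; it assumes mean pooling over the last hidden state, a common convention for BERT-style embedding models that is not explicitly confirmed by this card, so treat the pooling choice as an assumption:
```
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("w601sxs/b1ade-embed")
model = AutoModel.from_pretrained("w601sxs/b1ade-embed")
model.eval()

sentences = ["A man is eating food.", "A man is eating a piece of bread."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# Assumed mean pooling: average token embeddings, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)

print((embeddings[0] @ embeddings[1]).item())  # cosine similarity after normalization
```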
b1ade-embed is part of a collection of small models for RAG. Stay tuned for more updates.
## Use in research
Our embedding model "b1ade-embed" is a 335M-parameter model that demonstrates strong performance across the board. In particular, recent research has used the model in clinical and labor-market domains, relying on its #1 ranking in Semantic Textual Similarity (STS) among models under 500M parameters on the MTEB leaderboard.
We've been working on b1ade-embed to optimize the balance between latency and performance. This balance is crucial in real-world applications, especially in verticalized domains, where rapid processing of vast amounts of data can significantly impact decision-making. While achieving high accuracy is important, the ability to deliver results quickly is equally vital. Larger embedding outputs also drive up storage costs in vector indexes (for example, a 1,024-dimensional float32 vector takes about 4 KB, so a 10-million-document index needs roughly 40 GB before compression), so striking a balance between task performance and latency is important.
The medRxiv paper, "A Scalable Framework for Benchmarking Embedding Models for Clinical Tasks," provides a comprehensive evaluation of embedding models in healthcare contexts. It tested 30 models across various clinical tasks (2.1M comparisons), including analysis of patient notes, synthetic EHRs, and MIMIC-IV ICU data, as well as biomedical tasks involving PubMed abstracts and research papers. The study highlights b1ade-embed's versatility across these domains:
"Other models exhibiting strong performance in both clinical and PubMed domains include 'b1ade-embed'." It also emphasizes the model's efficiency, noting that "Models like 'b1ade-embed' demonstrate high efficiency despite smaller size, making them ideal for tasks requiring rapid processing." The paper evaluated models on short tasks such as triage notes and chief complaints, where b1ade-embed achieved a high score of 27.4, competing closely with larger models.
In the labor market context, the CEUR-WS paper demonstrates b1ade-embed's effectiveness in taxonomy enrichment. The paper states, "We evaluated the robustness of our system against a closed-world evaluation constructed using ESCO's hierarchy, achieving a 81% Positive Predictive Value (PPV) when combining all three models." This high accuracy demonstrates b1ade-embed's capability to capture nuanced semantic relationships in labor market terminology. Of course, no model can be 👑 at everything: you still need to carefully evaluate task performance versus latency for your specific embedding task (STS, retrieval, clustering, etc.).
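One way to run such a per-task check is with the open-source `mteb` library; the sketch below is an illustrative example under stated assumptions (the task selection and loading the checkpoint directly through sentence-transformers are choices made here, not something this card prescribes):
```
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Assumption: the checkpoint loads via sentence-transformers; if it lacks a
# sentence-transformers config, a default mean-pooling head is applied.
model = SentenceTransformer("w601sxs/b1ade-embed")

# Pick tasks that match your use case (STS, retrieval, clustering, ...).
evaluation = MTEB(tasks=["STSBenchmark", "SciFact", "TwentyNewsgroupsClustering"])
results = evaluation.run(model, output_folder="results/b1ade-embed")
print(results)
```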
Sources:
- https://huggingface.co/spaces/mteb/leaderboard_legacy
- https://medium.com/@elias.tarnaras/full-local-open-source-lightweight-simple-rag-a0a1de586209
- https://www.medrxiv.org/content/10.1101/2024.08.14.24312010v1.full
- https://ceur-ws.org/Vol-3914/short71.pdf
- b1ade - Small RAG models collection - https://huggingface.co/collections/w601sxs/b1ade-6646958cb371ea244809c5ef
## Cite
```
@misc{subramanian_b1ade_2024,
  author    = { {Shreyas Subramanian} },
  title     = { {b1ade series of models} },
  year      = 2024,
  url       = { https://huggingface.co/w601sxs/b1ade-embed },
  publisher = { Hugging Face }
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
SmartComponents/bge-micro-v2 | SmartComponents | sentence-similarity | [
"sentence-transformers",
"pytorch",
"onnx",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"mteb",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-02-15T12:19:16 | 2024-02-15T12:38:51 | 1,668 | 2 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- mteb
model-index:
- name: bge_micro
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 67.76119402985074
- type: ap
value: 29.637849284211114
- type: f1
value: 61.31181187111905
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 79.7547
- type: ap
value: 74.21401629809145
- type: f1
value: 79.65319615433783
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 37.452000000000005
- type: f1
value: 37.0245198854966
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.152
- type: map_at_10
value: 46.702
- type: map_at_100
value: 47.563
- type: map_at_1000
value: 47.567
- type: map_at_3
value: 42.058
- type: map_at_5
value: 44.608
- type: mrr_at_1
value: 32.006
- type: mrr_at_10
value: 47.064
- type: mrr_at_100
value: 47.910000000000004
- type: mrr_at_1000
value: 47.915
- type: mrr_at_3
value: 42.283
- type: mrr_at_5
value: 44.968
- type: ndcg_at_1
value: 31.152
- type: ndcg_at_10
value: 55.308
- type: ndcg_at_100
value: 58.965
- type: ndcg_at_1000
value: 59.067
- type: ndcg_at_3
value: 45.698
- type: ndcg_at_5
value: 50.296
- type: precision_at_1
value: 31.152
- type: precision_at_10
value: 8.279
- type: precision_at_100
value: 0.987
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.753
- type: precision_at_5
value: 13.485
- type: recall_at_1
value: 31.152
- type: recall_at_10
value: 82.788
- type: recall_at_100
value: 98.72
- type: recall_at_1000
value: 99.502
- type: recall_at_3
value: 56.259
- type: recall_at_5
value: 67.425
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 44.52692241938116
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 33.245710292773595
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 58.08493637155168
- type: mrr
value: 71.94378490084861
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 84.1602804378326
- type: cos_sim_spearman
value: 82.92478106365587
- type: euclidean_pearson
value: 82.27930167277077
- type: euclidean_spearman
value: 82.18560759458093
- type: manhattan_pearson
value: 82.34277425888187
- type: manhattan_spearman
value: 81.72776583704467
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 81.17207792207792
- type: f1
value: 81.09893836310513
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 36.109308463095516
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 28.06048212317168
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.233999999999998
- type: map_at_10
value: 38.092999999999996
- type: map_at_100
value: 39.473
- type: map_at_1000
value: 39.614
- type: map_at_3
value: 34.839
- type: map_at_5
value: 36.523
- type: mrr_at_1
value: 35.193000000000005
- type: mrr_at_10
value: 44.089
- type: mrr_at_100
value: 44.927
- type: mrr_at_1000
value: 44.988
- type: mrr_at_3
value: 41.559000000000005
- type: mrr_at_5
value: 43.162
- type: ndcg_at_1
value: 35.193000000000005
- type: ndcg_at_10
value: 44.04
- type: ndcg_at_100
value: 49.262
- type: ndcg_at_1000
value: 51.847
- type: ndcg_at_3
value: 39.248
- type: ndcg_at_5
value: 41.298
- type: precision_at_1
value: 35.193000000000005
- type: precision_at_10
value: 8.555
- type: precision_at_100
value: 1.3820000000000001
- type: precision_at_1000
value: 0.189
- type: precision_at_3
value: 19.123
- type: precision_at_5
value: 13.648
- type: recall_at_1
value: 28.233999999999998
- type: recall_at_10
value: 55.094
- type: recall_at_100
value: 76.85300000000001
- type: recall_at_1000
value: 94.163
- type: recall_at_3
value: 40.782000000000004
- type: recall_at_5
value: 46.796
- type: map_at_1
value: 21.538
- type: map_at_10
value: 28.449
- type: map_at_100
value: 29.471000000000004
- type: map_at_1000
value: 29.599999999999998
- type: map_at_3
value: 26.371
- type: map_at_5
value: 27.58
- type: mrr_at_1
value: 26.815
- type: mrr_at_10
value: 33.331
- type: mrr_at_100
value: 34.114
- type: mrr_at_1000
value: 34.182
- type: mrr_at_3
value: 31.561
- type: mrr_at_5
value: 32.608
- type: ndcg_at_1
value: 26.815
- type: ndcg_at_10
value: 32.67
- type: ndcg_at_100
value: 37.039
- type: ndcg_at_1000
value: 39.769
- type: ndcg_at_3
value: 29.523
- type: ndcg_at_5
value: 31.048
- type: precision_at_1
value: 26.815
- type: precision_at_10
value: 5.955
- type: precision_at_100
value: 1.02
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 14.033999999999999
- type: precision_at_5
value: 9.911
- type: recall_at_1
value: 21.538
- type: recall_at_10
value: 40.186
- type: recall_at_100
value: 58.948
- type: recall_at_1000
value: 77.158
- type: recall_at_3
value: 30.951
- type: recall_at_5
value: 35.276
- type: map_at_1
value: 35.211999999999996
- type: map_at_10
value: 46.562
- type: map_at_100
value: 47.579
- type: map_at_1000
value: 47.646
- type: map_at_3
value: 43.485
- type: map_at_5
value: 45.206
- type: mrr_at_1
value: 40.627
- type: mrr_at_10
value: 49.928
- type: mrr_at_100
value: 50.647
- type: mrr_at_1000
value: 50.685
- type: mrr_at_3
value: 47.513
- type: mrr_at_5
value: 48.958
- type: ndcg_at_1
value: 40.627
- type: ndcg_at_10
value: 52.217
- type: ndcg_at_100
value: 56.423
- type: ndcg_at_1000
value: 57.821999999999996
- type: ndcg_at_3
value: 46.949000000000005
- type: ndcg_at_5
value: 49.534
- type: precision_at_1
value: 40.627
- type: precision_at_10
value: 8.476
- type: precision_at_100
value: 1.15
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 21.003
- type: precision_at_5
value: 14.469999999999999
- type: recall_at_1
value: 35.211999999999996
- type: recall_at_10
value: 65.692
- type: recall_at_100
value: 84.011
- type: recall_at_1000
value: 94.03099999999999
- type: recall_at_3
value: 51.404
- type: recall_at_5
value: 57.882
- type: map_at_1
value: 22.09
- type: map_at_10
value: 29.516
- type: map_at_100
value: 30.462
- type: map_at_1000
value: 30.56
- type: map_at_3
value: 26.945000000000004
- type: map_at_5
value: 28.421999999999997
- type: mrr_at_1
value: 23.616
- type: mrr_at_10
value: 31.221
- type: mrr_at_100
value: 32.057
- type: mrr_at_1000
value: 32.137
- type: mrr_at_3
value: 28.738000000000003
- type: mrr_at_5
value: 30.156
- type: ndcg_at_1
value: 23.616
- type: ndcg_at_10
value: 33.97
- type: ndcg_at_100
value: 38.806000000000004
- type: ndcg_at_1000
value: 41.393
- type: ndcg_at_3
value: 28.908
- type: ndcg_at_5
value: 31.433
- type: precision_at_1
value: 23.616
- type: precision_at_10
value: 5.299
- type: precision_at_100
value: 0.812
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 12.015
- type: precision_at_5
value: 8.701
- type: recall_at_1
value: 22.09
- type: recall_at_10
value: 46.089999999999996
- type: recall_at_100
value: 68.729
- type: recall_at_1000
value: 88.435
- type: recall_at_3
value: 32.584999999999994
- type: recall_at_5
value: 38.550000000000004
- type: map_at_1
value: 15.469
- type: map_at_10
value: 22.436
- type: map_at_100
value: 23.465
- type: map_at_1000
value: 23.608999999999998
- type: map_at_3
value: 19.716
- type: map_at_5
value: 21.182000000000002
- type: mrr_at_1
value: 18.905
- type: mrr_at_10
value: 26.55
- type: mrr_at_100
value: 27.46
- type: mrr_at_1000
value: 27.553
- type: mrr_at_3
value: 23.921999999999997
- type: mrr_at_5
value: 25.302999999999997
- type: ndcg_at_1
value: 18.905
- type: ndcg_at_10
value: 27.437
- type: ndcg_at_100
value: 32.555
- type: ndcg_at_1000
value: 35.885
- type: ndcg_at_3
value: 22.439
- type: ndcg_at_5
value: 24.666
- type: precision_at_1
value: 18.905
- type: precision_at_10
value: 5.2490000000000006
- type: precision_at_100
value: 0.889
- type: precision_at_1000
value: 0.131
- type: precision_at_3
value: 10.862
- type: precision_at_5
value: 8.085
- type: recall_at_1
value: 15.469
- type: recall_at_10
value: 38.706
- type: recall_at_100
value: 61.242
- type: recall_at_1000
value: 84.84
- type: recall_at_3
value: 24.973
- type: recall_at_5
value: 30.603
- type: map_at_1
value: 24.918000000000003
- type: map_at_10
value: 34.296
- type: map_at_100
value: 35.632000000000005
- type: map_at_1000
value: 35.748999999999995
- type: map_at_3
value: 31.304
- type: map_at_5
value: 33.166000000000004
- type: mrr_at_1
value: 30.703000000000003
- type: mrr_at_10
value: 39.655
- type: mrr_at_100
value: 40.569
- type: mrr_at_1000
value: 40.621
- type: mrr_at_3
value: 37.023
- type: mrr_at_5
value: 38.664
- type: ndcg_at_1
value: 30.703000000000003
- type: ndcg_at_10
value: 39.897
- type: ndcg_at_100
value: 45.777
- type: ndcg_at_1000
value: 48.082
- type: ndcg_at_3
value: 35.122
- type: ndcg_at_5
value: 37.691
- type: precision_at_1
value: 30.703000000000003
- type: precision_at_10
value: 7.305000000000001
- type: precision_at_100
value: 1.208
- type: precision_at_1000
value: 0.159
- type: precision_at_3
value: 16.811
- type: precision_at_5
value: 12.203999999999999
- type: recall_at_1
value: 24.918000000000003
- type: recall_at_10
value: 51.31
- type: recall_at_100
value: 76.534
- type: recall_at_1000
value: 91.911
- type: recall_at_3
value: 37.855
- type: recall_at_5
value: 44.493
- type: map_at_1
value: 22.416
- type: map_at_10
value: 30.474
- type: map_at_100
value: 31.759999999999998
- type: map_at_1000
value: 31.891000000000002
- type: map_at_3
value: 27.728
- type: map_at_5
value: 29.247
- type: mrr_at_1
value: 28.881
- type: mrr_at_10
value: 36.418
- type: mrr_at_100
value: 37.347
- type: mrr_at_1000
value: 37.415
- type: mrr_at_3
value: 33.942
- type: mrr_at_5
value: 35.386
- type: ndcg_at_1
value: 28.881
- type: ndcg_at_10
value: 35.812
- type: ndcg_at_100
value: 41.574
- type: ndcg_at_1000
value: 44.289
- type: ndcg_at_3
value: 31.239
- type: ndcg_at_5
value: 33.302
- type: precision_at_1
value: 28.881
- type: precision_at_10
value: 6.598
- type: precision_at_100
value: 1.1079999999999999
- type: precision_at_1000
value: 0.151
- type: precision_at_3
value: 14.954
- type: precision_at_5
value: 10.776
- type: recall_at_1
value: 22.416
- type: recall_at_10
value: 46.243
- type: recall_at_100
value: 71.352
- type: recall_at_1000
value: 90.034
- type: recall_at_3
value: 32.873000000000005
- type: recall_at_5
value: 38.632
- type: map_at_1
value: 22.528166666666667
- type: map_at_10
value: 30.317833333333333
- type: map_at_100
value: 31.44108333333333
- type: map_at_1000
value: 31.566666666666666
- type: map_at_3
value: 27.84425
- type: map_at_5
value: 29.233333333333334
- type: mrr_at_1
value: 26.75733333333333
- type: mrr_at_10
value: 34.24425
- type: mrr_at_100
value: 35.11375
- type: mrr_at_1000
value: 35.184333333333335
- type: mrr_at_3
value: 32.01225
- type: mrr_at_5
value: 33.31225
- type: ndcg_at_1
value: 26.75733333333333
- type: ndcg_at_10
value: 35.072583333333334
- type: ndcg_at_100
value: 40.13358333333334
- type: ndcg_at_1000
value: 42.81825
- type: ndcg_at_3
value: 30.79275000000001
- type: ndcg_at_5
value: 32.822
- type: precision_at_1
value: 26.75733333333333
- type: precision_at_10
value: 6.128083333333334
- type: precision_at_100
value: 1.019
- type: precision_at_1000
value: 0.14391666666666664
- type: precision_at_3
value: 14.129916666666665
- type: precision_at_5
value: 10.087416666666668
- type: recall_at_1
value: 22.528166666666667
- type: recall_at_10
value: 45.38341666666667
- type: recall_at_100
value: 67.81791666666668
- type: recall_at_1000
value: 86.71716666666666
- type: recall_at_3
value: 33.38741666666667
- type: recall_at_5
value: 38.62041666666667
- type: map_at_1
value: 21.975
- type: map_at_10
value: 28.144999999999996
- type: map_at_100
value: 28.994999999999997
- type: map_at_1000
value: 29.086000000000002
- type: map_at_3
value: 25.968999999999998
- type: map_at_5
value: 27.321
- type: mrr_at_1
value: 25.0
- type: mrr_at_10
value: 30.822
- type: mrr_at_100
value: 31.647
- type: mrr_at_1000
value: 31.712
- type: mrr_at_3
value: 28.860000000000003
- type: mrr_at_5
value: 30.041
- type: ndcg_at_1
value: 25.0
- type: ndcg_at_10
value: 31.929999999999996
- type: ndcg_at_100
value: 36.258
- type: ndcg_at_1000
value: 38.682
- type: ndcg_at_3
value: 27.972
- type: ndcg_at_5
value: 30.089
- type: precision_at_1
value: 25.0
- type: precision_at_10
value: 4.923
- type: precision_at_100
value: 0.767
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 11.860999999999999
- type: precision_at_5
value: 8.466
- type: recall_at_1
value: 21.975
- type: recall_at_10
value: 41.102
- type: recall_at_100
value: 60.866
- type: recall_at_1000
value: 78.781
- type: recall_at_3
value: 30.268
- type: recall_at_5
value: 35.552
- type: map_at_1
value: 15.845999999999998
- type: map_at_10
value: 21.861
- type: map_at_100
value: 22.798
- type: map_at_1000
value: 22.925
- type: map_at_3
value: 19.922
- type: map_at_5
value: 21.054000000000002
- type: mrr_at_1
value: 19.098000000000003
- type: mrr_at_10
value: 25.397
- type: mrr_at_100
value: 26.246000000000002
- type: mrr_at_1000
value: 26.33
- type: mrr_at_3
value: 23.469
- type: mrr_at_5
value: 24.646
- type: ndcg_at_1
value: 19.098000000000003
- type: ndcg_at_10
value: 25.807999999999996
- type: ndcg_at_100
value: 30.445
- type: ndcg_at_1000
value: 33.666000000000004
- type: ndcg_at_3
value: 22.292
- type: ndcg_at_5
value: 24.075
- type: precision_at_1
value: 19.098000000000003
- type: precision_at_10
value: 4.58
- type: precision_at_100
value: 0.8099999999999999
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 10.346
- type: precision_at_5
value: 7.542999999999999
- type: recall_at_1
value: 15.845999999999998
- type: recall_at_10
value: 34.172999999999995
- type: recall_at_100
value: 55.24099999999999
- type: recall_at_1000
value: 78.644
- type: recall_at_3
value: 24.401
- type: recall_at_5
value: 28.938000000000002
- type: map_at_1
value: 22.974
- type: map_at_10
value: 30.108
- type: map_at_100
value: 31.208000000000002
- type: map_at_1000
value: 31.330999999999996
- type: map_at_3
value: 27.889999999999997
- type: map_at_5
value: 29.023
- type: mrr_at_1
value: 26.493
- type: mrr_at_10
value: 33.726
- type: mrr_at_100
value: 34.622
- type: mrr_at_1000
value: 34.703
- type: mrr_at_3
value: 31.575999999999997
- type: mrr_at_5
value: 32.690999999999995
- type: ndcg_at_1
value: 26.493
- type: ndcg_at_10
value: 34.664
- type: ndcg_at_100
value: 39.725
- type: ndcg_at_1000
value: 42.648
- type: ndcg_at_3
value: 30.447999999999997
- type: ndcg_at_5
value: 32.145
- type: precision_at_1
value: 26.493
- type: precision_at_10
value: 5.7090000000000005
- type: precision_at_100
value: 0.9199999999999999
- type: precision_at_1000
value: 0.129
- type: precision_at_3
value: 13.464
- type: precision_at_5
value: 9.384
- type: recall_at_1
value: 22.974
- type: recall_at_10
value: 45.097
- type: recall_at_100
value: 66.908
- type: recall_at_1000
value: 87.495
- type: recall_at_3
value: 33.338
- type: recall_at_5
value: 37.499
- type: map_at_1
value: 22.408
- type: map_at_10
value: 29.580000000000002
- type: map_at_100
value: 31.145
- type: map_at_1000
value: 31.369000000000003
- type: map_at_3
value: 27.634999999999998
- type: map_at_5
value: 28.766000000000002
- type: mrr_at_1
value: 27.272999999999996
- type: mrr_at_10
value: 33.93
- type: mrr_at_100
value: 34.963
- type: mrr_at_1000
value: 35.031
- type: mrr_at_3
value: 32.016
- type: mrr_at_5
value: 33.221000000000004
- type: ndcg_at_1
value: 27.272999999999996
- type: ndcg_at_10
value: 33.993
- type: ndcg_at_100
value: 40.333999999999996
- type: ndcg_at_1000
value: 43.361
- type: ndcg_at_3
value: 30.918
- type: ndcg_at_5
value: 32.552
- type: precision_at_1
value: 27.272999999999996
- type: precision_at_10
value: 6.285
- type: precision_at_100
value: 1.389
- type: precision_at_1000
value: 0.232
- type: precision_at_3
value: 14.427000000000001
- type: precision_at_5
value: 10.356
- type: recall_at_1
value: 22.408
- type: recall_at_10
value: 41.318
- type: recall_at_100
value: 70.539
- type: recall_at_1000
value: 90.197
- type: recall_at_3
value: 32.513
- type: recall_at_5
value: 37.0
- type: map_at_1
value: 17.258000000000003
- type: map_at_10
value: 24.294
- type: map_at_100
value: 25.305
- type: map_at_1000
value: 25.419999999999998
- type: map_at_3
value: 22.326999999999998
- type: map_at_5
value: 23.31
- type: mrr_at_1
value: 18.484
- type: mrr_at_10
value: 25.863999999999997
- type: mrr_at_100
value: 26.766000000000002
- type: mrr_at_1000
value: 26.855
- type: mrr_at_3
value: 23.968
- type: mrr_at_5
value: 24.911
- type: ndcg_at_1
value: 18.484
- type: ndcg_at_10
value: 28.433000000000003
- type: ndcg_at_100
value: 33.405
- type: ndcg_at_1000
value: 36.375
- type: ndcg_at_3
value: 24.455
- type: ndcg_at_5
value: 26.031
- type: precision_at_1
value: 18.484
- type: precision_at_10
value: 4.603
- type: precision_at_100
value: 0.773
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 10.659
- type: precision_at_5
value: 7.505000000000001
- type: recall_at_1
value: 17.258000000000003
- type: recall_at_10
value: 39.589999999999996
- type: recall_at_100
value: 62.592000000000006
- type: recall_at_1000
value: 84.917
- type: recall_at_3
value: 28.706
- type: recall_at_5
value: 32.224000000000004
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.578999999999999
- type: map_at_10
value: 17.642
- type: map_at_100
value: 19.451
- type: map_at_1000
value: 19.647000000000002
- type: map_at_3
value: 14.618
- type: map_at_5
value: 16.145
- type: mrr_at_1
value: 23.322000000000003
- type: mrr_at_10
value: 34.204
- type: mrr_at_100
value: 35.185
- type: mrr_at_1000
value: 35.235
- type: mrr_at_3
value: 30.847
- type: mrr_at_5
value: 32.824
- type: ndcg_at_1
value: 23.322000000000003
- type: ndcg_at_10
value: 25.352999999999998
- type: ndcg_at_100
value: 32.574
- type: ndcg_at_1000
value: 36.073
- type: ndcg_at_3
value: 20.318
- type: ndcg_at_5
value: 22.111
- type: precision_at_1
value: 23.322000000000003
- type: precision_at_10
value: 8.02
- type: precision_at_100
value: 1.5730000000000002
- type: precision_at_1000
value: 0.22200000000000003
- type: precision_at_3
value: 15.049000000000001
- type: precision_at_5
value: 11.87
- type: recall_at_1
value: 10.578999999999999
- type: recall_at_10
value: 30.964999999999996
- type: recall_at_100
value: 55.986000000000004
- type: recall_at_1000
value: 75.565
- type: recall_at_3
value: 18.686
- type: recall_at_5
value: 23.629
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 7.327
- type: map_at_10
value: 14.904
- type: map_at_100
value: 20.29
- type: map_at_1000
value: 21.42
- type: map_at_3
value: 10.911
- type: map_at_5
value: 12.791
- type: mrr_at_1
value: 57.25
- type: mrr_at_10
value: 66.62700000000001
- type: mrr_at_100
value: 67.035
- type: mrr_at_1000
value: 67.052
- type: mrr_at_3
value: 64.833
- type: mrr_at_5
value: 65.908
- type: ndcg_at_1
value: 43.75
- type: ndcg_at_10
value: 32.246
- type: ndcg_at_100
value: 35.774
- type: ndcg_at_1000
value: 42.872
- type: ndcg_at_3
value: 36.64
- type: ndcg_at_5
value: 34.487
- type: precision_at_1
value: 57.25
- type: precision_at_10
value: 25.924999999999997
- type: precision_at_100
value: 7.670000000000001
- type: precision_at_1000
value: 1.599
- type: precision_at_3
value: 41.167
- type: precision_at_5
value: 34.65
- type: recall_at_1
value: 7.327
- type: recall_at_10
value: 19.625
- type: recall_at_100
value: 41.601
- type: recall_at_1000
value: 65.117
- type: recall_at_3
value: 12.308
- type: recall_at_5
value: 15.437999999999999
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 44.53
- type: f1
value: 39.39884255816736
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 58.913000000000004
- type: map_at_10
value: 69.592
- type: map_at_100
value: 69.95599999999999
- type: map_at_1000
value: 69.973
- type: map_at_3
value: 67.716
- type: map_at_5
value: 68.899
- type: mrr_at_1
value: 63.561
- type: mrr_at_10
value: 74.2
- type: mrr_at_100
value: 74.468
- type: mrr_at_1000
value: 74.47500000000001
- type: mrr_at_3
value: 72.442
- type: mrr_at_5
value: 73.58
- type: ndcg_at_1
value: 63.561
- type: ndcg_at_10
value: 74.988
- type: ndcg_at_100
value: 76.52799999999999
- type: ndcg_at_1000
value: 76.88000000000001
- type: ndcg_at_3
value: 71.455
- type: ndcg_at_5
value: 73.42699999999999
- type: precision_at_1
value: 63.561
- type: precision_at_10
value: 9.547
- type: precision_at_100
value: 1.044
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 28.143
- type: precision_at_5
value: 18.008
- type: recall_at_1
value: 58.913000000000004
- type: recall_at_10
value: 87.18
- type: recall_at_100
value: 93.852
- type: recall_at_1000
value: 96.256
- type: recall_at_3
value: 77.55199999999999
- type: recall_at_5
value: 82.42399999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 11.761000000000001
- type: map_at_10
value: 19.564999999999998
- type: map_at_100
value: 21.099
- type: map_at_1000
value: 21.288999999999998
- type: map_at_3
value: 16.683999999999997
- type: map_at_5
value: 18.307000000000002
- type: mrr_at_1
value: 23.302
- type: mrr_at_10
value: 30.979
- type: mrr_at_100
value: 32.121
- type: mrr_at_1000
value: 32.186
- type: mrr_at_3
value: 28.549000000000003
- type: mrr_at_5
value: 30.038999999999998
- type: ndcg_at_1
value: 23.302
- type: ndcg_at_10
value: 25.592
- type: ndcg_at_100
value: 32.416
- type: ndcg_at_1000
value: 36.277
- type: ndcg_at_3
value: 22.151
- type: ndcg_at_5
value: 23.483999999999998
- type: precision_at_1
value: 23.302
- type: precision_at_10
value: 7.377000000000001
- type: precision_at_100
value: 1.415
- type: precision_at_1000
value: 0.212
- type: precision_at_3
value: 14.712
- type: precision_at_5
value: 11.358
- type: recall_at_1
value: 11.761000000000001
- type: recall_at_10
value: 31.696
- type: recall_at_100
value: 58.01500000000001
- type: recall_at_1000
value: 81.572
- type: recall_at_3
value: 20.742
- type: recall_at_5
value: 25.707
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.275
- type: map_at_10
value: 44.712
- type: map_at_100
value: 45.621
- type: map_at_1000
value: 45.698
- type: map_at_3
value: 42.016999999999996
- type: map_at_5
value: 43.659
- type: mrr_at_1
value: 64.551
- type: mrr_at_10
value: 71.58099999999999
- type: mrr_at_100
value: 71.952
- type: mrr_at_1000
value: 71.96900000000001
- type: mrr_at_3
value: 70.236
- type: mrr_at_5
value: 71.051
- type: ndcg_at_1
value: 64.551
- type: ndcg_at_10
value: 53.913999999999994
- type: ndcg_at_100
value: 57.421
- type: ndcg_at_1000
value: 59.06
- type: ndcg_at_3
value: 49.716
- type: ndcg_at_5
value: 51.971999999999994
- type: precision_at_1
value: 64.551
- type: precision_at_10
value: 11.110000000000001
- type: precision_at_100
value: 1.388
- type: precision_at_1000
value: 0.161
- type: precision_at_3
value: 30.822
- type: precision_at_5
value: 20.273
- type: recall_at_1
value: 32.275
- type: recall_at_10
value: 55.55
- type: recall_at_100
value: 69.38600000000001
- type: recall_at_1000
value: 80.35799999999999
- type: recall_at_3
value: 46.232
- type: recall_at_5
value: 50.682
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 76.4604
- type: ap
value: 70.40498168422701
- type: f1
value: 76.38572688476046
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 15.065999999999999
- type: map_at_10
value: 25.058000000000003
- type: map_at_100
value: 26.268
- type: map_at_1000
value: 26.344
- type: map_at_3
value: 21.626
- type: map_at_5
value: 23.513
- type: mrr_at_1
value: 15.501000000000001
- type: mrr_at_10
value: 25.548
- type: mrr_at_100
value: 26.723000000000003
- type: mrr_at_1000
value: 26.793
- type: mrr_at_3
value: 22.142
- type: mrr_at_5
value: 24.024
- type: ndcg_at_1
value: 15.501000000000001
- type: ndcg_at_10
value: 31.008000000000003
- type: ndcg_at_100
value: 37.08
- type: ndcg_at_1000
value: 39.102
- type: ndcg_at_3
value: 23.921999999999997
- type: ndcg_at_5
value: 27.307
- type: precision_at_1
value: 15.501000000000001
- type: precision_at_10
value: 5.155
- type: precision_at_100
value: 0.822
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 10.363
- type: precision_at_5
value: 7.917000000000001
- type: recall_at_1
value: 15.065999999999999
- type: recall_at_10
value: 49.507
- type: recall_at_100
value: 78.118
- type: recall_at_1000
value: 93.881
- type: recall_at_3
value: 30.075000000000003
- type: recall_at_5
value: 38.222
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 90.6703146374829
- type: f1
value: 90.1258004293966
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 68.29229366165072
- type: f1
value: 50.016194478997875
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 68.57767316745124
- type: f1
value: 67.16194062146954
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.92064559515804
- type: f1
value: 73.6680729569968
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.56335607367883
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.131807833734268
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.07390328719844
- type: mrr
value: 32.117370992867905
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.274
- type: map_at_10
value: 11.489
- type: map_at_100
value: 14.518
- type: map_at_1000
value: 15.914
- type: map_at_3
value: 8.399
- type: map_at_5
value: 9.889000000000001
- type: mrr_at_1
value: 42.724000000000004
- type: mrr_at_10
value: 51.486
- type: mrr_at_100
value: 51.941
- type: mrr_at_1000
value: 51.99
- type: mrr_at_3
value: 49.278
- type: mrr_at_5
value: 50.485
- type: ndcg_at_1
value: 39.938
- type: ndcg_at_10
value: 31.862000000000002
- type: ndcg_at_100
value: 29.235
- type: ndcg_at_1000
value: 37.802
- type: ndcg_at_3
value: 35.754999999999995
- type: ndcg_at_5
value: 34.447
- type: precision_at_1
value: 42.105
- type: precision_at_10
value: 23.901
- type: precision_at_100
value: 7.715
- type: precision_at_1000
value: 2.045
- type: precision_at_3
value: 33.437
- type: precision_at_5
value: 29.782999999999998
- type: recall_at_1
value: 5.274
- type: recall_at_10
value: 15.351
- type: recall_at_100
value: 29.791
- type: recall_at_1000
value: 60.722
- type: recall_at_3
value: 9.411
- type: recall_at_5
value: 12.171999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.099
- type: map_at_10
value: 27.913
- type: map_at_100
value: 29.281000000000002
- type: map_at_1000
value: 29.343999999999998
- type: map_at_3
value: 23.791
- type: map_at_5
value: 26.049
- type: mrr_at_1
value: 18.337
- type: mrr_at_10
value: 29.953999999999997
- type: mrr_at_100
value: 31.080999999999996
- type: mrr_at_1000
value: 31.130000000000003
- type: mrr_at_3
value: 26.168000000000003
- type: mrr_at_5
value: 28.277
- type: ndcg_at_1
value: 18.308
- type: ndcg_at_10
value: 34.938
- type: ndcg_at_100
value: 41.125
- type: ndcg_at_1000
value: 42.708
- type: ndcg_at_3
value: 26.805
- type: ndcg_at_5
value: 30.686999999999998
- type: precision_at_1
value: 18.308
- type: precision_at_10
value: 6.476999999999999
- type: precision_at_100
value: 0.9939999999999999
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 12.784999999999998
- type: precision_at_5
value: 9.878
- type: recall_at_1
value: 16.099
- type: recall_at_10
value: 54.63
- type: recall_at_100
value: 82.24900000000001
- type: recall_at_1000
value: 94.242
- type: recall_at_3
value: 33.174
- type: recall_at_5
value: 42.164
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 67.947
- type: map_at_10
value: 81.499
- type: map_at_100
value: 82.17
- type: map_at_1000
value: 82.194
- type: map_at_3
value: 78.567
- type: map_at_5
value: 80.34400000000001
- type: mrr_at_1
value: 78.18
- type: mrr_at_10
value: 85.05
- type: mrr_at_100
value: 85.179
- type: mrr_at_1000
value: 85.181
- type: mrr_at_3
value: 83.91
- type: mrr_at_5
value: 84.638
- type: ndcg_at_1
value: 78.2
- type: ndcg_at_10
value: 85.715
- type: ndcg_at_100
value: 87.2
- type: ndcg_at_1000
value: 87.39
- type: ndcg_at_3
value: 82.572
- type: ndcg_at_5
value: 84.176
- type: precision_at_1
value: 78.2
- type: precision_at_10
value: 12.973
- type: precision_at_100
value: 1.5010000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 35.949999999999996
- type: precision_at_5
value: 23.62
- type: recall_at_1
value: 67.947
- type: recall_at_10
value: 93.804
- type: recall_at_100
value: 98.971
- type: recall_at_1000
value: 99.91600000000001
- type: recall_at_3
value: 84.75399999999999
- type: recall_at_5
value: 89.32
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 45.457201684255104
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 55.162226937477875
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.173
- type: map_at_10
value: 10.463000000000001
- type: map_at_100
value: 12.278
- type: map_at_1000
value: 12.572
- type: map_at_3
value: 7.528
- type: map_at_5
value: 8.863
- type: mrr_at_1
value: 20.599999999999998
- type: mrr_at_10
value: 30.422
- type: mrr_at_100
value: 31.6
- type: mrr_at_1000
value: 31.663000000000004
- type: mrr_at_3
value: 27.400000000000002
- type: mrr_at_5
value: 29.065
- type: ndcg_at_1
value: 20.599999999999998
- type: ndcg_at_10
value: 17.687
- type: ndcg_at_100
value: 25.172
- type: ndcg_at_1000
value: 30.617
- type: ndcg_at_3
value: 16.81
- type: ndcg_at_5
value: 14.499
- type: precision_at_1
value: 20.599999999999998
- type: precision_at_10
value: 9.17
- type: precision_at_100
value: 2.004
- type: precision_at_1000
value: 0.332
- type: precision_at_3
value: 15.6
- type: precision_at_5
value: 12.58
- type: recall_at_1
value: 4.173
- type: recall_at_10
value: 18.575
- type: recall_at_100
value: 40.692
- type: recall_at_1000
value: 67.467
- type: recall_at_3
value: 9.488000000000001
- type: recall_at_5
value: 12.738
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 81.12603499315416
- type: cos_sim_spearman
value: 73.62060290948378
- type: euclidean_pearson
value: 78.14083565781135
- type: euclidean_spearman
value: 73.16840437541543
- type: manhattan_pearson
value: 77.92017261109734
- type: manhattan_spearman
value: 72.8805059949965
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 79.75955377133172
- type: cos_sim_spearman
value: 71.8872633964069
- type: euclidean_pearson
value: 76.31922068538256
- type: euclidean_spearman
value: 70.86449661855376
- type: manhattan_pearson
value: 76.47852229730407
- type: manhattan_spearman
value: 70.99367421984789
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 78.80762722908158
- type: cos_sim_spearman
value: 79.84588978756372
- type: euclidean_pearson
value: 79.8216849781164
- type: euclidean_spearman
value: 80.22647061695481
- type: manhattan_pearson
value: 79.56604194112572
- type: manhattan_spearman
value: 79.96495189862462
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 80.1012718092742
- type: cos_sim_spearman
value: 76.86011381793661
- type: euclidean_pearson
value: 79.94426039862019
- type: euclidean_spearman
value: 77.36751135465131
- type: manhattan_pearson
value: 79.87959373304288
- type: manhattan_spearman
value: 77.37717129004746
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 83.90618420346104
- type: cos_sim_spearman
value: 84.77290791243722
- type: euclidean_pearson
value: 84.64732258073293
- type: euclidean_spearman
value: 85.21053649543357
- type: manhattan_pearson
value: 84.61616883522647
- type: manhattan_spearman
value: 85.19803126766931
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 80.52192114059063
- type: cos_sim_spearman
value: 81.9103244827937
- type: euclidean_pearson
value: 80.99375176138985
- type: euclidean_spearman
value: 81.540250641079
- type: manhattan_pearson
value: 80.84979573396426
- type: manhattan_spearman
value: 81.3742591621492
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.82166001234197
- type: cos_sim_spearman
value: 86.81857495659123
- type: euclidean_pearson
value: 85.72798403202849
- type: euclidean_spearman
value: 85.70482438950965
- type: manhattan_pearson
value: 85.51579093130357
- type: manhattan_spearman
value: 85.41233705379751
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 64.48071151079803
- type: cos_sim_spearman
value: 65.37838108084044
- type: euclidean_pearson
value: 64.67378947096257
- type: euclidean_spearman
value: 65.39187147219869
- type: manhattan_pearson
value: 65.35487466133208
- type: manhattan_spearman
value: 65.51328499442272
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.64702367823314
- type: cos_sim_spearman
value: 82.49732953181818
- type: euclidean_pearson
value: 83.05996062475664
- type: euclidean_spearman
value: 82.28159546751176
- type: manhattan_pearson
value: 82.98305503664952
- type: manhattan_spearman
value: 82.18405771943928
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 78.5744649318696
- type: mrr
value: 93.35386291268645
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 52.093999999999994
- type: map_at_10
value: 61.646
- type: map_at_100
value: 62.197
- type: map_at_1000
value: 62.22800000000001
- type: map_at_3
value: 58.411
- type: map_at_5
value: 60.585
- type: mrr_at_1
value: 55.00000000000001
- type: mrr_at_10
value: 62.690999999999995
- type: mrr_at_100
value: 63.139
- type: mrr_at_1000
value: 63.166999999999994
- type: mrr_at_3
value: 60.111000000000004
- type: mrr_at_5
value: 61.778
- type: ndcg_at_1
value: 55.00000000000001
- type: ndcg_at_10
value: 66.271
- type: ndcg_at_100
value: 68.879
- type: ndcg_at_1000
value: 69.722
- type: ndcg_at_3
value: 60.672000000000004
- type: ndcg_at_5
value: 63.929
- type: precision_at_1
value: 55.00000000000001
- type: precision_at_10
value: 9.0
- type: precision_at_100
value: 1.043
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 23.555999999999997
- type: precision_at_5
value: 16.2
- type: recall_at_1
value: 52.093999999999994
- type: recall_at_10
value: 79.567
- type: recall_at_100
value: 91.60000000000001
- type: recall_at_1000
value: 98.333
- type: recall_at_3
value: 64.633
- type: recall_at_5
value: 72.68299999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83267326732673
- type: cos_sim_ap
value: 95.77995366495178
- type: cos_sim_f1
value: 91.51180311401306
- type: cos_sim_precision
value: 91.92734611503532
- type: cos_sim_recall
value: 91.10000000000001
- type: dot_accuracy
value: 99.63366336633663
- type: dot_ap
value: 88.53996286967461
- type: dot_f1
value: 81.06537530266343
- type: dot_precision
value: 78.59154929577464
- type: dot_recall
value: 83.7
- type: euclidean_accuracy
value: 99.82376237623762
- type: euclidean_ap
value: 95.53192209281187
- type: euclidean_f1
value: 91.19683481701286
- type: euclidean_precision
value: 90.21526418786692
- type: euclidean_recall
value: 92.2
- type: manhattan_accuracy
value: 99.82376237623762
- type: manhattan_ap
value: 95.55642082191741
- type: manhattan_f1
value: 91.16186693147964
- type: manhattan_precision
value: 90.53254437869822
- type: manhattan_recall
value: 91.8
- type: max_accuracy
value: 99.83267326732673
- type: max_ap
value: 95.77995366495178
- type: max_f1
value: 91.51180311401306
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 54.508462134213474
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.06549765184959
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.43129549466616
- type: mrr
value: 50.20613169510227
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.069516173193044
- type: cos_sim_spearman
value: 29.872498354017353
- type: dot_pearson
value: 28.80761257516063
- type: dot_spearman
value: 28.397422678527708
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.169
- type: map_at_10
value: 1.208
- type: map_at_100
value: 5.925
- type: map_at_1000
value: 14.427000000000001
- type: map_at_3
value: 0.457
- type: map_at_5
value: 0.716
- type: mrr_at_1
value: 64.0
- type: mrr_at_10
value: 74.075
- type: mrr_at_100
value: 74.303
- type: mrr_at_1000
value: 74.303
- type: mrr_at_3
value: 71.0
- type: mrr_at_5
value: 72.89999999999999
- type: ndcg_at_1
value: 57.99999999999999
- type: ndcg_at_10
value: 50.376
- type: ndcg_at_100
value: 38.582
- type: ndcg_at_1000
value: 35.663
- type: ndcg_at_3
value: 55.592
- type: ndcg_at_5
value: 53.647999999999996
- type: precision_at_1
value: 64.0
- type: precision_at_10
value: 53.2
- type: precision_at_100
value: 39.6
- type: precision_at_1000
value: 16.218
- type: precision_at_3
value: 59.333000000000006
- type: precision_at_5
value: 57.599999999999994
- type: recall_at_1
value: 0.169
- type: recall_at_10
value: 1.423
- type: recall_at_100
value: 9.049999999999999
- type: recall_at_1000
value: 34.056999999999995
- type: recall_at_3
value: 0.48700000000000004
- type: recall_at_5
value: 0.792
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.319
- type: map_at_10
value: 7.112
- type: map_at_100
value: 12.588
- type: map_at_1000
value: 14.056
- type: map_at_3
value: 2.8049999999999997
- type: map_at_5
value: 4.68
- type: mrr_at_1
value: 18.367
- type: mrr_at_10
value: 33.94
- type: mrr_at_100
value: 35.193000000000005
- type: mrr_at_1000
value: 35.193000000000005
- type: mrr_at_3
value: 29.932
- type: mrr_at_5
value: 32.279
- type: ndcg_at_1
value: 15.306000000000001
- type: ndcg_at_10
value: 18.096
- type: ndcg_at_100
value: 30.512
- type: ndcg_at_1000
value: 42.148
- type: ndcg_at_3
value: 17.034
- type: ndcg_at_5
value: 18.509
- type: precision_at_1
value: 18.367
- type: precision_at_10
value: 18.776
- type: precision_at_100
value: 7.02
- type: precision_at_1000
value: 1.467
- type: precision_at_3
value: 19.048000000000002
- type: precision_at_5
value: 22.041
- type: recall_at_1
value: 1.319
- type: recall_at_10
value: 13.748
- type: recall_at_100
value: 43.972
- type: recall_at_1000
value: 79.557
- type: recall_at_3
value: 4.042
- type: recall_at_5
value: 7.742
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.2282
- type: ap
value: 13.995763859570426
- type: f1
value: 54.08126256731344
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 57.64006791171477
- type: f1
value: 57.95841320748957
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 40.19267841788564
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 83.96614412588663
- type: cos_sim_ap
value: 67.75985678572738
- type: cos_sim_f1
value: 64.04661542276222
- type: cos_sim_precision
value: 60.406922357343305
- type: cos_sim_recall
value: 68.15303430079156
- type: dot_accuracy
value: 79.5732252488526
- type: dot_ap
value: 51.30562107572645
- type: dot_f1
value: 53.120759837177744
- type: dot_precision
value: 46.478037198258804
- type: dot_recall
value: 61.97889182058047
- type: euclidean_accuracy
value: 84.00786791440663
- type: euclidean_ap
value: 67.58930214486998
- type: euclidean_f1
value: 64.424821579775
- type: euclidean_precision
value: 59.4817958454322
- type: euclidean_recall
value: 70.26385224274406
- type: manhattan_accuracy
value: 83.87673600762949
- type: manhattan_ap
value: 67.4250981523309
- type: manhattan_f1
value: 64.10286658015808
- type: manhattan_precision
value: 57.96885001066781
- type: manhattan_recall
value: 71.68865435356201
- type: max_accuracy
value: 84.00786791440663
- type: max_ap
value: 67.75985678572738
- type: max_f1
value: 64.424821579775
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.41347459929368
- type: cos_sim_ap
value: 84.89261930113058
- type: cos_sim_f1
value: 77.13677607258877
- type: cos_sim_precision
value: 74.88581164358733
- type: cos_sim_recall
value: 79.52725592854944
- type: dot_accuracy
value: 86.32359219156285
- type: dot_ap
value: 79.29794992131094
- type: dot_f1
value: 72.84356337679777
- type: dot_precision
value: 67.31761478675462
- type: dot_recall
value: 79.35786880197105
- type: euclidean_accuracy
value: 88.33585593976791
- type: euclidean_ap
value: 84.73257641312746
- type: euclidean_f1
value: 76.83529582788195
- type: euclidean_precision
value: 72.76294052863436
- type: euclidean_recall
value: 81.3905143209116
- type: manhattan_accuracy
value: 88.3086894089339
- type: manhattan_ap
value: 84.66304891729399
- type: manhattan_f1
value: 76.8181650632165
- type: manhattan_precision
value: 73.6864436744219
- type: manhattan_recall
value: 80.22790267939637
- type: max_accuracy
value: 88.41347459929368
- type: max_ap
value: 84.89261930113058
- type: max_f1
value: 77.13677607258877
---
# bge-micro-v2
> Forked from https://huggingface.co/TaylorAI/bge-micro-v2 purely to ensure it remains available. See also [license](LICENSE).
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.
Distilled in a 2-step training process (bge-micro was step 1) from `BAAI/bge-small-en-v1.5`.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
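Since the card above notes that the embeddings are meant for tasks like clustering and semantic search, the following minimal sketch shows one way to rank candidate sentences against a query with cosine similarity. It is an illustrative assumption, not part of the original card: the corpus and query strings are made up, and `'{MODEL_NAME}'` is the same placeholder used in the snippet above.
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('{MODEL_NAME}')

# Illustrative corpus and query (assumed example data, not from the original card)
corpus = ["A man is eating food.", "A plane is taking off.", "The chef prepared a meal."]
query = "Someone is cooking dinner."

# Encode corpus and query into 384-dimensional embeddings
corpus_emb = model.encode(corpus, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)

# Rank corpus sentences by cosine similarity to the query
scores = util.cos_sim(query_emb, corpus_emb)[0]
for sentence, score in sorted(zip(corpus, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}\t{sentence}")
```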
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  #First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
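If you want to reproduce scores like the ones listed in the metadata above, a typical invocation of the MTEB harness looks roughly like the sketch below. This is an assumed example rather than the card's official evaluation script: the chosen task is arbitrary, `'{MODEL_NAME}'` is the placeholder from the snippets above, and `mteb` must be installed separately.
```python
# Minimal sketch (assumption): run a single MTEB task against this model.
# Requires: pip install mteb sentence-transformers
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('{MODEL_NAME}')
evaluation = MTEB(tasks=["STSBenchmark"])  # example task; the metadata above covers many more
evaluation.run(model, output_folder="mteb_results")
```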
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
Cohere/Cohere-embed-multilingual-v3.0 | Cohere | null | [
"transformers",
"mteb",
"model-index",
"endpoints_compatible",
"region:us"
] | 2023-11-02T09:52:29 | 2023-11-07T12:59:44 | 1,609 | 95 | ---
tags:
- mteb
model-index:
- name: embed-multilingual-v3.0
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.85074626865672
- type: ap
value: 41.53151744002314
- type: f1
value: 71.94656880817726
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 95.600375
- type: ap
value: 93.57882128753579
- type: f1
value: 95.59945484944305
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 49.794
- type: f1
value: 48.740439663130985
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 55.105000000000004
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.15653426568874
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 40.78876256237919
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.12873500780318
- type: mrr
value: 75.87037769863255
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 86.01183720167818
- type: cos_sim_spearman
value: 85.00916590717613
- type: euclidean_pearson
value: 84.072733561361
- type: euclidean_spearman
value: 85.00916590717613
- type: manhattan_pearson
value: 83.89233507343208
- type: manhattan_spearman
value: 84.87482549674115
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 86.09415584415584
- type: f1
value: 86.05173549773973
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 40.49773000165541
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.909633073998876
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 49.481
- type: ndcg_at_10
value: 47.449999999999996
- type: ndcg_at_10
value: 59.227
- type: ndcg_at_10
value: 37.729
- type: ndcg_at_10
value: 29.673
- type: ndcg_at_10
value: 44.278
- type: ndcg_at_10
value: 43.218
- type: ndcg_at_10
value: 40.63741666666667
- type: ndcg_at_10
value: 33.341
- type: ndcg_at_10
value: 29.093999999999998
- type: ndcg_at_10
value: 40.801
- type: ndcg_at_10
value: 40.114
- type: ndcg_at_10
value: 33.243
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 29.958000000000002
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 41.004000000000005
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.150000000000006
- type: f1
value: 43.69803436468346
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 88.532
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 44.105
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 70.612
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 93.9672
- type: ap
value: 90.72947025321227
- type: f1
value: 93.96271599852622
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 43.447
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.92476060191517
- type: f1
value: 94.69383758972194
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 78.8873689010488
- type: f1
value: 62.537485052253885
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 74.51244115669132
- type: f1
value: 72.40074466830153
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 79.00470746469401
- type: f1
value: 79.03758200183096
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 36.183215937303736
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 33.443759055792135
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.58713095176127
- type: mrr
value: 33.7326038566206
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 36.417
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 63.415
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 88.924
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 58.10997801688676
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 65.02444843766075
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 19.339000000000002
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 86.61540076033945
- type: cos_sim_spearman
value: 82.1820253476181
- type: euclidean_pearson
value: 83.73901215845989
- type: euclidean_spearman
value: 82.182021064594
- type: manhattan_pearson
value: 83.76685139192031
- type: manhattan_spearman
value: 82.14074705306663
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.62241109228789
- type: cos_sim_spearman
value: 77.62042143066208
- type: euclidean_pearson
value: 82.77237785274072
- type: euclidean_spearman
value: 77.62042142290566
- type: manhattan_pearson
value: 82.70945589621266
- type: manhattan_spearman
value: 77.57245632826351
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.8307075352031
- type: cos_sim_spearman
value: 85.15620774806095
- type: euclidean_pearson
value: 84.21956724564915
- type: euclidean_spearman
value: 85.15620774806095
- type: manhattan_pearson
value: 84.0677597021641
- type: manhattan_spearman
value: 85.02572172855729
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.33749463516592
- type: cos_sim_spearman
value: 80.01967438481185
- type: euclidean_pearson
value: 82.16884494022196
- type: euclidean_spearman
value: 80.01967218194336
- type: manhattan_pearson
value: 81.94431512413773
- type: manhattan_spearman
value: 79.81636247503731
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.2070761097028
- type: cos_sim_spearman
value: 88.92297656560552
- type: euclidean_pearson
value: 87.95961374550303
- type: euclidean_spearman
value: 88.92298798854765
- type: manhattan_pearson
value: 87.85515971478168
- type: manhattan_spearman
value: 88.8100644762342
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.48103354546488
- type: cos_sim_spearman
value: 86.91850928862898
- type: euclidean_pearson
value: 86.06766986527145
- type: euclidean_spearman
value: 86.91850928862898
- type: manhattan_pearson
value: 86.02705585360717
- type: manhattan_spearman
value: 86.86666545434721
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 90.30267248880148
- type: cos_sim_spearman
value: 90.08752166657892
- type: euclidean_pearson
value: 90.4697525265135
- type: euclidean_spearman
value: 90.08752166657892
- type: manhattan_pearson
value: 90.57174978064741
- type: manhattan_spearman
value: 90.212834942229
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 67.10616236380835
- type: cos_sim_spearman
value: 66.81483164137016
- type: euclidean_pearson
value: 68.48505128040803
- type: euclidean_spearman
value: 66.81483164137016
- type: manhattan_pearson
value: 68.46133268524885
- type: manhattan_spearman
value: 66.83684227990202
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.12768629069949
- type: cos_sim_spearman
value: 88.78683817318573
- type: euclidean_pearson
value: 88.47603251297261
- type: euclidean_spearman
value: 88.78683817318573
- type: manhattan_pearson
value: 88.46483630890225
- type: manhattan_spearman
value: 88.76593424921617
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 84.30886658431281
- type: mrr
value: 95.5964251797585
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 70.04599999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.87524752475248
- type: cos_sim_ap
value: 96.79160651306724
- type: cos_sim_f1
value: 93.57798165137615
- type: cos_sim_precision
value: 95.42619542619542
- type: cos_sim_recall
value: 91.8
- type: dot_accuracy
value: 99.87524752475248
- type: dot_ap
value: 96.79160651306724
- type: dot_f1
value: 93.57798165137615
- type: dot_precision
value: 95.42619542619542
- type: dot_recall
value: 91.8
- type: euclidean_accuracy
value: 99.87524752475248
- type: euclidean_ap
value: 96.79160651306724
- type: euclidean_f1
value: 93.57798165137615
- type: euclidean_precision
value: 95.42619542619542
- type: euclidean_recall
value: 91.8
- type: manhattan_accuracy
value: 99.87326732673267
- type: manhattan_ap
value: 96.7574606340297
- type: manhattan_f1
value: 93.45603271983639
- type: manhattan_precision
value: 95.60669456066945
- type: manhattan_recall
value: 91.4
- type: max_accuracy
value: 99.87524752475248
- type: max_ap
value: 96.79160651306724
- type: max_f1
value: 93.57798165137615
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 68.12288811917144
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.22267280169542
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.39780995606098
- type: mrr
value: 53.26826563958916
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.15118979569649
- type: cos_sim_spearman
value: 30.99428921914572
- type: dot_pearson
value: 31.151189338601924
- type: dot_spearman
value: 30.99428921914572
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 83.372
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: ndcg_at_10
value: 32.698
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.1998
- type: ap
value: 14.646205259325157
- type: f1
value: 54.96172518137252
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.176004527447645
- type: f1
value: 62.48549068096645
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 50.13767789739772
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.38016331882935
- type: cos_sim_ap
value: 75.1635976260804
- type: cos_sim_f1
value: 69.29936305732484
- type: cos_sim_precision
value: 66.99507389162561
- type: cos_sim_recall
value: 71.76781002638522
- type: dot_accuracy
value: 86.38016331882935
- type: dot_ap
value: 75.16359359202374
- type: dot_f1
value: 69.29936305732484
- type: dot_precision
value: 66.99507389162561
- type: dot_recall
value: 71.76781002638522
- type: euclidean_accuracy
value: 86.38016331882935
- type: euclidean_ap
value: 75.16360246558416
- type: euclidean_f1
value: 69.29936305732484
- type: euclidean_precision
value: 66.99507389162561
- type: euclidean_recall
value: 71.76781002638522
- type: manhattan_accuracy
value: 86.27883411813792
- type: manhattan_ap
value: 75.02872038741897
- type: manhattan_f1
value: 69.29256284011403
- type: manhattan_precision
value: 68.07535641547861
- type: manhattan_recall
value: 70.55408970976254
- type: max_accuracy
value: 86.38016331882935
- type: max_ap
value: 75.16360246558416
- type: max_f1
value: 69.29936305732484
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.39729110878255
- type: cos_sim_ap
value: 86.48560260020555
- type: cos_sim_f1
value: 79.35060602690982
- type: cos_sim_precision
value: 76.50632549496105
- type: cos_sim_recall
value: 82.41453649522637
- type: dot_accuracy
value: 89.39729110878255
- type: dot_ap
value: 86.48559829915334
- type: dot_f1
value: 79.35060602690982
- type: dot_precision
value: 76.50632549496105
- type: dot_recall
value: 82.41453649522637
- type: euclidean_accuracy
value: 89.39729110878255
- type: euclidean_ap
value: 86.48559993122497
- type: euclidean_f1
value: 79.35060602690982
- type: euclidean_precision
value: 76.50632549496105
- type: euclidean_recall
value: 82.41453649522637
- type: manhattan_accuracy
value: 89.36042224550782
- type: manhattan_ap
value: 86.47238558562499
- type: manhattan_f1
value: 79.24500641378047
- type: manhattan_precision
value: 75.61726236273344
- type: manhattan_recall
value: 83.23837388358484
- type: max_accuracy
value: 89.39729110878255
- type: max_ap
value: 86.48560260020555
- type: max_f1
value: 79.35060602690982
---
# Cohere embed-multilingual-v3.0
This repository contains the tokenizer for the Cohere `embed-multilingual-v3.0` model. See our blogpost [Cohere Embed V3](https://txt.cohere.com/introducing-embed-v3/) for more details on this model.
You can use the embedding model either via the Cohere API, AWS SageMaker or in your private deployments.
## Usage Cohere API
The following code snippet shows the usage of the Cohere API. Install the cohere SDK via:
```
pip install -U cohere
```
Get your free API key at: www.cohere.com
```python
# This snippet shows an example of how to use the Cohere Embed V3 models for semantic search.
# Make sure to have the Cohere SDK installed, at least v4.30: pip install -U cohere
# Get your API key from: www.cohere.com
import cohere
import numpy as np
cohere_key = "{YOUR_COHERE_API_KEY}" #Get your API key from www.cohere.com
co = cohere.Client(cohere_key)
docs = ["The capital of France is Paris",
"PyTorch is a machine learning framework based on the Torch library.",
"The average cat lifespan is between 13-17 years"]
#Encode your documents with input type 'search_document'
doc_emb = co.embed(docs, input_type="search_document", model="embed-multilingual-v3.0").embeddings
doc_emb = np.asarray(doc_emb)
#Encode your query with input type 'search_query'
query = "What is Pytorch"
query_emb = co.embed([query], input_type="search_query", model="embed-multilingual-v3.0").embeddings
query_emb = np.asarray(query_emb)
query_emb.shape
#Compute the dot product between query embedding and document embedding
scores = np.dot(query_emb, doc_emb.T)[0]
#Find the highest scores
max_idx = np.argsort(-scores)
print(f"Query: {query}")
for idx in max_idx:
    print(f"Score: {scores[idx]:.2f}")
    print(docs[idx])
    print("--------")
```
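The snippet above covers the `search_document` / `search_query` input types for retrieval. The same `co.embed` call pattern can be reused for non-search workloads; the short sketch below is an assumption added for illustration (the example texts are made up, and the `clustering` input type is used as a hint for grouping texts rather than for search).
```python
# Minimal sketch, assuming the same Cohere client setup as above.
import cohere
import numpy as np

co = cohere.Client("{YOUR_COHERE_API_KEY}")

texts = ["I love this product", "Worst purchase ever", "Absolutely fantastic"]

# Embed a small batch of texts with the 'clustering' input type hint
emb = co.embed(texts, input_type="clustering", model="embed-multilingual-v3.0").embeddings
emb = np.asarray(emb)
print(emb.shape)  # e.g. (3, embedding_dim) for this model
```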
## Usage AWS SageMaker
The embedding model can be privately deployed in your AWS Cloud using our [AWS SageMaker marketplace offering](https://aws.amazon.com/marketplace/pp/prodview-z6huxszcqc25i). It runs privately in your VPC, with latencies as low as 5ms for query encoding.
## Usage AWS Bedrock
The model will soon also be available via AWS Bedrock. Stay tuned.
## Private Deployment
Do you want to run the model on your own hardware? [Contact Sales](https://cohere.com/contact-sales) to learn more.
## Supported Languages
This model was trained on nearly 1B English training pairs and nearly 0.5B non-English training pairs from 100+ languages.
Evaluation results can be found in the [Embed V3.0 Benchmark Results spreadsheet](https://docs.google.com/spreadsheets/d/1w7gnHWMDBdEUrmHgSfDnGHJgVQE5aOiXCCwO3uNH_mI/edit?usp=sharing). | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
ggml-org/gte-small-Q8_0-GGUF | ggml-org | sentence-similarity | [
"sentence-transformers",
"gguf",
"mteb",
"sentence-similarity",
"Sentence Transformers",
"llama-cpp",
"gguf-my-repo",
"en",
"base_model:thenlper/gte-small",
"base_model:quantized:thenlper/gte-small",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"feature-extraction"
] | 2025-02-06T08:36:20 | 2025-02-06T09:11:30 | 1,586 | 0 | ---
base_model: thenlper/gte-small
language:
- en
license: mit
tags:
- mteb
- sentence-similarity
- sentence-transformers
- Sentence Transformers
- llama-cpp
- gguf-my-repo
model-index:
- name: gte-small
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 73.22388059701493
- type: ap
value: 36.09895941426988
- type: f1
value: 67.3205651539195
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.81894999999999
- type: ap
value: 88.5240138417305
- type: f1
value: 91.80367382706962
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.032
- type: f1
value: 47.4490665674719
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.725
- type: map_at_10
value: 46.604
- type: map_at_100
value: 47.535
- type: map_at_1000
value: 47.538000000000004
- type: map_at_3
value: 41.833
- type: map_at_5
value: 44.61
- type: mrr_at_1
value: 31.223
- type: mrr_at_10
value: 46.794000000000004
- type: mrr_at_100
value: 47.725
- type: mrr_at_1000
value: 47.727000000000004
- type: mrr_at_3
value: 42.07
- type: mrr_at_5
value: 44.812000000000005
- type: ndcg_at_1
value: 30.725
- type: ndcg_at_10
value: 55.440999999999995
- type: ndcg_at_100
value: 59.134
- type: ndcg_at_1000
value: 59.199
- type: ndcg_at_3
value: 45.599000000000004
- type: ndcg_at_5
value: 50.637
- type: precision_at_1
value: 30.725
- type: precision_at_10
value: 8.364
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.848000000000003
- type: precision_at_5
value: 13.77
- type: recall_at_1
value: 30.725
- type: recall_at_10
value: 83.64200000000001
- type: recall_at_100
value: 99.14699999999999
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 56.543
- type: recall_at_5
value: 68.848
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 47.90178078197678
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 40.25728393431922
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 61.720297062897764
- type: mrr
value: 75.24139295607439
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 89.43527309184616
- type: cos_sim_spearman
value: 88.17128615100206
- type: euclidean_pearson
value: 87.89922623089282
- type: euclidean_spearman
value: 87.96104039655451
- type: manhattan_pearson
value: 87.9818290932077
- type: manhattan_spearman
value: 88.00923426576885
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.0844155844156
- type: f1
value: 84.01485017302213
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.36574769259432
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 35.4857033165287
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.261
- type: map_at_10
value: 42.419000000000004
- type: map_at_100
value: 43.927
- type: map_at_1000
value: 44.055
- type: map_at_3
value: 38.597
- type: map_at_5
value: 40.701
- type: mrr_at_1
value: 36.91
- type: mrr_at_10
value: 48.02
- type: mrr_at_100
value: 48.658
- type: mrr_at_1000
value: 48.708
- type: mrr_at_3
value: 44.945
- type: mrr_at_5
value: 46.705000000000005
- type: ndcg_at_1
value: 36.91
- type: ndcg_at_10
value: 49.353
- type: ndcg_at_100
value: 54.456
- type: ndcg_at_1000
value: 56.363
- type: ndcg_at_3
value: 43.483
- type: ndcg_at_5
value: 46.150999999999996
- type: precision_at_1
value: 36.91
- type: precision_at_10
value: 9.700000000000001
- type: precision_at_100
value: 1.557
- type: precision_at_1000
value: 0.202
- type: precision_at_3
value: 21.078
- type: precision_at_5
value: 15.421999999999999
- type: recall_at_1
value: 30.261
- type: recall_at_10
value: 63.242
- type: recall_at_100
value: 84.09100000000001
- type: recall_at_1000
value: 96.143
- type: recall_at_3
value: 46.478
- type: recall_at_5
value: 53.708
- type: map_at_1
value: 31.145
- type: map_at_10
value: 40.996
- type: map_at_100
value: 42.266999999999996
- type: map_at_1000
value: 42.397
- type: map_at_3
value: 38.005
- type: map_at_5
value: 39.628
- type: mrr_at_1
value: 38.344
- type: mrr_at_10
value: 46.827000000000005
- type: mrr_at_100
value: 47.446
- type: mrr_at_1000
value: 47.489
- type: mrr_at_3
value: 44.448
- type: mrr_at_5
value: 45.747
- type: ndcg_at_1
value: 38.344
- type: ndcg_at_10
value: 46.733000000000004
- type: ndcg_at_100
value: 51.103
- type: ndcg_at_1000
value: 53.075
- type: ndcg_at_3
value: 42.366
- type: ndcg_at_5
value: 44.242
- type: precision_at_1
value: 38.344
- type: precision_at_10
value: 8.822000000000001
- type: precision_at_100
value: 1.417
- type: precision_at_1000
value: 0.187
- type: precision_at_3
value: 20.403
- type: precision_at_5
value: 14.306
- type: recall_at_1
value: 31.145
- type: recall_at_10
value: 56.909
- type: recall_at_100
value: 75.274
- type: recall_at_1000
value: 87.629
- type: recall_at_3
value: 43.784
- type: recall_at_5
value: 49.338
- type: map_at_1
value: 38.83
- type: map_at_10
value: 51.553000000000004
- type: map_at_100
value: 52.581
- type: map_at_1000
value: 52.638
- type: map_at_3
value: 48.112
- type: map_at_5
value: 50.095
- type: mrr_at_1
value: 44.513999999999996
- type: mrr_at_10
value: 54.998000000000005
- type: mrr_at_100
value: 55.650999999999996
- type: mrr_at_1000
value: 55.679
- type: mrr_at_3
value: 52.602000000000004
- type: mrr_at_5
value: 53.931
- type: ndcg_at_1
value: 44.513999999999996
- type: ndcg_at_10
value: 57.67400000000001
- type: ndcg_at_100
value: 61.663999999999994
- type: ndcg_at_1000
value: 62.743
- type: ndcg_at_3
value: 51.964
- type: ndcg_at_5
value: 54.773
- type: precision_at_1
value: 44.513999999999996
- type: precision_at_10
value: 9.423
- type: precision_at_100
value: 1.2309999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 23.323
- type: precision_at_5
value: 16.163
- type: recall_at_1
value: 38.83
- type: recall_at_10
value: 72.327
- type: recall_at_100
value: 89.519
- type: recall_at_1000
value: 97.041
- type: recall_at_3
value: 57.206
- type: recall_at_5
value: 63.88399999999999
- type: map_at_1
value: 25.484
- type: map_at_10
value: 34.527
- type: map_at_100
value: 35.661
- type: map_at_1000
value: 35.739
- type: map_at_3
value: 32.199
- type: map_at_5
value: 33.632
- type: mrr_at_1
value: 27.458
- type: mrr_at_10
value: 36.543
- type: mrr_at_100
value: 37.482
- type: mrr_at_1000
value: 37.543
- type: mrr_at_3
value: 34.256
- type: mrr_at_5
value: 35.618
- type: ndcg_at_1
value: 27.458
- type: ndcg_at_10
value: 39.396
- type: ndcg_at_100
value: 44.742
- type: ndcg_at_1000
value: 46.708
- type: ndcg_at_3
value: 34.817
- type: ndcg_at_5
value: 37.247
- type: precision_at_1
value: 27.458
- type: precision_at_10
value: 5.976999999999999
- type: precision_at_100
value: 0.907
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 14.878
- type: precision_at_5
value: 10.35
- type: recall_at_1
value: 25.484
- type: recall_at_10
value: 52.317
- type: recall_at_100
value: 76.701
- type: recall_at_1000
value: 91.408
- type: recall_at_3
value: 40.043
- type: recall_at_5
value: 45.879
- type: map_at_1
value: 16.719
- type: map_at_10
value: 25.269000000000002
- type: map_at_100
value: 26.442
- type: map_at_1000
value: 26.557
- type: map_at_3
value: 22.56
- type: map_at_5
value: 24.082
- type: mrr_at_1
value: 20.896
- type: mrr_at_10
value: 29.982999999999997
- type: mrr_at_100
value: 30.895
- type: mrr_at_1000
value: 30.961
- type: mrr_at_3
value: 27.239
- type: mrr_at_5
value: 28.787000000000003
- type: ndcg_at_1
value: 20.896
- type: ndcg_at_10
value: 30.814000000000004
- type: ndcg_at_100
value: 36.418
- type: ndcg_at_1000
value: 39.182
- type: ndcg_at_3
value: 25.807999999999996
- type: ndcg_at_5
value: 28.143
- type: precision_at_1
value: 20.896
- type: precision_at_10
value: 5.821
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 12.562000000000001
- type: precision_at_5
value: 9.254
- type: recall_at_1
value: 16.719
- type: recall_at_10
value: 43.155
- type: recall_at_100
value: 67.831
- type: recall_at_1000
value: 87.617
- type: recall_at_3
value: 29.259
- type: recall_at_5
value: 35.260999999999996
- type: map_at_1
value: 29.398999999999997
- type: map_at_10
value: 39.876
- type: map_at_100
value: 41.205999999999996
- type: map_at_1000
value: 41.321999999999996
- type: map_at_3
value: 36.588
- type: map_at_5
value: 38.538
- type: mrr_at_1
value: 35.9
- type: mrr_at_10
value: 45.528
- type: mrr_at_100
value: 46.343
- type: mrr_at_1000
value: 46.388
- type: mrr_at_3
value: 42.862
- type: mrr_at_5
value: 44.440000000000005
- type: ndcg_at_1
value: 35.9
- type: ndcg_at_10
value: 45.987
- type: ndcg_at_100
value: 51.370000000000005
- type: ndcg_at_1000
value: 53.400000000000006
- type: ndcg_at_3
value: 40.841
- type: ndcg_at_5
value: 43.447
- type: precision_at_1
value: 35.9
- type: precision_at_10
value: 8.393
- type: precision_at_100
value: 1.283
- type: precision_at_1000
value: 0.166
- type: precision_at_3
value: 19.538
- type: precision_at_5
value: 13.975000000000001
- type: recall_at_1
value: 29.398999999999997
- type: recall_at_10
value: 58.361
- type: recall_at_100
value: 81.081
- type: recall_at_1000
value: 94.004
- type: recall_at_3
value: 43.657000000000004
- type: recall_at_5
value: 50.519999999999996
- type: map_at_1
value: 21.589
- type: map_at_10
value: 31.608999999999998
- type: map_at_100
value: 33.128
- type: map_at_1000
value: 33.247
- type: map_at_3
value: 28.671999999999997
- type: map_at_5
value: 30.233999999999998
- type: mrr_at_1
value: 26.712000000000003
- type: mrr_at_10
value: 36.713
- type: mrr_at_100
value: 37.713
- type: mrr_at_1000
value: 37.771
- type: mrr_at_3
value: 34.075
- type: mrr_at_5
value: 35.451
- type: ndcg_at_1
value: 26.712000000000003
- type: ndcg_at_10
value: 37.519999999999996
- type: ndcg_at_100
value: 43.946000000000005
- type: ndcg_at_1000
value: 46.297
- type: ndcg_at_3
value: 32.551
- type: ndcg_at_5
value: 34.660999999999994
- type: precision_at_1
value: 26.712000000000003
- type: precision_at_10
value: 7.066
- type: precision_at_100
value: 1.216
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 15.906
- type: precision_at_5
value: 11.437999999999999
- type: recall_at_1
value: 21.589
- type: recall_at_10
value: 50.090999999999994
- type: recall_at_100
value: 77.43900000000001
- type: recall_at_1000
value: 93.35900000000001
- type: recall_at_3
value: 36.028999999999996
- type: recall_at_5
value: 41.698
- type: map_at_1
value: 25.121666666666663
- type: map_at_10
value: 34.46258333333334
- type: map_at_100
value: 35.710499999999996
- type: map_at_1000
value: 35.82691666666666
- type: map_at_3
value: 31.563249999999996
- type: map_at_5
value: 33.189750000000004
- type: mrr_at_1
value: 29.66441666666667
- type: mrr_at_10
value: 38.5455
- type: mrr_at_100
value: 39.39566666666667
- type: mrr_at_1000
value: 39.45325
- type: mrr_at_3
value: 36.003333333333345
- type: mrr_at_5
value: 37.440916666666666
- type: ndcg_at_1
value: 29.66441666666667
- type: ndcg_at_10
value: 39.978416666666675
- type: ndcg_at_100
value: 45.278666666666666
- type: ndcg_at_1000
value: 47.52275
- type: ndcg_at_3
value: 35.00058333333334
- type: ndcg_at_5
value: 37.34908333333333
- type: precision_at_1
value: 29.66441666666667
- type: precision_at_10
value: 7.094500000000001
- type: precision_at_100
value: 1.1523333333333332
- type: precision_at_1000
value: 0.15358333333333332
- type: precision_at_3
value: 16.184166666666663
- type: precision_at_5
value: 11.6005
- type: recall_at_1
value: 25.121666666666663
- type: recall_at_10
value: 52.23975000000001
- type: recall_at_100
value: 75.48408333333333
- type: recall_at_1000
value: 90.95316666666668
- type: recall_at_3
value: 38.38458333333333
- type: recall_at_5
value: 44.39933333333333
- type: map_at_1
value: 23.569000000000003
- type: map_at_10
value: 30.389
- type: map_at_100
value: 31.396
- type: map_at_1000
value: 31.493
- type: map_at_3
value: 28.276
- type: map_at_5
value: 29.459000000000003
- type: mrr_at_1
value: 26.534000000000002
- type: mrr_at_10
value: 33.217999999999996
- type: mrr_at_100
value: 34.054
- type: mrr_at_1000
value: 34.12
- type: mrr_at_3
value: 31.058000000000003
- type: mrr_at_5
value: 32.330999999999996
- type: ndcg_at_1
value: 26.534000000000002
- type: ndcg_at_10
value: 34.608
- type: ndcg_at_100
value: 39.391999999999996
- type: ndcg_at_1000
value: 41.837999999999994
- type: ndcg_at_3
value: 30.564999999999998
- type: ndcg_at_5
value: 32.509
- type: precision_at_1
value: 26.534000000000002
- type: precision_at_10
value: 5.414
- type: precision_at_100
value: 0.847
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 12.986
- type: precision_at_5
value: 9.202
- type: recall_at_1
value: 23.569000000000003
- type: recall_at_10
value: 44.896
- type: recall_at_100
value: 66.476
- type: recall_at_1000
value: 84.548
- type: recall_at_3
value: 33.79
- type: recall_at_5
value: 38.512
- type: map_at_1
value: 16.36
- type: map_at_10
value: 23.57
- type: map_at_100
value: 24.698999999999998
- type: map_at_1000
value: 24.834999999999997
- type: map_at_3
value: 21.093
- type: map_at_5
value: 22.418
- type: mrr_at_1
value: 19.718
- type: mrr_at_10
value: 27.139999999999997
- type: mrr_at_100
value: 28.097
- type: mrr_at_1000
value: 28.177999999999997
- type: mrr_at_3
value: 24.805
- type: mrr_at_5
value: 26.121
- type: ndcg_at_1
value: 19.718
- type: ndcg_at_10
value: 28.238999999999997
- type: ndcg_at_100
value: 33.663
- type: ndcg_at_1000
value: 36.763
- type: ndcg_at_3
value: 23.747
- type: ndcg_at_5
value: 25.796000000000003
- type: precision_at_1
value: 19.718
- type: precision_at_10
value: 5.282
- type: precision_at_100
value: 0.9390000000000001
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 11.264000000000001
- type: precision_at_5
value: 8.341
- type: recall_at_1
value: 16.36
- type: recall_at_10
value: 38.669
- type: recall_at_100
value: 63.184
- type: recall_at_1000
value: 85.33800000000001
- type: recall_at_3
value: 26.214
- type: recall_at_5
value: 31.423000000000002
- type: map_at_1
value: 25.618999999999996
- type: map_at_10
value: 34.361999999999995
- type: map_at_100
value: 35.534
- type: map_at_1000
value: 35.634
- type: map_at_3
value: 31.402
- type: map_at_5
value: 32.815
- type: mrr_at_1
value: 30.037000000000003
- type: mrr_at_10
value: 38.284
- type: mrr_at_100
value: 39.141999999999996
- type: mrr_at_1000
value: 39.2
- type: mrr_at_3
value: 35.603
- type: mrr_at_5
value: 36.867
- type: ndcg_at_1
value: 30.037000000000003
- type: ndcg_at_10
value: 39.87
- type: ndcg_at_100
value: 45.243
- type: ndcg_at_1000
value: 47.507
- type: ndcg_at_3
value: 34.371
- type: ndcg_at_5
value: 36.521
- type: precision_at_1
value: 30.037000000000003
- type: precision_at_10
value: 6.819
- type: precision_at_100
value: 1.0699999999999998
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 15.392
- type: precision_at_5
value: 10.821
- type: recall_at_1
value: 25.618999999999996
- type: recall_at_10
value: 52.869
- type: recall_at_100
value: 76.395
- type: recall_at_1000
value: 92.19500000000001
- type: recall_at_3
value: 37.943
- type: recall_at_5
value: 43.342999999999996
- type: map_at_1
value: 23.283
- type: map_at_10
value: 32.155
- type: map_at_100
value: 33.724
- type: map_at_1000
value: 33.939
- type: map_at_3
value: 29.018
- type: map_at_5
value: 30.864000000000004
- type: mrr_at_1
value: 28.063
- type: mrr_at_10
value: 36.632
- type: mrr_at_100
value: 37.606
- type: mrr_at_1000
value: 37.671
- type: mrr_at_3
value: 33.992
- type: mrr_at_5
value: 35.613
- type: ndcg_at_1
value: 28.063
- type: ndcg_at_10
value: 38.024
- type: ndcg_at_100
value: 44.292
- type: ndcg_at_1000
value: 46.818
- type: ndcg_at_3
value: 32.965
- type: ndcg_at_5
value: 35.562
- type: precision_at_1
value: 28.063
- type: precision_at_10
value: 7.352
- type: precision_at_100
value: 1.514
- type: precision_at_1000
value: 0.23800000000000002
- type: precision_at_3
value: 15.481
- type: precision_at_5
value: 11.542
- type: recall_at_1
value: 23.283
- type: recall_at_10
value: 49.756
- type: recall_at_100
value: 78.05
- type: recall_at_1000
value: 93.854
- type: recall_at_3
value: 35.408
- type: recall_at_5
value: 42.187000000000005
- type: map_at_1
value: 19.201999999999998
- type: map_at_10
value: 26.826
- type: map_at_100
value: 27.961000000000002
- type: map_at_1000
value: 28.066999999999997
- type: map_at_3
value: 24.237000000000002
- type: map_at_5
value: 25.811
- type: mrr_at_1
value: 20.887
- type: mrr_at_10
value: 28.660000000000004
- type: mrr_at_100
value: 29.660999999999998
- type: mrr_at_1000
value: 29.731
- type: mrr_at_3
value: 26.155
- type: mrr_at_5
value: 27.68
- type: ndcg_at_1
value: 20.887
- type: ndcg_at_10
value: 31.523
- type: ndcg_at_100
value: 37.055
- type: ndcg_at_1000
value: 39.579
- type: ndcg_at_3
value: 26.529000000000003
- type: ndcg_at_5
value: 29.137
- type: precision_at_1
value: 20.887
- type: precision_at_10
value: 5.065
- type: precision_at_100
value: 0.856
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_3
value: 11.399
- type: precision_at_5
value: 8.392
- type: recall_at_1
value: 19.201999999999998
- type: recall_at_10
value: 44.285000000000004
- type: recall_at_100
value: 69.768
- type: recall_at_1000
value: 88.302
- type: recall_at_3
value: 30.804
- type: recall_at_5
value: 37.039
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 11.244
- type: map_at_10
value: 18.956
- type: map_at_100
value: 20.674
- type: map_at_1000
value: 20.863
- type: map_at_3
value: 15.923000000000002
- type: map_at_5
value: 17.518
- type: mrr_at_1
value: 25.080999999999996
- type: mrr_at_10
value: 35.94
- type: mrr_at_100
value: 36.969
- type: mrr_at_1000
value: 37.013
- type: mrr_at_3
value: 32.617000000000004
- type: mrr_at_5
value: 34.682
- type: ndcg_at_1
value: 25.080999999999996
- type: ndcg_at_10
value: 26.539
- type: ndcg_at_100
value: 33.601
- type: ndcg_at_1000
value: 37.203
- type: ndcg_at_3
value: 21.695999999999998
- type: ndcg_at_5
value: 23.567
- type: precision_at_1
value: 25.080999999999996
- type: precision_at_10
value: 8.143
- type: precision_at_100
value: 1.5650000000000002
- type: precision_at_1000
value: 0.22300000000000003
- type: precision_at_3
value: 15.983
- type: precision_at_5
value: 12.417
- type: recall_at_1
value: 11.244
- type: recall_at_10
value: 31.457
- type: recall_at_100
value: 55.92
- type: recall_at_1000
value: 76.372
- type: recall_at_3
value: 19.784
- type: recall_at_5
value: 24.857000000000003
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.595
- type: map_at_10
value: 18.75
- type: map_at_100
value: 26.354
- type: map_at_1000
value: 27.912
- type: map_at_3
value: 13.794
- type: map_at_5
value: 16.021
- type: mrr_at_1
value: 65.75
- type: mrr_at_10
value: 73.837
- type: mrr_at_100
value: 74.22800000000001
- type: mrr_at_1000
value: 74.234
- type: mrr_at_3
value: 72.5
- type: mrr_at_5
value: 73.387
- type: ndcg_at_1
value: 52.625
- type: ndcg_at_10
value: 39.101
- type: ndcg_at_100
value: 43.836000000000006
- type: ndcg_at_1000
value: 51.086
- type: ndcg_at_3
value: 44.229
- type: ndcg_at_5
value: 41.555
- type: precision_at_1
value: 65.75
- type: precision_at_10
value: 30.45
- type: precision_at_100
value: 9.81
- type: precision_at_1000
value: 2.045
- type: precision_at_3
value: 48.667
- type: precision_at_5
value: 40.8
- type: recall_at_1
value: 8.595
- type: recall_at_10
value: 24.201
- type: recall_at_100
value: 50.096
- type: recall_at_1000
value: 72.677
- type: recall_at_3
value: 15.212
- type: recall_at_5
value: 18.745
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 46.565
- type: f1
value: 41.49914329345582
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 66.60000000000001
- type: map_at_10
value: 76.838
- type: map_at_100
value: 77.076
- type: map_at_1000
value: 77.09
- type: map_at_3
value: 75.545
- type: map_at_5
value: 76.39
- type: mrr_at_1
value: 71.707
- type: mrr_at_10
value: 81.514
- type: mrr_at_100
value: 81.64099999999999
- type: mrr_at_1000
value: 81.645
- type: mrr_at_3
value: 80.428
- type: mrr_at_5
value: 81.159
- type: ndcg_at_1
value: 71.707
- type: ndcg_at_10
value: 81.545
- type: ndcg_at_100
value: 82.477
- type: ndcg_at_1000
value: 82.73899999999999
- type: ndcg_at_3
value: 79.292
- type: ndcg_at_5
value: 80.599
- type: precision_at_1
value: 71.707
- type: precision_at_10
value: 10.035
- type: precision_at_100
value: 1.068
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 30.918
- type: precision_at_5
value: 19.328
- type: recall_at_1
value: 66.60000000000001
- type: recall_at_10
value: 91.353
- type: recall_at_100
value: 95.21
- type: recall_at_1000
value: 96.89999999999999
- type: recall_at_3
value: 85.188
- type: recall_at_5
value: 88.52
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 19.338
- type: map_at_10
value: 31.752000000000002
- type: map_at_100
value: 33.516
- type: map_at_1000
value: 33.694
- type: map_at_3
value: 27.716
- type: map_at_5
value: 29.67
- type: mrr_at_1
value: 38.117000000000004
- type: mrr_at_10
value: 47.323
- type: mrr_at_100
value: 48.13
- type: mrr_at_1000
value: 48.161
- type: mrr_at_3
value: 45.062000000000005
- type: mrr_at_5
value: 46.358
- type: ndcg_at_1
value: 38.117000000000004
- type: ndcg_at_10
value: 39.353
- type: ndcg_at_100
value: 46.044000000000004
- type: ndcg_at_1000
value: 49.083
- type: ndcg_at_3
value: 35.891
- type: ndcg_at_5
value: 36.661
- type: precision_at_1
value: 38.117000000000004
- type: precision_at_10
value: 11.187999999999999
- type: precision_at_100
value: 1.802
- type: precision_at_1000
value: 0.234
- type: precision_at_3
value: 24.126
- type: precision_at_5
value: 17.562
- type: recall_at_1
value: 19.338
- type: recall_at_10
value: 45.735
- type: recall_at_100
value: 71.281
- type: recall_at_1000
value: 89.537
- type: recall_at_3
value: 32.525
- type: recall_at_5
value: 37.671
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.995
- type: map_at_10
value: 55.032000000000004
- type: map_at_100
value: 55.86
- type: map_at_1000
value: 55.932
- type: map_at_3
value: 52.125
- type: map_at_5
value: 53.884
- type: mrr_at_1
value: 73.991
- type: mrr_at_10
value: 80.096
- type: mrr_at_100
value: 80.32000000000001
- type: mrr_at_1000
value: 80.331
- type: mrr_at_3
value: 79.037
- type: mrr_at_5
value: 79.719
- type: ndcg_at_1
value: 73.991
- type: ndcg_at_10
value: 63.786
- type: ndcg_at_100
value: 66.78
- type: ndcg_at_1000
value: 68.255
- type: ndcg_at_3
value: 59.501000000000005
- type: ndcg_at_5
value: 61.82299999999999
- type: precision_at_1
value: 73.991
- type: precision_at_10
value: 13.157
- type: precision_at_100
value: 1.552
- type: precision_at_1000
value: 0.17500000000000002
- type: precision_at_3
value: 37.519999999999996
- type: precision_at_5
value: 24.351
- type: recall_at_1
value: 36.995
- type: recall_at_10
value: 65.78699999999999
- type: recall_at_100
value: 77.583
- type: recall_at_1000
value: 87.421
- type: recall_at_3
value: 56.279999999999994
- type: recall_at_5
value: 60.878
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 86.80239999999999
- type: ap
value: 81.97305141128378
- type: f1
value: 86.76976305549273
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.166
- type: map_at_10
value: 33.396
- type: map_at_100
value: 34.588
- type: map_at_1000
value: 34.637
- type: map_at_3
value: 29.509999999999998
- type: map_at_5
value: 31.719
- type: mrr_at_1
value: 21.762
- type: mrr_at_10
value: 33.969
- type: mrr_at_100
value: 35.099000000000004
- type: mrr_at_1000
value: 35.141
- type: mrr_at_3
value: 30.148000000000003
- type: mrr_at_5
value: 32.324000000000005
- type: ndcg_at_1
value: 21.776999999999997
- type: ndcg_at_10
value: 40.306999999999995
- type: ndcg_at_100
value: 46.068
- type: ndcg_at_1000
value: 47.3
- type: ndcg_at_3
value: 32.416
- type: ndcg_at_5
value: 36.345
- type: precision_at_1
value: 21.776999999999997
- type: precision_at_10
value: 6.433
- type: precision_at_100
value: 0.932
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 13.897
- type: precision_at_5
value: 10.324
- type: recall_at_1
value: 21.166
- type: recall_at_10
value: 61.587
- type: recall_at_100
value: 88.251
- type: recall_at_1000
value: 97.727
- type: recall_at_3
value: 40.196
- type: recall_at_5
value: 49.611
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.04605563155496
- type: f1
value: 92.78007303978372
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 69.65116279069767
- type: f1
value: 52.75775172527262
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.34633490248822
- type: f1
value: 68.15345065392562
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 75.63887020847343
- type: f1
value: 76.08074680233685
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.77933406071333
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 32.06504927238196
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.20682480490871
- type: mrr
value: 33.41462721527003
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.548
- type: map_at_10
value: 13.086999999999998
- type: map_at_100
value: 16.698
- type: map_at_1000
value: 18.151999999999997
- type: map_at_3
value: 9.576
- type: map_at_5
value: 11.175
- type: mrr_at_1
value: 44.272
- type: mrr_at_10
value: 53.635999999999996
- type: mrr_at_100
value: 54.228
- type: mrr_at_1000
value: 54.26499999999999
- type: mrr_at_3
value: 51.754
- type: mrr_at_5
value: 53.086
- type: ndcg_at_1
value: 42.724000000000004
- type: ndcg_at_10
value: 34.769
- type: ndcg_at_100
value: 32.283
- type: ndcg_at_1000
value: 40.843
- type: ndcg_at_3
value: 39.852
- type: ndcg_at_5
value: 37.858999999999995
- type: precision_at_1
value: 44.272
- type: precision_at_10
value: 26.068
- type: precision_at_100
value: 8.328000000000001
- type: precision_at_1000
value: 2.1
- type: precision_at_3
value: 37.874
- type: precision_at_5
value: 33.065
- type: recall_at_1
value: 5.548
- type: recall_at_10
value: 16.936999999999998
- type: recall_at_100
value: 33.72
- type: recall_at_1000
value: 64.348
- type: recall_at_3
value: 10.764999999999999
- type: recall_at_5
value: 13.361
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.008
- type: map_at_10
value: 42.675000000000004
- type: map_at_100
value: 43.85
- type: map_at_1000
value: 43.884
- type: map_at_3
value: 38.286
- type: map_at_5
value: 40.78
- type: mrr_at_1
value: 31.518
- type: mrr_at_10
value: 45.015
- type: mrr_at_100
value: 45.924
- type: mrr_at_1000
value: 45.946999999999996
- type: mrr_at_3
value: 41.348
- type: mrr_at_5
value: 43.428
- type: ndcg_at_1
value: 31.489
- type: ndcg_at_10
value: 50.285999999999994
- type: ndcg_at_100
value: 55.291999999999994
- type: ndcg_at_1000
value: 56.05
- type: ndcg_at_3
value: 41.976
- type: ndcg_at_5
value: 46.103
- type: precision_at_1
value: 31.489
- type: precision_at_10
value: 8.456
- type: precision_at_100
value: 1.125
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 19.09
- type: precision_at_5
value: 13.841000000000001
- type: recall_at_1
value: 28.008
- type: recall_at_10
value: 71.21499999999999
- type: recall_at_100
value: 92.99
- type: recall_at_1000
value: 98.578
- type: recall_at_3
value: 49.604
- type: recall_at_5
value: 59.094
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.351
- type: map_at_10
value: 84.163
- type: map_at_100
value: 84.785
- type: map_at_1000
value: 84.801
- type: map_at_3
value: 81.16
- type: map_at_5
value: 83.031
- type: mrr_at_1
value: 80.96
- type: mrr_at_10
value: 87.241
- type: mrr_at_100
value: 87.346
- type: mrr_at_1000
value: 87.347
- type: mrr_at_3
value: 86.25699999999999
- type: mrr_at_5
value: 86.907
- type: ndcg_at_1
value: 80.97
- type: ndcg_at_10
value: 88.017
- type: ndcg_at_100
value: 89.241
- type: ndcg_at_1000
value: 89.34299999999999
- type: ndcg_at_3
value: 85.053
- type: ndcg_at_5
value: 86.663
- type: precision_at_1
value: 80.97
- type: precision_at_10
value: 13.358
- type: precision_at_100
value: 1.525
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.143
- type: precision_at_5
value: 24.451999999999998
- type: recall_at_1
value: 70.351
- type: recall_at_10
value: 95.39800000000001
- type: recall_at_100
value: 99.55199999999999
- type: recall_at_1000
value: 99.978
- type: recall_at_3
value: 86.913
- type: recall_at_5
value: 91.448
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 55.62406719814139
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 61.386700035141736
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.618
- type: map_at_10
value: 12.920000000000002
- type: map_at_100
value: 15.304
- type: map_at_1000
value: 15.656999999999998
- type: map_at_3
value: 9.187
- type: map_at_5
value: 10.937
- type: mrr_at_1
value: 22.8
- type: mrr_at_10
value: 35.13
- type: mrr_at_100
value: 36.239
- type: mrr_at_1000
value: 36.291000000000004
- type: mrr_at_3
value: 31.917
- type: mrr_at_5
value: 33.787
- type: ndcg_at_1
value: 22.8
- type: ndcg_at_10
value: 21.382
- type: ndcg_at_100
value: 30.257
- type: ndcg_at_1000
value: 36.001
- type: ndcg_at_3
value: 20.43
- type: ndcg_at_5
value: 17.622
- type: precision_at_1
value: 22.8
- type: precision_at_10
value: 11.26
- type: precision_at_100
value: 2.405
- type: precision_at_1000
value: 0.377
- type: precision_at_3
value: 19.633
- type: precision_at_5
value: 15.68
- type: recall_at_1
value: 4.618
- type: recall_at_10
value: 22.811999999999998
- type: recall_at_100
value: 48.787000000000006
- type: recall_at_1000
value: 76.63799999999999
- type: recall_at_3
value: 11.952
- type: recall_at_5
value: 15.892000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.01529458252244
- type: cos_sim_spearman
value: 77.92985224770254
- type: euclidean_pearson
value: 81.04251429422487
- type: euclidean_spearman
value: 77.92838490549133
- type: manhattan_pearson
value: 80.95892251458979
- type: manhattan_spearman
value: 77.81028089705941
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 83.97885282534388
- type: cos_sim_spearman
value: 75.1221970851712
- type: euclidean_pearson
value: 80.34455956720097
- type: euclidean_spearman
value: 74.5894274239938
- type: manhattan_pearson
value: 80.38999766325465
- type: manhattan_spearman
value: 74.68524557166975
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 82.95746064915672
- type: cos_sim_spearman
value: 85.08683458043946
- type: euclidean_pearson
value: 84.56699492836385
- type: euclidean_spearman
value: 85.66089116133713
- type: manhattan_pearson
value: 84.47553323458541
- type: manhattan_spearman
value: 85.56142206781472
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.71377893595067
- type: cos_sim_spearman
value: 81.03453291428589
- type: euclidean_pearson
value: 82.57136298308613
- type: euclidean_spearman
value: 81.15839961890875
- type: manhattan_pearson
value: 82.55157879373837
- type: manhattan_spearman
value: 81.1540163767054
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.64197832372373
- type: cos_sim_spearman
value: 88.31966852492485
- type: euclidean_pearson
value: 87.98692129976983
- type: euclidean_spearman
value: 88.6247340837856
- type: manhattan_pearson
value: 87.90437827826412
- type: manhattan_spearman
value: 88.56278787131457
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 81.84159950146693
- type: cos_sim_spearman
value: 83.90678384140168
- type: euclidean_pearson
value: 83.19005018860221
- type: euclidean_spearman
value: 84.16260415876295
- type: manhattan_pearson
value: 83.05030612994494
- type: manhattan_spearman
value: 83.99605629718336
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.49935350176666
- type: cos_sim_spearman
value: 87.59086606735383
- type: euclidean_pearson
value: 88.06537181129983
- type: euclidean_spearman
value: 87.6687448086014
- type: manhattan_pearson
value: 87.96599131972935
- type: manhattan_spearman
value: 87.63295748969642
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 67.68232799482763
- type: cos_sim_spearman
value: 67.99930378085793
- type: euclidean_pearson
value: 68.50275360001696
- type: euclidean_spearman
value: 67.81588179309259
- type: manhattan_pearson
value: 68.5892154749763
- type: manhattan_spearman
value: 67.84357259640682
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.37049618406554
- type: cos_sim_spearman
value: 85.57014313159492
- type: euclidean_pearson
value: 85.57469513908282
- type: euclidean_spearman
value: 85.661948135258
- type: manhattan_pearson
value: 85.36866831229028
- type: manhattan_spearman
value: 85.5043455368843
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 84.83259065376154
- type: mrr
value: 95.58455433455433
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 58.817
- type: map_at_10
value: 68.459
- type: map_at_100
value: 68.951
- type: map_at_1000
value: 68.979
- type: map_at_3
value: 65.791
- type: map_at_5
value: 67.583
- type: mrr_at_1
value: 61.667
- type: mrr_at_10
value: 69.368
- type: mrr_at_100
value: 69.721
- type: mrr_at_1000
value: 69.744
- type: mrr_at_3
value: 67.278
- type: mrr_at_5
value: 68.611
- type: ndcg_at_1
value: 61.667
- type: ndcg_at_10
value: 72.70100000000001
- type: ndcg_at_100
value: 74.928
- type: ndcg_at_1000
value: 75.553
- type: ndcg_at_3
value: 68.203
- type: ndcg_at_5
value: 70.804
- type: precision_at_1
value: 61.667
- type: precision_at_10
value: 9.533
- type: precision_at_100
value: 1.077
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 26.444000000000003
- type: precision_at_5
value: 17.599999999999998
- type: recall_at_1
value: 58.817
- type: recall_at_10
value: 84.789
- type: recall_at_100
value: 95.0
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 72.8
- type: recall_at_5
value: 79.294
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.8108910891089
- type: cos_sim_ap
value: 95.5743678558349
- type: cos_sim_f1
value: 90.43133366385722
- type: cos_sim_precision
value: 89.67551622418878
- type: cos_sim_recall
value: 91.2
- type: dot_accuracy
value: 99.75841584158415
- type: dot_ap
value: 94.00786363627253
- type: dot_f1
value: 87.51910341314316
- type: dot_precision
value: 89.20041536863967
- type: dot_recall
value: 85.9
- type: euclidean_accuracy
value: 99.81485148514851
- type: euclidean_ap
value: 95.4752113136905
- type: euclidean_f1
value: 90.44334975369456
- type: euclidean_precision
value: 89.126213592233
- type: euclidean_recall
value: 91.8
- type: manhattan_accuracy
value: 99.81584158415842
- type: manhattan_ap
value: 95.5163172682464
- type: manhattan_f1
value: 90.51987767584097
- type: manhattan_precision
value: 92.3076923076923
- type: manhattan_recall
value: 88.8
- type: max_accuracy
value: 99.81584158415842
- type: max_ap
value: 95.5743678558349
- type: max_f1
value: 90.51987767584097
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 62.63235986949449
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.334795589585575
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.02955214518782
- type: mrr
value: 52.8004838298956
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.63769566275453
- type: cos_sim_spearman
value: 30.422379185989335
- type: dot_pearson
value: 26.88493071882256
- type: dot_spearman
value: 26.505249740971305
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.21
- type: map_at_10
value: 1.654
- type: map_at_100
value: 10.095
- type: map_at_1000
value: 25.808999999999997
- type: map_at_3
value: 0.594
- type: map_at_5
value: 0.9289999999999999
- type: mrr_at_1
value: 78.0
- type: mrr_at_10
value: 87.019
- type: mrr_at_100
value: 87.019
- type: mrr_at_1000
value: 87.019
- type: mrr_at_3
value: 86.333
- type: mrr_at_5
value: 86.733
- type: ndcg_at_1
value: 73.0
- type: ndcg_at_10
value: 66.52900000000001
- type: ndcg_at_100
value: 53.433
- type: ndcg_at_1000
value: 51.324000000000005
- type: ndcg_at_3
value: 72.02199999999999
- type: ndcg_at_5
value: 69.696
- type: precision_at_1
value: 78.0
- type: precision_at_10
value: 70.39999999999999
- type: precision_at_100
value: 55.46
- type: precision_at_1000
value: 22.758
- type: precision_at_3
value: 76.667
- type: precision_at_5
value: 74.0
- type: recall_at_1
value: 0.21
- type: recall_at_10
value: 1.8849999999999998
- type: recall_at_100
value: 13.801
- type: recall_at_1000
value: 49.649
- type: recall_at_3
value: 0.632
- type: recall_at_5
value: 1.009
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.797
- type: map_at_10
value: 9.01
- type: map_at_100
value: 14.682
- type: map_at_1000
value: 16.336000000000002
- type: map_at_3
value: 4.546
- type: map_at_5
value: 5.9270000000000005
- type: mrr_at_1
value: 24.490000000000002
- type: mrr_at_10
value: 41.156
- type: mrr_at_100
value: 42.392
- type: mrr_at_1000
value: 42.408
- type: mrr_at_3
value: 38.775999999999996
- type: mrr_at_5
value: 40.102
- type: ndcg_at_1
value: 21.429000000000002
- type: ndcg_at_10
value: 22.222
- type: ndcg_at_100
value: 34.405
- type: ndcg_at_1000
value: 46.599000000000004
- type: ndcg_at_3
value: 25.261
- type: ndcg_at_5
value: 22.695999999999998
- type: precision_at_1
value: 24.490000000000002
- type: precision_at_10
value: 19.796
- type: precision_at_100
value: 7.306
- type: precision_at_1000
value: 1.5350000000000001
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 22.857
- type: recall_at_1
value: 1.797
- type: recall_at_10
value: 15.706000000000001
- type: recall_at_100
value: 46.412
- type: recall_at_1000
value: 83.159
- type: recall_at_3
value: 6.1370000000000005
- type: recall_at_5
value: 8.599
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.3302
- type: ap
value: 14.169121204575601
- type: f1
value: 54.229345975274235
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 58.22297679683077
- type: f1
value: 58.62984908377875
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.952922428464255
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 84.68140907194373
- type: cos_sim_ap
value: 70.12180123666836
- type: cos_sim_f1
value: 65.77501791258658
- type: cos_sim_precision
value: 60.07853403141361
- type: cos_sim_recall
value: 72.66490765171504
- type: dot_accuracy
value: 81.92167848840674
- type: dot_ap
value: 60.49837581423469
- type: dot_f1
value: 58.44186046511628
- type: dot_precision
value: 52.24532224532224
- type: dot_recall
value: 66.3060686015831
- type: euclidean_accuracy
value: 84.73505394289802
- type: euclidean_ap
value: 70.3278904593286
- type: euclidean_f1
value: 65.98851124940161
- type: euclidean_precision
value: 60.38107752956636
- type: euclidean_recall
value: 72.74406332453826
- type: manhattan_accuracy
value: 84.73505394289802
- type: manhattan_ap
value: 70.00737738537337
- type: manhattan_f1
value: 65.80150784822642
- type: manhattan_precision
value: 61.892583120204606
- type: manhattan_recall
value: 70.23746701846966
- type: max_accuracy
value: 84.73505394289802
- type: max_ap
value: 70.3278904593286
- type: max_f1
value: 65.98851124940161
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.44258159661582
- type: cos_sim_ap
value: 84.91926704880888
- type: cos_sim_f1
value: 77.07651086632926
- type: cos_sim_precision
value: 74.5894554883319
- type: cos_sim_recall
value: 79.73514012935017
- type: dot_accuracy
value: 85.88116583226608
- type: dot_ap
value: 78.9753854779923
- type: dot_f1
value: 72.17757637979255
- type: dot_precision
value: 66.80647486729143
- type: dot_recall
value: 78.48783492454572
- type: euclidean_accuracy
value: 88.5299025885823
- type: euclidean_ap
value: 85.08006075642194
- type: euclidean_f1
value: 77.29637336504163
- type: euclidean_precision
value: 74.69836253950014
- type: euclidean_recall
value: 80.08161379735141
- type: manhattan_accuracy
value: 88.55124771995187
- type: manhattan_ap
value: 85.00941529932851
- type: manhattan_f1
value: 77.33100233100232
- type: manhattan_precision
value: 73.37572573956317
- type: manhattan_recall
value: 81.73698798891284
- type: max_accuracy
value: 88.55124771995187
- type: max_ap
value: 85.08006075642194
- type: max_f1
value: 77.33100233100232
---
# ggml-org/gte-small-Q8_0-GGUF
This model was converted to GGUF format from [`thenlper/gte-small`](https://huggingface.co/thenlper/gte-small) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/thenlper/gte-small) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo ggml-org/gte-small-Q8_0-GGUF --hf-file gte-small-q8_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo ggml-org/gte-small-Q8_0-GGUF --hf-file gte-small-q8_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g. `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo ggml-org/gte-small-Q8_0-GGUF --hf-file gte-small-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo ggml-org/gte-small-Q8_0-GGUF --hf-file gte-small-q8_0.gguf -c 2048
```
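Since `gte-small` is an embedding model rather than a chat model, you will usually want vectors instead of completions. The sketch below is one way to query a locally running `llama-server` for embeddings; it assumes the server was started with the `--embeddings` flag and that your llama.cpp build exposes the OpenAI-compatible `/v1/embeddings` route on port 8080 (the flag, route, port, and the `embed` helper are assumptions to verify against your llama.cpp version, not part of the original instructions above).
```python
# Minimal sketch (not from the original card): query embeddings from a local llama-server.
# Assumes the server was launched with something like:
#   llama-server --hf-repo ggml-org/gte-small-Q8_0-GGUF --hf-file gte-small-q8_0.gguf --embeddings
# and that this build exposes the OpenAI-compatible /v1/embeddings endpoint on port 8080.
import requests

def embed(texts, url="http://localhost:8080/v1/embeddings"):
    # The endpoint accepts a string or a list of strings under "input"
    # and returns one embedding per input under "data".
    response = requests.post(url, json={"input": texts})
    response.raise_for_status()
    return [item["embedding"] for item in response.json()["data"]]

if __name__ == "__main__":
    vectors = embed(["What is the capital of France?", "Paris is the capital of France."])
    print(f"{len(vectors)} embeddings of dimension {len(vectors[0])}")
```
If your build only exposes the native `/embedding` endpoint instead, the request body and response shape differ slightly, so check your server version's documentation first.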
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
BSC-LT/salamandraTA-2B | BSC-LT | translation | [
"transformers",
"safetensors",
"llama",
"text-generation",
"translation",
"it",
"pt",
"de",
"en",
"es",
"eu",
"gl",
"fr",
"bg",
"cs",
"lt",
"hr",
"ca",
"nl",
"ro",
"da",
"el",
"fi",
"hu",
"sk",
"sl",
"et",
"pl",
"lv",
"mt",
"ga",
"sv",
"an",
"ast",
"oc",
"arxiv:2010.11125",
"arxiv:2403.14009",
"arxiv:1907.05791",
"arxiv:1911.04944",
"arxiv:2207.04672",
"base_model:BSC-LT/salamandra-2b",
"base_model:finetune:BSC-LT/salamandra-2b",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:eu"
] | 2024-10-28T08:43:09 | 2025-03-17T17:35:52 | 1,583 | 10 | ---
base_model:
- BSC-LT/salamandra-2b
language:
- it
- pt
- de
- en
- es
- eu
- gl
- fr
- bg
- cs
- lt
- hr
- ca
- nl
- ro
- da
- el
- fi
- hu
- sk
- sl
- et
- pl
- lv
- mt
- ga
- sv
- an
- ast
- oc
library_name: transformers
license: apache-2.0
pipeline_tag: translation
---

# SalamandraTA Model Card
SalamandraTA-2B is a machine translation model that has been continually pre-trained from [Salamandra 2B](https://huggingface.co/BSC-LT/salamandra-2b) on 70 billion tokens of parallel data in 30 different languages:
Catalan, Italian, Portuguese, German, English, Spanish, Euskera, Galician, French, Bulgarian, Czech, Lithuanian, Croatian, Dutch, Romanian, Danish, Greek, Finnish,
Hungarian, Slovak, Slovenian, Estonian, Polish, Latvian, Swedish, Maltese, Irish, Aranese, Aragonese, Asturian.
SalamandraTA-2B is the first model in the **SalamandraTA** series and is trained to handle sentence-level machine translation.
- **Developed by:** The Language Technologies Unit from Barcelona Supercomputing Center (BSC).
- **Model type:** A 2B parameter model continually pre-trained on 70 billion tokens.
- **Languages:** Catalan, Italian, Portuguese, German, English, Spanish, Euskera, Galician, French, Bulgarian, Czech, Lithuanian, Croatian, Dutch, Romanian, Danish, Greek, Finnish, Hungarian, Slovak, Slovenian, Estonian, Polish, Latvian, Swedish, Maltese, Irish, Aranese, Aragonese, Asturian.
- **License:** Apache License, Version 2.0
## Model Details
### Description
This machine translation model is built upon the foundation of [Salamandra 2B](https://huggingface.co/BSC-LT/salamandra-2b). By leveraging the knowledge of the base Salamandra 2B model,
this model is able to perform high-quality translations across **almost 900 translation directions**.
Key Features:
* **Continual Pretraining:** The model is trained on 70 billion tokens of parallel data. All data employed is open-sourced or generated from open-source data using the Machine Translation models at [BSC](https://huggingface.co/collections/projecte-aina/mt-models-655e154668c6dd132159081c)
* **Large Language Model Foundation:** Built on Salamandra 2B, providing a strong language understanding and generation capability.
* **Multilingual Support:** Capable of translating between 30 European languages, including low-resource languages.
* **High-Quality Translations:** Delivers accurate and fluent translations, thanks to its continual pretraining and large-scale dataset.
* **Efficient Inference:** Its 2 billion parameters offer a favourable trade-off between translation quality and the hardware requirements of most systems.
### Hyperparameters
The full list of hyperparameters for each model can be found [here](https://github.com/langtech-bsc/salamandra/tree/main/configs).
### Architecture
| | |
|-------------------------|:--------------|
| Total Parameters | 2,253,490,176 |
| Embedding Parameters | 524,288,000 |
| Layers | 24 |
| Hidden size | 2,048 |
| Attention heads | 16 |
| Context length | 8,192 |
| Vocabulary size | 256,000 |
| Precision | bfloat16 |
| Embedding type | RoPE |
| Activation Function | SwiGLU |
| Layer normalization | RMS Norm |
| Flash attention | ✅ |
| Grouped Query Attention | ❌ |
| Num. query groups | N/A |
---
## Intended Use
### Direct Use
The models are intended for both research and commercial use in any of the languages included in the training data.
The base models are intended for general machine translation tasks.
### Out-of-scope Use
The model is not intended for malicious activities, such as harming others or violating human rights.
Any downstream application must comply with current laws and regulations.
Irresponsible usage in production environments without proper risk assessment and mitigation is also discouraged.
---
## Hardware and Software
### Training Framework
Continual pre-training was conducted using [LLaMA-Factory framework](https://github.com/hiyouga/LLaMA-Factory).
### Compute Infrastructure
All models were trained on [MareNostrum 5](https://www.bsc.es/ca/marenostrum/marenostrum-5), a pre-exascale EuroHPC supercomputer hosted and
operated by Barcelona Supercomputing Center.
The accelerated partition is composed of 1,120 nodes with the following specifications:
- 4x Nvidia Hopper GPUs with 64 GB of HBM2 memory
- 2x Intel Sapphire Rapids 8460Y+ at 2.3 GHz with 32 cores each (64 cores total)
- 4x NDR200 (BW per node 800Gb/s)
- 512 GB of Main memory (DDR5)
- 460GB on NVMe storage
---
## How to use
To translate with the salamandraTA-2B model, first you need to create a prompt that specifies the source and target languages in this format:
```css
[source_language] sentence \n[target_language]
```
You can translate between these languages by using their names directly:
Italian, Portuguese, German, English, Spanish, Euskera, Galician, French, Bulgarian, Czech, Lithuanian, Croatian, Dutch, Romanian, Danish, Greek, Finnish,
Hungarian, Slovak, Slovenian, Estonian, Polish, Latvian, Swedish, Maltese, Irish, Aranese, Aragonese, Asturian.
### Inference
To translate a single sentence from Spanish to Catalan using Hugging Face's AutoModel class, you can use the following code:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
model_id = 'BSC-LT/salamandraTA-2b'
# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
# Move model to GPU if available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
src_lang_code = 'Spanish'
tgt_lang_code = 'Catalan'
sentence = 'Ayer se fue, tomó sus cosas y se puso a navegar.'
prompt = f'[{src_lang_code}] {sentence} \n[{tgt_lang_code}]'
# Tokenize and move inputs to the same device as the model
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(device)
output_ids = model.generate(input_ids, max_length=500, num_beams=5)
input_length = input_ids.shape[1]
generated_text = tokenizer.decode(output_ids[0, input_length:], skip_special_tokens=True).strip()
print(generated_text)
#Ahir se'n va anar, va agafar les seves coses i es va posar a navegar.
```
<br>
To run batch inference using Hugging Face's AutoModel class, you can use the following code.
<details>
<summary>Show code</summary>
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
model_id = 'BSC-LT/salamandraTA-2b'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, attn_implementation='eager')
# Move the model to GPU
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = model.to(device)
# List of sentences to translate
sentences = [
'Ayer se fue, tomó sus cosas y se puso a navegar.',
'Se despidió y decidió batirse en duelo con el mar, y recorrer el mundo en su velero',
'Su corazón buscó una forma diferente de vivir, pero las olas le gritaron: Vete con los demás',
'Y se durmió y la noche le gritó: Dónde vas, y en sus sueños dibujó gaviotas, y pensó: Hoy debo regresar.'
]
src_lang_code = 'Spanish'
tgt_lang_code = 'Catalan'
prompt = lambda x: f'[{src_lang_code}] {x} \n[{tgt_lang_code}]'
prompts = [prompt(x) for x in sentences]
encodings = tokenizer(prompts, return_tensors='pt', padding=True, add_special_tokens=True)
input_ids = encodings['input_ids'].to(model.device)
attention_mask = encodings['attention_mask'].to(model.device)
with torch.no_grad():
    outputs = model.generate(input_ids=input_ids, attention_mask=attention_mask, num_beams=5, max_length=256, early_stopping=True)

results_detokenized = []
for i, output in enumerate(outputs):
    input_length = input_ids[i].shape[0]
    generated_text = tokenizer.decode(output[input_length:], skip_special_tokens=True).strip()
    results_detokenized.append(generated_text)
print("Generated Translations:", results_detokenized)
#Generated Translations: ["Ahir se'n va anar, va agafar les seves coses i es va posar a navegar.",
#"Es va acomiadar i va decidir batre's en duel amb el mar, i recórrer el món en el seu veler",
#"El seu cor va buscar una forma diferent de viure, però les onades li van cridar: Vés amb els altres",
#"I es va adormir i la nit li va cridar: On vas, i en els seus somnis va dibuixar gavines, i va pensar: Avui he de tornar."]
```
</details>
## Data
### Pretraining Data
The training corpus consists of 70 billion tokens of Catalan- and Spanish-centric parallel data, including all of the official European languages plus Catalan, Basque,
Galician, Asturian, Aragonese and Aranese. It amounts to 3,157,965,012 parallel sentence pairs.
This highly multilingual corpus is predominantly composed of data sourced from [OPUS](https://opus.nlpl.eu/), with additional data taken from the [NTEU project](https://nteu.eu/) and Project Aina’s existing corpora.
Where little parallel Catalan <-> xx data could be found, synthetic Catalan data was generated from the Spanish side of the collected Spanish <-> xx corpora using
[Projecte Aina’s Spanish-Catalan model](https://huggingface.co/projecte-aina/aina-translator-es-ca). The final distribution of languages was as below:

Click the expand button below to see the full list of corpora included in the training data.
<details>
<summary>Data Sources</summary>
| Dataset | Ca-xx Languages | Es-xx Languages |
|-----------------------------------------------|----------------------------------------------------------------|-----------------------------------------------|
|[CCMatrix](https://opus.nlpl.eu/CCMatrix/corpus/version/CCMatrix) |eu | |
|[DGT](https://opus.nlpl.eu/DGT/corpus/version/DGT) | |bg,cs,da,de,el ,et,fi,fr,ga,hr,hu,lt,lv,mt,nl,pl,pt,ro,sk,sl,sv |
|[ELRC-EMEA](https://opus.nlpl.eu/ELRC-EMEA/corpus/version/ELRC-EMEA) | |bg,cs,da,hu,lt,lv,mt,pl,ro,sk,sl |
|[EMEA](https://opus.nlpl.eu/EMEA/corpus/version/EMEA) | |bg,cs,da,el,fi,hu,lt,mt,nl,pl,ro,sk,sl,sv |
|[EUBookshop](https://opus.nlpl.eu/EUbookshop/corpus/version/EUbookshop) |lt,pl,pt |cs,da,de,el,fi,fr,ga,it,lv,mt,nl,pl,pt,ro,sk,sl,sv |
|[Europarl](https://opus.nlpl.eu/Europarl/corpus/version/Europarl) | |bg,cs,da,el,fi,fr,hu,lt,lv,nl,pl,pt ,ro,sk,sl,sv |
|[Europat](https://opus.nlpl.eu/EuroPat/corpus/version/EuroPat) | |hr |
|[KDE4](https://opus.nlpl.eu/KDE4/corpus/version/KDE4) |bg,cs,da,de,el ,et,eu,fi,fr,ga,gl,hr,it,lt,lv,nl,pl,pt,ro,sk,sl,sv |bg,ga,hr |
|[GlobalVoices](https://opus.nlpl.eu/GlobalVoices/corpus/version/GlobalVoices) | bg,de,fr,it,nl,pl,pt |bg,de,fr,pt |
|[GNOME](https://opus.nlpl.eu/GNOME/corpus/version/GNOME) |eu,fr,ga,gl,pt |ga |
|[JRC-Arquis](https://opus.nlpl.eu/JRC-Acquis/corpus/version/JRC-Acquis) | |cs,da,et,fr,lt,lv,mt,nl,pl ,ro,sv|
|[MultiCCAligned](https://opus.nlpl.eu/JRC-Acquis/corpus/version/JRC-Acquis) |bg,cs,de,el,et,fi,fr,hr,hu,it,lt,lv,nl,pl,ro,sk,sv |bg,fi,fr,hr,it,lv,nl,pt |
|[MultiHPLT](https://opus.nlpl.eu/MultiHPLT/corpus/version/MultiHPLT) |et,fi,ga,hr,mt | |
|[MultiParaCrawl](https://opus.nlpl.eu/MultiParaCrawl/corpus/version/MultiParaCrawl) |bg,da |de,fr,ga,hr,hu,it,mt,pt |
|[MultiUN](https://opus.nlpl.eu/MultiUN/corpus/version/MultiUN) | |fr |
|[News-Commentary](https://opus.nlpl.eu/News-Commentary/corpus/version/News-Commentary) | |fr |
|[NLLB](https://opus.nlpl.eu/NLLB/corpus/version/NLLB) |bg,da,el,et,fi,fr,gl,hu,it ,lt,lv,pt,ro,sk,sl |bg,cs,da,de,el ,et,fi,fr,hu,it,lt,lv,nl,pl,pt ,ro,sk,sl,sv|
|[NTEU](https://www.elrc-share.eu/repository/search/?q=NTEU) | |bg,cs,da,de,el ,et,fi,fr,ga,hr,hu,it,lt,lv,mt,nl,pl,pt,ro,sk,sl,sv |
|[OpenSubtitles](https://opus.nlpl.eu/OpenSubtitles/corpus/version/OpenSubtitles) |bg,cs,da,de,el ,et,eu,fi,gl,hr,hu,lt,lv,nl,pl,pt,ro,sk,sl,sv |da,de,fi,fr,hr,hu,it,lv,nl |
|[Tatoeba](https://opus.nlpl.eu/Tatoeba/corpus/version/Tatoeba) |de,pt |pt |
|[TildeModel](https://opus.nlpl.eu/TildeMODEL/corpus/version/TildeMODEL) | |bg |
|[UNPC](https://opus.nlpl.eu/UNPC/corpus/version/UNPC) | |fr |
|[WikiMatrix](https://opus.nlpl.eu/WikiMatrix/corpus/version/WikiMatrix) |bg,cs,da,de,el ,et,eu,fi,fr,gl,hr,hu,it,lt,nl,pl,pt,ro,sk,sl,sv |bg,fr,hr,it,pt |
|[XLENT](https://opus.nlpl.eu/XLEnt/corpus/version/XLEnt) |eu,ga,gl |ga |
</details>
<details>
<summary>References</summary>
- Aulamo, M., Sulubacak, U., Virpioja, S., & Tiedemann, J. (2020). OpusTools and Parallel Corpus Diagnostics. In N. Calzolari, F. Béchet, P. Blache, K. Choukri, C. Cieri, T. Declerck, S. Goggi, H. Isahara, B. Maegaard, J. Mariani, H. Mazo, A. Moreno, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Twelfth Language Resources and Evaluation Conference (pp. 3782–3789). European Language Resources Association. https://aclanthology.org/2020.lrec-1.467
- Chaudhary, V., Tang, Y., Guzmán, F., Schwenk, H., & Koehn, P. (2019). Low-Resource Corpus Filtering Using Multilingual Sentence Embeddings. In O. Bojar, R. Chatterjee, C. Federmann, M. Fishel, Y. Graham, B. Haddow, M. Huck, A. J. Yepes, P. Koehn, A. Martins, C. Monz, M. Negri, A. Névéol, M. Neves, M. Post, M. Turchi, & K. Verspoor (Eds.), Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2) (pp. 261–266). Association for Computational Linguistics. https://doi.org/10.18653/v1/W19-5435
- DGT-Translation Memory—European Commission. (n.d.). Retrieved November 4, 2024, from https://joint-research-centre.ec.europa.eu/language-technology-resources/dgt-translation-memory_en
- Eisele, A., & Chen, Y. (2010). MultiUN: A Multilingual Corpus from United Nation Documents. In N. Calzolari, K. Choukri, B. Maegaard, J. Mariani, J. Odijk, S. Piperidis, M. Rosner, & D. Tapias (Eds.), Proceedings of the Seventh International Conference on Language Resources and Evaluation (LREC’10). European Language Resources Association (ELRA). http://www.lrec-conf.org/proceedings/lrec2010/pdf/686_Paper.pdf
- El-Kishky, A., Chaudhary, V., Guzmán, F., & Koehn, P. (2020). CCAligned: A Massive Collection of Cross-Lingual Web-Document Pairs. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 5960–5969. https://doi.org/10.18653/v1/2020.emnlp-main.480
- El-Kishky, A., Renduchintala, A., Cross, J., Guzmán, F., & Koehn, P. (2021). XLEnt: Mining a Large Cross-lingual Entity Dataset with Lexical-Semantic-Phonetic Word Alignment. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 10424–10430. https://doi.org/10.18653/v1/2021.emnlp-main.814
- Fan, A., Bhosale, S., Schwenk, H., Ma, Z., El-Kishky, A., Goyal, S., Baines, M., Celebi, O., Wenzek, G., Chaudhary, V., Goyal, N., Birch, T., Liptchinsky, V., Edunov, S., Grave, E., Auli, M., & Joulin, A. (2020). Beyond English-Centric Multilingual Machine Translation (No. arXiv:2010.11125). arXiv. https://doi.org/10.48550/arXiv.2010.11125
- García-Martínez, M., Bié, L., Cerdà, A., Estela, A., Herranz, M., Krišlauks, R., Melero, M., O’Dowd, T., O’Gorman, S., Pinnis, M., Stafanovič, A., Superbo, R., & Vasiļevskis, A. (2021). Neural Translation for European Union (NTEU). 316–334. https://aclanthology.org/2021.mtsummit-up.23
- Gibert, O. de, Nail, G., Arefyev, N., Bañón, M., Linde, J. van der, Ji, S., Zaragoza-Bernabeu, J., Aulamo, M., Ramírez-Sánchez, G., Kutuzov, A., Pyysalo, S., Oepen, S., & Tiedemann, J. (2024). A New Massive Multilingual Dataset for High-Performance Language Technologies (No. arXiv:2403.14009). arXiv. http://arxiv.org/abs/2403.14009
- Koehn, P. (2005). Europarl: A Parallel Corpus for Statistical Machine Translation. Proceedings of Machine Translation Summit X: Papers, 79–86. https://aclanthology.org/2005.mtsummit-papers.11
- Kreutzer, J., Caswell, I., Wang, L., Wahab, A., Van Esch, D., Ulzii-Orshikh, N., Tapo, A., Subramani, N., Sokolov, A., Sikasote, C., Setyawan, M., Sarin, S., Samb, S., Sagot, B., Rivera, C., Rios, A., Papadimitriou, I., Osei, S., Suarez, P. O., … Adeyemi, M. (2022). Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets. Transactions of the Association for Computational Linguistics, 10, 50–72. https://doi.org/10.1162/tacl_a_00447
- Rozis, R., & Skadiņš, R. (2017). Tilde MODEL - Multilingual Open Data for EU Languages. https://aclanthology.org/W17-0235
- Schwenk, H., Chaudhary, V., Sun, S., Gong, H., & Guzmán, F. (2019). WikiMatrix: Mining 135M Parallel Sentences in 1620 Language Pairs from Wikipedia (No. arXiv:1907.05791). arXiv. https://doi.org/10.48550/arXiv.1907.05791
- Schwenk, H., Wenzek, G., Edunov, S., Grave, E., & Joulin, A. (2020). CCMatrix: Mining Billions of High-Quality Parallel Sentences on the WEB (No. arXiv:1911.04944). arXiv. https://doi.org/10.48550/arXiv.1911.04944
- Steinberger, R., Pouliquen, B., Widiger, A., Ignat, C., Erjavec, T., Tufiş, D., & Varga, D. (n.d.). The JRC-Acquis: A Multilingual Aligned Parallel Corpus with 20+ Languages. http://www.lrec-conf.org/proceedings/lrec2006/pdf/340_pdf
- Subramani, N., Luccioni, S., Dodge, J., & Mitchell, M. (2023). Detecting Personal Information in Training Corpora: An Analysis. In A. Ovalle, K.-W. Chang, N. Mehrabi, Y. Pruksachatkun, A. Galystan, J. Dhamala, A. Verma, T. Cao, A. Kumar, & R. Gupta (Eds.), Proceedings of the 3rd Workshop on Trustworthy Natural Language Processing (TrustNLP 2023) (pp. 208–220). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.trustnlp-1.18
- Tiedemann, J. (2012). Parallel Data, Tools and Interfaces in OPUS. In N. C. (Conference Chair), K. Choukri, T. Declerck, M. U. Doğan, B. Maegaard, J. Mariani, A. Moreno, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC’12). European Language Resources Association (ELRA). http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper
- Ziemski, M., Junczys-Dowmunt, M., & Pouliquen, B. (n.d.). The United Nations Parallel Corpus v1.0. https://aclanthology.org/L16-1561
</details>
## Evaluation
Below are the evaluation results on Flores-200 dev and devtest compared to NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) for CA-XX
and XX-CA directions. The metrics have been computed excluding Asturian, Aranese, and Aragonese as we report them separately. The evaluation was conducted
using [MT Lens](https://github.com/langtech-bsc/mt-evaluation) following the standard setting (beam search with beam size 5, limiting the translation length to 250 tokens). We report the following metrics:
<details>
<summary>Click to show metrics details</summary>
- `BLEU`: Sacrebleu implementation. Signature: nrefs:1|case:mixed|eff:no|tok:13a|smooth:exp|version:2.3.1
- `TER`: Sacrebleu implementation.
- `ChrF`: Sacrebleu implementation.
- `Comet`: Model checkpoint: "Unbabel/wmt22-comet-da".
- `Comet-kiwi`: Model checkpoint: "Unbabel/wmt22-cometkiwi-da".
- `Bleurt`: Model checkpoint: "lucadiliello/BLEURT-20".
</details>
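For reference, the surface and neural metrics listed above can be reproduced outside MT Lens with the `sacrebleu` and `unbabel-comet` packages. The snippet below is a minimal, illustrative sketch (not the exact MT Lens pipeline); the sentences are placeholders, and only the reference-based COMET checkpoint is shown, with Comet-kiwi and BLEURT following the same pattern through their respective checkpoints and libraries.
```python
# Minimal sketch: corpus-level BLEU/TER/ChrF with sacrebleu, plus COMET with the
# "Unbabel/wmt22-comet-da" checkpoint. The sentences below are placeholders.
# pip install sacrebleu unbabel-comet
import sacrebleu
from comet import download_model, load_from_checkpoint

sources = ["El mercat del barri és fantàstic."]
hypotheses = ["The neighbourhood market is fantastic."]
references = ["The neighborhood market is fantastic."]

print(sacrebleu.corpus_bleu(hypotheses, [references]).score)   # BLEU
print(sacrebleu.corpus_ter(hypotheses, [references]).score)    # TER
print(sacrebleu.corpus_chrf(hypotheses, [references]).score)   # ChrF

# Reference-based COMET; scores are averaged into a single system-level score.
comet_model = load_from_checkpoint(download_model("Unbabel/wmt22-comet-da"))
data = [{"src": s, "mt": h, "ref": r} for s, h, r in zip(sources, hypotheses, references)]
print(comet_model.predict(data, batch_size=8, gpus=0).system_score)
```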
#### Flores200-dev
| | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|-------:|------:|-------:|--------:|-------------:|---------:|
| **CA-XX** | | | | | | |
| SalamandraTA-2B | **27.41** | **60.88** | **56.27** | 0.86 | 0.82 | 0.76 |
| nllb 3.3B | 26.84 | 61.75 | 55.7 | 0.86 | 0.82 | 0.76 |
| **XX-CA** | | | | | | |
| SalamandraTA-2B | **30.75** | **57.66** | **57.6** | 0.85 | 0.81 | 0.73 |
| nllb 3.3B | 29.76 | 58.25 | 56.75 | 0.85 | **0.82** | 0.73 |
<details>
<summary>Click to show full table CA-XX Flores-dev</summary>
| | source | target | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|:---------|:---------|-------:|------:|-------:|--------:|-------------:|---------:|
| nllb 3.3B | ca | sv | 33.05 | 53.98 | 60.09 | 0.88 | 0.83 | 0.79 |
| SalamandraTA-2B | ca | sv | 30.62 | 55.4 | 57.77 | 0.87 | 0.81 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | ca | sl | 25.74 | 63.78 | 54.29 | 0.88 | 0.83 | 0.81 |
| nllb 3.3B | ca | sl | 25.04 | 65.02 | 53.08 | 0.88 | 0.83 | 0.82 |
| | | | | | | | | |
| SalamandraTA-2B | ca | sk | 26.03 | 62.58 | 53.53 | 0.89 | 0.84 | 0.8 |
| nllb 3.3B | ca | sk | 25.59 | 63.17 | 53.28 | 0.89 | 0.84 | 0.8 |
| | | | | | | | | |
| SalamandraTA-2B | ca | ro | 33.08 | 54.36 | 59.18 | 0.89 | 0.85 | 0.8 |
| nllb 3.3B | ca | ro | 31.91 | 55.46 | 58.36 | 0.89 | 0.85 | 0.81 |
| | | | | | | | | |
| SalamandraTA-2B | ca | pt | 37.6 | 48.82 | 62.73 | 0.88 | 0.84 | 0.76 |
| nllb 3.3B | ca | pt | 36.85 | 49.56 | 62.02 | 0.88 | 0.85 | 0.76 |
| | | | | | | | | |
| nllb 3.3B | ca | pl | 17.97 | 73.06 | 47.94 | 0.88 | 0.84 | 0.78 |
| SalamandraTA-2B | ca | pl | 17.85 | 72.67 | 47.77 | 0.88 | 0.84 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | ca | nl | 23.88 | 64.95 | 54.46 | 0.85 | 0.84 | 0.75 |
| nllb 3.3B | ca | nl | 23.26 | 66.46 | 54.17 | 0.85 | 0.85 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ca | mt | 25.62 | 59.08 | 60.83 | 0.69 | 0.61 | 0.43 |
| nllb 3.3B | ca | mt | 25.37 | 59.47 | 60.1 | 0.69 | 0.63 | 0.39 |
| | | | | | | | | |
| SalamandraTA-2B | ca | lv | 21.23 | 71.48 | 49.47 | 0.82 | 0.79 | 0.73 |
| nllb 3.3B | ca | lv | 20.56 | 70.88 | 50.07 | 0.85 | 0.78 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | lt | 19.92 | 71.02 | 50.88 | 0.87 | 0.8 | 0.81 |
| nllb 3.3B | ca | lt | 18.82 | 71.8 | 51.84 | 0.87 | 0.82 | 0.82 |
| | | | | | | | | |
| SalamandraTA-2B | ca | it | 26.76 | 60.67 | 56.3 | 0.88 | 0.85 | 0.77 |
| nllb 3.3B | ca | it | 26.42 | 61.47 | 55.66 | 0.87 | 0.86 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | hu | 22.8 | 66.41 | 53.41 | 0.86 | 0.82 | 0.85 |
| nllb 3.3B | ca | hu | 21.2 | 68.54 | 51.99 | 0.87 | 0.83 | 0.87 |
| | | | | | | | | |
| SalamandraTA-2B | ca | hr | 26.24 | 61.83 | 55.87 | 0.89 | 0.84 | 0.81 |
| nllb 3.3B | ca | hr | 24.04 | 64.25 | 53.79 | 0.89 | 0.85 | 0.82 |
| | | | | | | | | |
| nllb 3.3B | ca | gl | 32.85 | 51.69 | 59.33 | 0.87 | 0.85 | 0.72 |
| SalamandraTA-2B | ca | gl | 31.84 | 52.52 | 59.16 | 0.87 | 0.84 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | ca | ga | 25.24 | 63.36 | 53.24 | 0.78 | 0.64 | 0.62 |
| nllb 3.3B | ca | ga | 23.51 | 66.54 | 51.53 | 0.77 | 0.66 | 0.62 |
| | | | | | | | | |
| SalamandraTA-2B | ca | fr | 40.14 | 48.34 | 64.24 | 0.86 | 0.84 | 0.73 |
| nllb 3.3B | ca | fr | 39.8 | 48.96 | 63.97 | 0.86 | 0.85 | 0.74 |
| | | | | | | | | |
| nllb 3.3B | ca | fi | 18.63 | 71.42 | 52.71 | 0.89 | 0.82 | 0.82 |
| SalamandraTA-2B | ca | fi | 18.49 | 71.46 | 52.09 | 0.88 | 0.8 | 0.8 |
| | | | | | | | | |
| SalamandraTA-2B | ca | eu | 18.75 | 71.09 | 57.05 | 0.87 | 0.81 | 0.8 |
| nllb 3.3B | ca | eu | 13.15 | 77.69 | 50.35 | 0.83 | 0.75 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ca | et | 22.03 | 67.55 | 54.87 | 0.88 | 0.8 | 0.79 |
| nllb 3.3B | ca | et | 20.07 | 70.66 | 53.19 | 0.88 | 0.81 | 0.8 |
| | | | | | | | | |
| nllb 3.3B | ca | es | 25.59 | 60.39 | 53.7 | 0.86 | 0.86 | 0.74 |
| SalamandraTA-2B | ca | es | 24.46 | 61.54 | 53.02 | 0.86 | 0.86 | 0.74 |
| | | | | | | | | |
| nllb 3.3B | ca | en | 49.62 | 37.33 | 71.65 | 0.89 | 0.86 | 0.8 |
| SalamandraTA-2B | ca | en | 46.62 | 40.03 | 70.23 | 0.88 | 0.86 | 0.79 |
| | | | | | | | | |
| SalamandraTA-2B | ca | el | 23.38 | 63 | 50.03 | 0.87 | 0.84 | 0.74 |
| nllb 3.3B | ca | el | 22.62 | 63.73 | 49.5 | 0.87 | 0.84 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | ca | de | 31.89 | 57.12 | 59.07 | 0.84 | 0.83 | 0.75 |
| nllb 3.3B | ca | de | 31.19 | 57.87 | 58.47 | 0.85 | 0.84 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | ca | da | 34.69 | 53.31 | 61.11 | 0.87 | 0.82 | 0.75 |
| nllb 3.3B | ca | da | 34.32 | 54.2 | 60.2 | 0.88 | 0.83 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | cs | 25.67 | 63.37 | 53.07 | 0.89 | 0.85 | 0.79 |
| nllb 3.3B | ca | cs | 25.02 | 63.59 | 52.43 | 0.89 | 0.85 | 0.79 |
| | | | | | | | | |
| SalamandraTA-2B | ca | bg | 32.09 | 57.01 | 59.4 | 0.89 | 0.85 | 0.84 |
| nllb 3.3B | ca | bg | 31.24 | 58.41 | 58.81 | 0.89 | 0.86 | 0.85 |
</details>
<details>
<summary>Click to show full table XX-CA Flores-dev</summary>
| | source | target | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|:---------|:---------|-------:|------:|-------:|--------:|-------------:|---------:|
| SalamandraTA-2B | sv | ca | 34.21 | 53 | 59.52 | 0.86 | 0.83 | 0.74 |
| nllb 3.3B | sv | ca | 33.03 | 53.42 | 59.02 | 0.86 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | sl | ca | 28.98 | 59.95 | 56.24 | 0.85 | 0.82 | 0.72 |
| nllb 3.3B | sl | ca | 27.51 | 61.23 | 54.96 | 0.85 | 0.83 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | sk | ca | 30.61 | 58.1 | 57.53 | 0.86 | 0.81 | 0.73 |
| nllb 3.3B | sk | ca | 29.24 | 58.93 | 56.29 | 0.86 | 0.83 | 0.73 |
| | | | | | | | | |
| SalamandraTA-2B | ro | ca | 33.73 | 54.23 | 60.11 | 0.87 | 0.83 | 0.75 |
| nllb 3.3B | ro | ca | 32.9 | 54.71 | 59.56 | 0.87 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | pt | ca | 35.99 | 50.64 | 61.52 | 0.87 | 0.84 | 0.76 |
| nllb 3.3B | pt | ca | 34.63 | 51.15 | 60.68 | 0.87 | 0.84 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | pl | ca | 25.77 | 64.99 | 53.46 | 0.84 | 0.82 | 0.71 |
| nllb 3.3B | pl | ca | 24.41 | 65.69 | 52.45 | 0.85 | 0.83 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | nl | ca | 26.04 | 64.09 | 53.64 | 0.84 | 0.84 | 0.71 |
| nllb 3.3B | nl | ca | 25.35 | 64.64 | 53.15 | 0.84 | 0.85 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | mt | ca | 37.51 | 50.18 | 62.42 | 0.79 | 0.69 | 0.75 |
| nllb 3.3B | mt | ca | 36.29 | 51.01 | 61.24 | 0.79 | 0.7 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | lv | ca | 27.14 | 62.61 | 55.6 | 0.84 | 0.78 | 0.7 |
| nllb 3.3B | lv | ca | 27.02 | 61.12 | 54.28 | 0.84 | 0.79 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | lt | ca | 27.76 | 61.3 | 54.52 | 0.84 | 0.76 | 0.71 |
| nllb 3.3B | lt | ca | 26.05 | 62.75 | 53.4 | 0.84 | 0.77 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | it | ca | 28.44 | 61.09 | 57.12 | 0.87 | 0.85 | 0.74 |
| nllb 3.3B | it | ca | 27.79 | 61.42 | 56.62 | 0.87 | 0.86 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | hu | ca | 28.15 | 60.01 | 55.29 | 0.85 | 0.81 | 0.72 |
| nllb 3.3B | hu | ca | 27.06 | 60.44 | 54.38 | 0.85 | 0.83 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | hr | ca | 29.89 | 58.61 | 56.62 | 0.85 | 0.82 | 0.72 |
| nllb 3.3B | hr | ca | 28.23 | 59.55 | 55.37 | 0.86 | 0.84 | 0.73 |
| | | | | | | | | |
| nllb 3.3B | gl | ca | 34.28 | 52.34 | 60.86 | 0.87 | 0.85 | 0.76 |
| SalamandraTA-2B | gl | ca | 32.14 | 54.03 | 60.3 | 0.87 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ga | ca | 28.59 | 61.13 | 55.61 | 0.8 | 0.69 | 0.68 |
| nllb 3.3B | ga | ca | 28.09 | 61.12 | 54.55 | 0.8 | 0.7 | 0.68 |
| | | | | | | | | |
| SalamandraTA-2B | fr | ca | 34.53 | 52.9 | 60.38 | 0.87 | 0.83 | 0.76 |
| nllb 3.3B | fr | ca | 33.61 | 53.57 | 59.73 | 0.87 | 0.84 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | fi | ca | 26.71 | 62.19 | 54.09 | 0.86 | 0.8 | 0.71 |
| nllb 3.3B | fi | ca | 26.31 | 62.6 | 54.06 | 0.86 | 0.82 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | eu | ca | 27.93 | 60.26 | 55.27 | 0.87 | 0.83 | 0.73 |
| nllb 3.3B | eu | ca | 26.43 | 63.76 | 53.75 | 0.86 | 0.82 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | et | ca | 30.03 | 58.25 | 56.88 | 0.86 | 0.79 | 0.72 |
| nllb 3.3B | et | ca | 27.56 | 59.95 | 54.92 | 0.86 | 0.8 | 0.72 |
| | | | | | | | | |
| nllb 3.3B | es | ca | 25.33 | 64.23 | 55.1 | 0.86 | 0.84 | 0.73 |
| SalamandraTA-2B | es | ca | 22.95 | 67.1 | 53.67 | 0.86 | 0.84 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | en | ca | 43.55 | 42.62 | 67.03 | 0.88 | 0.85 | 0.78 |
| nllb 3.3B | en | ca | 42.21 | 43.63 | 65.95 | 0.88 | 0.85 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | el | ca | 28.52 | 60.34 | 54.99 | 0.85 | 0.83 | 0.71 |
| nllb 3.3B | el | ca | 27.36 | 60.49 | 54.76 | 0.85 | 0.85 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | de | ca | 33.07 | 54.46 | 59.06 | 0.85 | 0.84 | 0.74 |
| nllb 3.3B | de | ca | 31.43 | 56.05 | 57.95 | 0.86 | 0.85 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | da | ca | 34.6 | 53.22 | 60.43 | 0.86 | 0.83 | 0.75 |
| nllb 3.3B | da | ca | 32.71 | 54.2 | 58.9 | 0.86 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | cs | ca | 30.92 | 57.54 | 57.71 | 0.86 | 0.82 | 0.73 |
| nllb 3.3B | cs | ca | 29.02 | 58.78 | 56.44 | 0.86 | 0.83 | 0.73 |
| | | | | | | | | |
| SalamandraTA-2B | bg | ca | 31.68 | 56.32 | 58.61 | 0.85 | 0.84 | 0.73 |
| nllb 3.3B | bg | ca | 29.87 | 57.75 | 57.26 | 0.85 | 0.85 | 0.73 |
</details>
#### Flores200-devtest
| | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|-------:|------:|-------:|--------:|-------------:|---------:|
| **CA-XX** | | | | | | |
| SalamandraTA-2B | **27.09** | **61.06** | **56.41** | 0.86 | 0.81 | 0.75 |
| nllb 3.3B | 26.7 | 61.74 | 55.85 | 0.86 | **0.82** | **0.76** |
| **XX-CA** | | | | | | |
| SalamandraTA-2B | **31** | **57.46** | **57.96** | 0.85 | 0.81 | 0.73 |
| nllb 3.3B | 30.31 | 58.26 | 57.12 | 0.85 | **0.82** | 0.73 |
<details>
<summary>Click to show full table CA-XX Flores-devtest</summary>
| | source | target | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|:---------|:---------|-------:|------:|-------:|--------:|-------------:|---------:|
| nllb 3.3B | ca | sv | 32.49 | 55.11 | 59.93 | 0.88 | 0.82 | 0.79 |
| SalamandraTA-2B | ca | sv | 30.53 | 56.24 | 58.05 | 0.87 | 0.8 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | sl | 25.16 | 64.25 | 53.88 | 0.87 | 0.82 | 0.8 |
| nllb 3.3B | ca | sl | 24.64 | 66.02 | 52.71 | 0.88 | 0.82 | 0.81 |
| | | | | | | | | |
| SalamandraTA-2B | ca | sk | 25.64 | 63.03 | 53.55 | 0.88 | 0.83 | 0.79 |
| nllb 3.3B | ca | sk | 25.44 | 63.29 | 53.37 | 0.89 | 0.84 | 0.79 |
| | | | | | | | | |
| SalamandraTA-2B | ca | ro | 33.21 | 54.27 | 59.53 | 0.89 | 0.84 | 0.8 |
| nllb 3.3B | ca | ro | 31.29 | 56.44 | 58.16 | 0.89 | 0.85 | 0.8 |
| | | | | | | | | |
| SalamandraTA-2B | ca | pt | 37.9 | 48.95 | 63.15 | 0.88 | 0.84 | 0.75 |
| nllb 3.3B | ca | pt | 37.31 | 49.31 | 62.7 | 0.88 | 0.85 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ca | pl | 18.62 | 71.88 | 48.44 | 0.88 | 0.83 | 0.77 |
| nllb 3.3B | ca | pl | 18.01 | 72.23 | 48.26 | 0.88 | 0.83 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | nl | 23.4 | 65.66 | 54.55 | 0.85 | 0.84 | 0.74 |
| nllb 3.3B | ca | nl | 22.99 | 66.68 | 53.95 | 0.85 | 0.84 | 0.75 |
| | | | | | | | | |
| nllb 3.3B | ca | mt | 24.78 | 59.97 | 59.58 | 0.68 | 0.62 | 0.36 |
| SalamandraTA-2B | ca | mt | 24.35 | 60.1 | 60.51 | 0.69 | 0.6 | 0.4 |
| | | | | | | | | |
| SalamandraTA-2B | ca | lv | 20.55 | 71.85 | 50.24 | 0.82 | 0.78 | 0.74 |
| nllb 3.3B | ca | lv | 20.16 | 70.37 | 50.3 | 0.85 | 0.78 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | ca | lt | 20.37 | 70.15 | 51.61 | 0.88 | 0.79 | 0.82 |
| nllb 3.3B | ca | lt | 19.95 | 70.47 | 52.49 | 0.88 | 0.81 | 0.81 |
| | | | | | | | | |
| SalamandraTA-2B | ca | it | 27.18 | 60.37 | 56.65 | 0.88 | 0.85 | 0.77 |
| nllb 3.3B | ca | it | 26.83 | 60.96 | 56.33 | 0.88 | 0.85 | 0.77 |
| | | | | | | | | |
| SalamandraTA-2B | ca | hu | 21.76 | 66.96 | 53.45 | 0.86 | 0.81 | 0.85 |
| nllb 3.3B | ca | hu | 20.54 | 68.28 | 52.2 | 0.87 | 0.82 | 0.87 |
| | | | | | | | | |
| SalamandraTA-2B | ca | hr | 25.41 | 62.55 | 55.65 | 0.89 | 0.84 | 0.81 |
| nllb 3.3B | ca | hr | 24.01 | 64.39 | 53.95 | 0.89 | 0.84 | 0.82 |
| | | | | | | | | |
| nllb 3.3B | ca | gl | 32.33 | 52.64 | 59.3 | 0.87 | 0.85 | 0.71 |
| SalamandraTA-2B | ca | gl | 31.97 | 52.76 | 59.48 | 0.87 | 0.84 | 0.7 |
| | | | | | | | | |
| SalamandraTA-2B | ca | ga | 23.19 | 66.3 | 51.99 | 0.77 | 0.64 | 0.6 |
| nllb 3.3B | ca | ga | 22.38 | 67.76 | 50.92 | 0.77 | 0.66 | 0.6 |
| | | | | | | | | |
| nllb 3.3B | ca | fr | 40.82 | 47.72 | 64.82 | 0.86 | 0.85 | 0.74 |
| SalamandraTA-2B | ca | fr | 40.35 | 47.79 | 64.56 | 0.86 | 0.84 | 0.73 |
| | | | | | | | | |
| nllb 3.3B | ca | fi | 18.93 | 70.8 | 53.03 | 0.89 | 0.81 | 0.82 |
| SalamandraTA-2B | ca | fi | 18.92 | 70.69 | 52.85 | 0.88 | 0.8 | 0.8 |
| | | | | | | | | |
| SalamandraTA-2B | ca | eu | 18.33 | 72 | 56.65 | 0.86 | 0.81 | 0.79 |
| nllb 3.3B | ca | eu | 12.79 | 78.69 | 50.19 | 0.83 | 0.75 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ca | et | 21.45 | 67.08 | 55.01 | 0.88 | 0.8 | 0.79 |
| nllb 3.3B | ca | et | 19.84 | 70.08 | 53.48 | 0.88 | 0.8 | 0.79 |
| | | | | | | | | |
| nllb 3.3B | ca | es | 25.87 | 59.66 | 54.06 | 0.86 | 0.86 | 0.74 |
| SalamandraTA-2B | ca | es | 24.73 | 60.79 | 53.48 | 0.86 | 0.86 | 0.73 |
| | | | | | | | | |
| nllb 3.3B | ca | en | 48.41 | 38.1 | 71.29 | 0.89 | 0.86 | 0.8 |
| SalamandraTA-2B | ca | en | 45.19 | 41.18 | 69.46 | 0.88 | 0.85 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | ca | el | 22.78 | 63.17 | 49.97 | 0.87 | 0.83 | 0.73 |
| nllb 3.3B | ca | el | 22.59 | 63.8 | 49.33 | 0.87 | 0.83 | 0.73 |
| | | | | | | | | |
| SalamandraTA-2B | ca | de | 31.31 | 57.16 | 59.42 | 0.85 | 0.83 | 0.75 |
| nllb 3.3B | ca | de | 31.25 | 57.87 | 59.05 | 0.85 | 0.83 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ca | da | 34.83 | 53.16 | 61.44 | 0.88 | 0.82 | 0.75 |
| nllb 3.3B | ca | da | 34.43 | 53.82 | 60.73 | 0.88 | 0.83 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | ca | cs | 24.98 | 63.45 | 53.11 | 0.89 | 0.84 | 0.77 |
| nllb 3.3B | ca | cs | 24.73 | 63.94 | 52.66 | 0.89 | 0.85 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | ca | bg | 32.25 | 55.76 | 59.85 | 0.89 | 0.85 | 0.84 |
| nllb 3.3B | ca | bg | 31.45 | 56.93 | 59.29 | 0.89 | 0.85 | 0.85 |
</details>
<details>
<summary>Click to show full table XX-CA Flores-devtest</summary>
| | source | target | Bleu ↑ | Ter ↓ | ChrF ↑ | Comet ↑ | Comet-kiwi ↑ | Bleurt ↑ |
|:-----------------------|:---------|:---------|-------:|------:|-------:|--------:|-------------:|---------:|
| SalamandraTA-2B | sv | ca | 34.4 | 52.6 | 59.96 | 0.86 | 0.82 | 0.73 |
| nllb 3.3B | sv | ca | 33.4 | 53.19 | 59.29 | 0.86 | 0.83 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | sl | ca | 29.12 | 59.26 | 56.56 | 0.85 | 0.8 | 0.71 |
| nllb 3.3B | sl | ca | 28.23 | 60.61 | 55.34 | 0.85 | 0.82 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | sk | ca | 30.71 | 57.99 | 57.81 | 0.85 | 0.8 | 0.72 |
| nllb 3.3B | sk | ca | 29.79 | 58.99 | 56.61 | 0.85 | 0.82 | 0.73 |
| | | | | | | | | |
| SalamandraTA-2B | ro | ca | 34.79 | 53.37 | 61.22 | 0.87 | 0.83 | 0.75 |
| nllb 3.3B | ro | ca | 33.53 | 54.36 | 60.18 | 0.87 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | pt | ca | 36.72 | 50.64 | 62.08 | 0.87 | 0.84 | 0.76 |
| nllb 3.3B | pt | ca | 36.11 | 50.96 | 61.33 | 0.87 | 0.84 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | pl | ca | 25.62 | 64.15 | 53.55 | 0.85 | 0.81 | 0.71 |
| nllb 3.3B | pl | ca | 25.14 | 64.43 | 53.09 | 0.85 | 0.83 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | nl | ca | 26.17 | 63.88 | 54.01 | 0.84 | 0.83 | 0.7 |
| nllb 3.3B | nl | ca | 25.61 | 64.26 | 53.43 | 0.84 | 0.85 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | mt | ca | 36.97 | 50.43 | 62.69 | 0.79 | 0.68 | 0.75 |
| nllb 3.3B | mt | ca | 36.03 | 51.51 | 61.46 | 0.79 | 0.69 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | lv | ca | 27.81 | 61.96 | 56.12 | 0.84 | 0.77 | 0.7 |
| nllb 3.3B | lv | ca | 26.83 | 63.33 | 53.93 | 0.84 | 0.78 | 0.7 |
| | | | | | | | | |
| SalamandraTA-2B | lt | ca | 27.29 | 61.15 | 54.14 | 0.84 | 0.75 | 0.7 |
| nllb 3.3B | lt | ca | 26.13 | 62.2 | 53.17 | 0.84 | 0.77 | 0.7 |
| | | | | | | | | |
| SalamandraTA-2B | it | ca | 29.12 | 60.95 | 57.85 | 0.87 | 0.85 | 0.74 |
| nllb 3.3B | it | ca | 28.06 | 61.81 | 57.06 | 0.87 | 0.85 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | hu | ca | 28.21 | 60.54 | 55.38 | 0.85 | 0.81 | 0.71 |
| nllb 3.3B | hu | ca | 27.58 | 60.77 | 54.76 | 0.85 | 0.83 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | hr | ca | 30.13 | 57.59 | 57.25 | 0.86 | 0.81 | 0.72 |
| nllb 3.3B | hr | ca | 29.15 | 62.59 | 56.04 | 0.86 | 0.83 | 0.72 |
| | | | | | | | | |
| nllb 3.3B | gl | ca | 34.23 | 53.25 | 61.28 | 0.88 | 0.85 | 0.76 |
| SalamandraTA-2B | gl | ca | 32.09 | 54.77 | 60.42 | 0.87 | 0.84 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | ga | ca | 28.11 | 62.93 | 55.28 | 0.8 | 0.68 | 0.67 |
| nllb 3.3B | ga | ca | 27.73 | 62.91 | 53.93 | 0.79 | 0.69 | 0.66 |
| | | | | | | | | |
| SalamandraTA-2B | fr | ca | 35.87 | 52.28 | 61.2 | 0.87 | 0.83 | 0.75 |
| nllb 3.3B | fr | ca | 34.42 | 53.05 | 60.31 | 0.87 | 0.84 | 0.76 |
| | | | | | | | | |
| SalamandraTA-2B | fi | ca | 27.35 | 61.33 | 54.95 | 0.86 | 0.8 | 0.7 |
| nllb 3.3B | fi | ca | 27.04 | 62.35 | 54.48 | 0.86 | 0.81 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | eu | ca | 28.02 | 60.45 | 55.44 | 0.87 | 0.82 | 0.73 |
| nllb 3.3B | eu | ca | 26.68 | 62.62 | 54.22 | 0.86 | 0.82 | 0.71 |
| | | | | | | | | |
| SalamandraTA-2B | et | ca | 29.84 | 58.79 | 56.74 | 0.86 | 0.78 | 0.72 |
| nllb 3.3B | et | ca | 28.43 | 60.01 | 55.48 | 0.86 | 0.79 | 0.72 |
| | | | | | | | | |
| nllb 3.3B | es | ca | 25.64 | 64.21 | 55.18 | 0.87 | 0.85 | 0.73 |
| SalamandraTA-2B | es | ca | 23.47 | 66.71 | 54.05 | 0.86 | 0.84 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | en | ca | 43.98 | 42.35 | 67.3 | 0.87 | 0.85 | 0.77 |
| nllb 3.3B | en | ca | 43.24 | 43.37 | 66.58 | 0.88 | 0.85 | 0.78 |
| | | | | | | | | |
| SalamandraTA-2B | el | ca | 28.91 | 59.86 | 55.26 | 0.85 | 0.83 | 0.71 |
| nllb 3.3B | el | ca | 28.46 | 60.28 | 55.13 | 0.85 | 0.84 | 0.72 |
| | | | | | | | | |
| SalamandraTA-2B | de | ca | 33.71 | 54.06 | 59.79 | 0.86 | 0.83 | 0.74 |
| nllb 3.3B | de | ca | 32.71 | 54.91 | 58.91 | 0.86 | 0.84 | 0.74 |
| | | | | | | | | |
| SalamandraTA-2B | da | ca | 35.14 | 52.51 | 60.81 | 0.86 | 0.82 | 0.74 |
| nllb 3.3B | da | ca | 34.03 | 53.41 | 59.46 | 0.86 | 0.83 | 0.75 |
| | | | | | | | | |
| SalamandraTA-2B | cs | ca | 31.12 | 56.71 | 58.22 | 0.86 | 0.81 | 0.73 |
| nllb 3.3B | cs | ca | 29.26 | 58.38 | 56.53 | 0.86 | 0.82 | 0.73 |
| | | | | | | | | |
| SalamandraTA-2B | bg | ca | 31.33 | 56.72 | 58.75 | 0.85 | 0.84 | 0.73 |
| nllb 3.3B | bg | ca | 30.5 | 57.03 | 57.92 | 0.85 | 0.85 | 0.73 |
</details>
## Evaluation Aranese, Aragonese, Asturian
Using [MT Lens](https://github.com/langtech-bsc/mt-evaluation), we evaluate Spanish-Asturian (ast), Spanish-Aragonese (an), and Spanish-Aranese (arn) with BLEU and ChrF scores on the [Flores+ dev](https://github.com/openlanguagedata/flores) evaluation dataset. We also report BLEU and ChrF scores for the Catalan directions.
### Asturian Flores+ dev
Below are the evaluation results compared to [Apertium](https://www.apertium.org/), [Eslema](https://eslema.it.uniovi.es/) and NLLB ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)).
| | source | target | Bleu | ChrF |
|:-----------------------|:---------|:---------|------:|-------:|
| nllb 3.3B | es | ast | **18.78** | 50.5 |
| Eslema | es | ast | 17.30 | **50.77** |
| nllb 600M | es | ast | 17.23 | 49.72 |
| SalamandraTA-2B | es | ast | 17.11 | 49.49 |
| Apertium | es | ast | 16.66 | 50.57 |
| | | | | |
| nllb 3.3B | ca | ast | **25.87** | 54.9 |
| SalamandraTA-2B | ca | ast | 25.17 | **55.17** |
### Aragonese Flores+ dev
Below are the evaluation results compared to [Apertium](https://www.apertium.org/), [Softcatalà](https://www.softcatala.org/traductor/) and [Traduze](https://traduze.aragon.es).
| | source | target | Bleu | ChrF |
|:-----------------------|:---------|:---------|-------:|-------:|
| Apertium | es | an | **65.34** | **82.00** |
| Softcatalà | es | an | 50.21 | 73.97 |
| SalamandraTA-2B | es | an | 49.13 | 74.22 |
| Traduze | es | an | 37.43 | 69.51 |
| | | | | |
| SalamandraTA-2B | ca | an | 17.06 | 49.12 |
### Aranese Flores+ dev
Below are the evaluation results compared to [Apertium](https://www.apertium.org/) and [Softcatalà](https://www.softcatala.org/traductor/).
| | source | target | Bleu | ChrF |
|:-----------------------|:---------|:---------|-------:|-------:|
| Apertium | es | arn | **48.96** | **72.63** |
| Softcatalà | es | arn | 34.43 | 58.61 |
| SalamandraTA-2B | es | arn | 34.35 | 57.78 |
| | | | | |
| SalamandraTA-2B | ca | arn | 21.95 | 48.67 |
## Ethical Considerations and Limitations
Detailed information on the work done to examine the presence of unwanted social and cognitive biases in the base model can be found
at [Salamandra-2B model card](https://huggingface.co/BSC-LT/salamandra-2b).
With regard to MT models, no specific analysis has yet been carried out to evaluate potential biases or limitations in translation
accuracy across different languages, dialects, or domains. However, we recognize the importance of identifying and addressing any harmful stereotypes,
cultural inaccuracies, or systematic performance discrepancies that may arise in Machine Translation. As such, we plan to perform more analyses as soon
as we have implemented the necessary metrics and methods within our evaluation framework [MT Lens](https://github.com/langtech-bsc/mt-evaluation).
## Additional information
### Author
The Language Technologies Unit from Barcelona Supercomputing Center.
### Contact
For further information, please send an email to <[email protected]>.
### Copyright
Copyright(c) 2024 by Language Technologies Unit, Barcelona Supercomputing Center.
### Funding
This work has been promoted and financed by the Government of Catalonia through the [Aina Project](https://projecteaina.cat/).
This work is funded by the _Ministerio para la Transformación Digital y de la Función Pública_ - Funded by EU – NextGenerationEU
within the framework of [ILENIA Project](https://proyectoilenia.es/) with reference 2022/TL22/00215337.
### Disclaimer
Be aware that the model may contain biases or other unintended distortions.
When third parties deploy systems or provide services based on this model, or use the model themselves,
they bear the responsibility for mitigating any associated risks and ensuring compliance with applicable regulations,
including those governing the use of Artificial Intelligence.
The Barcelona Supercomputing Center, as the owner and creator of the model, shall not be held liable for any outcomes resulting from third-party use.
### License
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) | [
"TRANSLATION"
] | [
"BEAR"
] |
BSC-LT/ALIA-40b | BSC-LT | text-generation | [
"transformers",
"safetensors",
"llama",
"text-generation",
"bg",
"ca",
"code",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"eu",
"fi",
"fr",
"ga",
"gl",
"hr",
"hu",
"it",
"lt",
"lv",
"mt",
"nl",
"nn",
"oc",
"pl",
"pt",
"ro",
"ru",
"sh",
"sk",
"sl",
"sr",
"sv",
"uk",
"dataset:oscar-corpus/colossal-oscar-1.0",
"dataset:HuggingFaceFW/fineweb-edu",
"dataset:joelniklaus/eurlex_resources",
"dataset:joelniklaus/legal-mc4",
"dataset:projecte-aina/CATalog",
"dataset:UFRGS/brwac",
"dataset:community-datasets/hrwac",
"dataset:danish-foundation-models/danish-gigaword",
"dataset:HiTZ/euscrawl",
"dataset:PleIAs/French-PD-Newspapers",
"dataset:PleIAs/French-PD-Books",
"dataset:AI-team-UoA/greek_legal_code",
"dataset:HiTZ/latxa-corpus-v1.1",
"dataset:allenai/peS2o",
"dataset:pile-of-law/pile-of-law",
"dataset:PORTULAN/parlamento-pt",
"dataset:hoskinson-center/proof-pile",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigcode/starcoderdata",
"dataset:bjoernp/tagesschau-2018-2023",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2403.14009",
"arxiv:2403.20266",
"arxiv:2101.00027",
"arxiv:2207.00220",
"arxiv:1810.06694",
"arxiv:1911.05507",
"arxiv:1906.03741",
"arxiv:2406.17557",
"arxiv:2402.06619",
"arxiv:1803.09010",
"arxiv:2502.08489",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:eu"
] | 2024-12-09T14:04:29 | 2025-02-13T10:44:39 | 1,561 | 73 | ---
datasets:
- oscar-corpus/colossal-oscar-1.0
- HuggingFaceFW/fineweb-edu
- joelniklaus/eurlex_resources
- joelniklaus/legal-mc4
- projecte-aina/CATalog
- UFRGS/brwac
- community-datasets/hrwac
- danish-foundation-models/danish-gigaword
- HiTZ/euscrawl
- PleIAs/French-PD-Newspapers
- PleIAs/French-PD-Books
- AI-team-UoA/greek_legal_code
- HiTZ/latxa-corpus-v1.1
- allenai/peS2o
- pile-of-law/pile-of-law
- PORTULAN/parlamento-pt
- hoskinson-center/proof-pile
- togethercomputer/RedPajama-Data-1T
- bigcode/starcoderdata
- bjoernp/tagesschau-2018-2023
- EleutherAI/the_pile_deduplicated
language:
- bg
- ca
- code
- cs
- cy
- da
- de
- el
- en
- es
- et
- eu
- fi
- fr
- ga
- gl
- hr
- hu
- it
- lt
- lv
- mt
- nl
- nn
- 'no'
- oc
- pl
- pt
- ro
- ru
- sh
- sk
- sl
- sr
- sv
- uk
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---

> [!WARNING]
> **WARNING:** This is an intermediate checkpoint, as training is still ongoing.
>
> The weights will be promptly updated as soon as the training process is complete.
# ALIA-40b Model Card
ALIA-40b is a highly multilingual model pre-trained from scratch that will come with its respective base and instruction-tuned variants. This model card corresponds to the 40B base version.
To visit the model cards of other model versions, please refer to the [Model Index](#model-index).
This model is released under a permissive [Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0).
Along with the open weights, all training scripts and configuration files are made publicly available in [this GitHub repository](https://github.com/langtech-bsc/alia).
---
## Model Details
### Description
Transformer-based decoder-only language model that has been pre-trained from scratch on 6.9 trillion tokens of highly curated data.
The pre-training corpus contains text in 35 European languages and code.
### Hyperparameters
The full list of hyperparameters can be found [here](https://github.com/langtech-bsc/alia/blob/main/configs/bsc_40b.yaml).
### Architecture
| | |
|-------------------------|:--------------|
| Total Parameters | 40,433,885,184|
| Embedding Parameters | 2,097,152,000 |
| Layers | 48 |
| Hidden size | 8,192 |
| Attention heads | 64 |
| Context length | 4,096 |
| Vocabulary size | 256,000 |
| Precision | bfloat16 |
| Embedding type | RoPE |
| Activation Function | SwiGLU |
| Layer normalization | RMS Norm |
| Flash attention | ✅ |
| Grouped Query Attention | ✅ |
| Num. query groups | 8 |
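These values can be cross-checked against the configuration published on the Hub. The snippet below is a small sketch that assumes the standard Llama-style config fields exposed by `transformers` (the model is tagged with the `llama` architecture); it only downloads the config file, not the weights.
```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("BSC-LT/ALIA-40b")

print(config.num_hidden_layers)        # 48 layers
print(config.hidden_size)              # 8,192 hidden size
print(config.num_attention_heads)      # 64 attention heads
print(config.num_key_value_heads)      # 8 query groups (GQA)
print(config.max_position_embeddings)  # 4,096 context length
print(config.vocab_size)               # 256,000 vocabulary entries

# Embedding parameters = vocabulary size x hidden size:
# 256,000 * 8,192 = 2,097,152,000, which matches the table above.
print(config.vocab_size * config.hidden_size)
```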
---
## Intended Use
### Direct Use
The models are intended for both research and commercial use in any of the languages included in the training data.
The base models are intended either for language generation or to be further fine-tuned for specific use-cases.
The instruction-tuned variants can be used as general-purpose assistants, as long as the user is fully aware of the model’s limitations.
### Out-of-scope Use
The model is not intended for malicious activities, such as harming others or violating human rights.
Any downstream application must comply with current laws and regulations.
Irresponsible usage in production environments without proper risk assessment and mitigation is also discouraged.
---
## Hardware and Software
### Training Framework
Pre-training was conducted using NVIDIA’s [NeMo Framework](https://docs.nvidia.com/nemo-framework/index.html),
which leverages PyTorch Lightning for efficient model training in highly distributed settings.
The instruction-tuned versions were produced with [FastChat](https://github.com/lm-sys/FastChat).
### Compute Infrastructure
All models were trained on [MareNostrum 5](https://www.bsc.es/ca/marenostrum/marenostrum-5), a pre-exascale EuroHPC supercomputer hosted and
operated by Barcelona Supercomputing Center.
The accelerated partition is composed of 1,120 nodes with the following specifications:
- 4x Nvidia Hopper GPUs with 64GB HBM2 memory
- 2x Intel Sapphire Rapids 8460Y+ at 2.3 GHz, 32 cores each (64 cores per node)
- 4x NDR200 (BW per node 800Gb/s)
- 512 GB of Main memory (DDR5)
- 460GB on NVMe storage
|Model|Nodes|GPUs|
|:---:|:---:|:---:|
|2B|64|256|
|7B|128|512|
|40B|256 / 512|1,024 / 2,048|
---
## How to use
This section offers examples of how to perform inference using various methods.
### Inference
You'll find different techniques for running inference, including Huggingface's Text Generation Pipeline, multi-GPU configurations, and vLLM for scalable and efficient generation.
#### Inference with Huggingface's Text Generation Pipeline
The Huggingface Text Generation Pipeline provides a straightforward way to run inference using the ALIA-40b model.
```bash
pip install transformers torch accelerate sentencepiece protobuf
```
<details>
<summary>Show code</summary>
```python
from transformers import pipeline, set_seed
model_id = "BSC-LT/ALIA-40b"
# Sample prompts
prompts = [
"Las fiestas de San Isidro Labrador de Yecla son",
"El punt més alt del Parc Natural del Montseny és",
"Sentence in English: The typical chance of such a storm is around 10%. Sentence in Catalan:",
"Si le monde était clair",
"The future of AI is",
]
# Create the pipeline
generator = pipeline("text-generation", model_id, device_map="auto")
generation_args = {
"temperature": 0.1,
"top_p": 0.95,
"max_new_tokens": 25,
"repetition_penalty": 1.2,
"do_sample": True
}
# Fix the seed
set_seed(1)
# Generate texts
outputs = generator(prompts, **generation_args)
# Print outputs
for output in outputs:
print(output[0]["generated_text"])
```
</details>
#### Inference with single / multi GPU
This section provides a simple example of how to run inference using Huggingface's AutoModel class.
```bash
pip install transformers torch accelerate sentencepiece protobuf
```
<details>
<summary>Show code</summary>
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
model_id = "BSC-LT/ALIA-40b"
# Input text
text = "El mercat del barri és"
# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load the model
model = AutoModelForCausalLM.from_pretrained(
model_id,
device_map="auto",
torch_dtype=torch.bfloat16
)
generation_args = {
"temperature": 0.1,
"top_p": 0.95,
"max_new_tokens": 25,
"repetition_penalty": 1.2,
"do_sample": True
}
inputs = tokenizer(text, return_tensors="pt")
# Generate texts
output = model.generate(input_ids=inputs["input_ids"].to(model.device), attention_mask=inputs["attention_mask"], **generation_args)
# Print outputs
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
</details>
#### Inference with vLLM
vLLM is an efficient library for inference that enables faster and more scalable text generation.
```bash
pip install vllm
```
<details>
<summary>Show code</summary>
```python
from vllm import LLM, SamplingParams
model_id = "BSC-LT/ALIA-40b"
# Sample prompts
prompts = [
"Las fiestas de San Isidro Labrador de Yecla son",
"El punt més alt del Parc Natural del Montseny és",
"Sentence in English: The typical chance of such a storm is around 10%. Sentence in Catalan:",
"Si le monde était clair",
"The future of AI is",
]
# Create a sampling params object
sampling_params = SamplingParams(
temperature=0.1,
top_p=0.95,
seed=1,
max_tokens=25,
repetition_penalty=1.2)
# Create an LLM
llm = LLM(model=model_id, tensor_parallel_size=4)
# Generate texts
outputs = llm.generate(prompts, sampling_params)
# Print outputs
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
</details>
---
## Data
### Pretraining Data
The pre-training corpus comprises data from 35 European languages and 92 programming languages, with detailed data sources provided below.
The initial 1.5 training epochs used 2.4 trillion tokens, obtained by manually adjusting the data proportions to balance the representation
and give more weight to Spain’s co-official languages (Spanish, Catalan, Galician, and Basque). Specifically, code and English data were downsampled to half,
the co-official languages were oversampled by a factor of 2, and the remaining languages were kept in their original proportions.
During the following epochs (still training), the Colossal OSCAR dataset was replaced with the FineWeb-Edu dataset.
This adjustment resulted in a total of 2.68 trillion tokens, distributed as outlined below:

The pretraining corpus is predominantly composed of data from Colossal OSCAR, which contributes a significant 53.05% of the total tokens.
Following this, Starcoder provides 13.67%, and FineWeb-Edu (350B tokens subset) adds 10.24%. The next largest sources are HPLT at 4.21% and French-PD at 3.59%.
Other notable contributions include MaCoCu, Legal-ES, and EurLex, each contributing between 1.41% and 1.72%.
These major sources collectively form the bulk of the corpus, ensuring a rich and diverse dataset for training the language model.
The remaining 10% comes from smaller sources in various languages.
Feel free to click the expand button below to see the full list of sources.
<details>
<summary>Data Sources</summary>
| Dataset | Language | Source |
|---|---|---|
| Colossal OSCAR 1.0 | bg, ca, cs, cy, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, oc, pl, pt, ro, ru, sh, sk, sl, sr, sv, uk | Brack et al., 2024 |
| Aya Dataset (w/o Evaluation Suite) | eu, hr, nl, fi, ka, hu, lt, nn, ro, sk, lv, cy, bg, cs, en, fr, de, ga, mt, pl, ru, sl, sv, ca, da, et, gl, el, it, no, pt, sr, es, uk | Singh et al., 2024 |
| Wikimedia dumps | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, pl, pt, ro, sh, sk, sl, sr, uk | [Link](https://dumps.wikimedia.org/) |
| OpenSubtitles v2016 | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, gl, hr, it, lt, lv, nl, no, pl, pt, ro, sk, sl, sr, sv, uk | Lison & Tiedemann, 2016 |
| EurLEX-Resources | bg, cs, da, de, el, en, es, et, fi, fr, ga, hr, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelniklaus/eurlex_resources) |
| MC4-Legal | bg, cs, da, de, el, en, es, et, fi, fr, ga, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelito/legal-mc4) |
| Parlamint | at, bg, cz, dk, ee, es, es-ga, fi, fr, gb, gr, hr, hu, it, lv, nl, no, pl, pt, rs, se, si | Erjavec et al., 2021 |
| MaCoCu | bg, ca, el, hr, mt, sl, sr, uk | Bañón et al., 2022 |
| CURLICAT | bg, hr, hu, pl, ro, sk, sl | Váradi et al., 2022 |
| Norwegian Colossal Corpus (NCC) | nn, no | Kummervold et al., 2021 |
| Academic Slovene KAS 2.0 | sl | Žagar et al., 2022 |
| BIGPATENT | en | Sharma et al., 2019 |
| Biomedical-ES | es | Internally generated biomedical dataset: Wikipedia LS, Pubmed, MeSpEn, patents, clinical cases, medical crawler |
| Brazilian Portuguese Web as Corpus (BrWaC) | pt | Wagner Filho et al., 2018 |
| Bulgarian National Corpus (BulNC) | bg | [Link](http://old.dcl.bas.bg/dataset/BulNC.7z) |
| CaBeRnet | fr | Popa-Fabre et al., 2020 |
| CATalog 1.0 | ca | Palomar-Giner et al., 2024 |
| CorpusNÓS | gl | de-Dios-Flores et al., 2024 |
| Croatian Web as Corpus 2.1 (hrWaC) | hr | Ljubešić & Klubička, 2014 |
| DaNewsroom | da | Varab & Schluter, 2020 |
| Danish GigaWord | da | Strømberg-Derczynski et al., 2021 |
| DK-CLARIN Reference Corpus of General Danish | da | [Link](https://korpus.dsl.dk/clarin/) |
| Estonian National Corpus 2021 (ENC) | et | Koppel & Kallas, 2022 |
| Estonian Reference Corpus (ERC) | et | [Link](https://www.cl.ut.ee/korpused/segakorpus/) |
| EusCrawl (w/o Wikipedia or NC-licenses) | eu | Artetxe et al., 2022 |
| FineWeb-Edu (350BT subset) | en | Penedo et al., 2024 |
| French Public Domain Books (French-PD) | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Books) |
| French Public Domain Newspapers (French-PD) | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Newspapers) |
| German Web as Corpus (DeWaC) | de | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:dewac) |
| Greek Legal Code (GLC) | el | Papaloukas et al., 2021 |
| Greek Web Corpus (GWC) | el | Outsios et al., 2018 |
| HPLT v1 - Spanish | es | de Gibert et al., 2024 |
| HPLT v1.1 - Spanish | es | de Gibert et al., 2024 |
| Irish Universal Dependencies (Ga-UD) | ga | [Link](https://universaldependencies.org/ga/index.html) |
| Italian Web as Corpus (ItWaC) | it | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:itwac) |
| Korpus Malti | mt | Micallef et al., 2022 |
| Korpus slovenských právnych predpisov v1.9 (SK-Laws) | sk | [Link](https://www.juls.savba.sk/data/marcell/legal-sk-20220322-1.9.ver.xz) |
| Latxa Corpus v1.1 (GAITU) | eu | Etxaniz et al., 2024 [Link](https://huggingface.co/datasets/HiTZ/latxa-corpus-v1.1) |
| Laws and legal acts of Ukraine (UK-Laws) | uk | [Link](https://lang.org.ua/en/corpora/#anchor7) |
| Legal-ES | es | Internally generated legal dataset: BOE, BORME, Senado, Congreso, Spanish court orders, DOGC |
| MARCELL Romanian legislative subcorpus v2 | ro | [Link](https://elrc-share.eu/repository/browse/marcell-romanian-legislative-subcorpus-v2/2da548428b9d11eb9c1a00155d026706ce94a6b59ffc4b0e9fb5cd9cebe6889e/) |
| Math AMPS | en | Hendrycks et al., 2021 |
| NKJP National Corpus of Polish v1.2 (NKJP) | pl | Lewandowska-Tomaszczyk et al., 2013 |
| Occitan Corpus (IEA-AALO) | oc | Provided by [IEA](https://www.institutestudisaranesi.cat/) |
| Open Legal Data - German court decisions and laws | de | Ostendorff et al., 2020 |
| ParlamentoPT | pt | Rodrigues et al., 2023 |
| peS2o | en | Soldaini & Lo, 2023 |
| PG-19 | en | Rae et al., 2019 |
| Pile of Law (selected subsets) | en | Henderson* et al., 2022 |
| Polish Parliamentary Corpus (PPC) | pl | Ogrodniczuk, 2018 |
| Proof Pile | en | [Link](https://huggingface.co/datasets/hoskinson-center/proof-pile) |
| RedPajama-Data T1 (StackExchange subset) | en | Computer, 2023 |
| Scientific-ES | es | Internally generated scientific dataset: Dialnet, Scielo, CSIC, TDX, BSC, UCM |
| SK Court Decisions v2.0 (OD-Justice) | sk | [Link](https://www.juls.savba.sk/data/od-justice/od-justice-2.0.ver.xz) |
| Slovene Web as Corpus (slWaC) | sl | Erjavec et al., 2015 |
| SoNaR Corpus NC 1.2 | nl | [Link](https://taalmaterialen.ivdnt.org/download/tstc-sonar-corpus/) |
| Spanish Legal Domain Corpora (Spanish-Legal) | es | Gutiérrez-Fandiño et al., 2021 |
| SrpKorSubset: news, legal, academic, conversation, literary (SrpKor) | sr | [Link](http://www.korpus.matf.bg.ac.rs/) |
| Starcoder | code | Li et al., 2023 |
| State-related content from the Latvian Web (State-Latvian-Web) | lv | [Link](https://catalog.elra.info/en-us/repository/browse/ELRA-W0169/) |
| SYN v9: large corpus of written Czech | cs | Křen et al., 2021 |
| Tagesschau Archive Article | de | [Link](https://huggingface.co/datasets/bjoernp/tagesschau-2018-2023) |
| The Danish Parliament Corpus 2009 - 2017, v1 | da | Hansen, 2018 |
| The Gaois bilingual corpus of English-Irish legislation (Ga-Legislation) | ga | [Link](https://portulanclarin.net/repository/browse/the-gaois-bilingual-corpus-of-english-irish-legislation-processed/daeac17c9e3511ea9b7f02420a000407b83de243dc0b469aab41084386c5b80f/) |
| The Pile (PhilPapers) | en | Gao et al., 2021 |
| The Swedish Culturomics Gigaword Corpus (Swedish-Gigaword) | sv | Rødven-Eide, 2016 |
| Welsh-GOV | cy | Crawling from [Link](https://www.llyw.cymru) |
| Yle Finnish News Archive (Yle-News) | fi | [Link](http://urn.fi/urn:nbn:fi:lb-2021050401) |
To consult the data summary document with the respective licences, please send an e-mail to [email protected].
<details>
<summary>References</summary>
- Abadji, J., Suárez, P. J. O., Romary, L., & Sagot, B. (2021). Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus (H. Lüngen, M. Kupietz, P. Bański, A. Barbaresi, S. Clematide, & I. Pisetta, Eds.; pp. 1–9). Leibniz-Institut für Deutsche Sprache. [Link](https://doi.org/10.14618/ids-pub-10468)
- Artetxe, M., Aldabe, I., Agerri, R., Perez-de-Viñaspre, O., & Soroa, A. (2022). Does Corpus Quality Really Matter for Low-Resource Languages?
- Bañón, M., Esplà-Gomis, M., Forcada, M. L., García-Romero, C., Kuzman, T., Ljubešić, N., van Noord, R., Sempere, L. P., Ramírez-Sánchez, G., Rupnik, P., Suchomel, V., Toral, A., van der Werff, T., & Zaragoza, J. (2022). MaCoCu: Massive collection and curation of monolingual and bilingual data: Focus on under-resourced languages. Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, 303–304. [Link](https://aclanthology.org/2022.eamt-1.41)
- Brack, M., Ostendorff, M., Suarez, P. O., Saiz, J. J., Castilla, I. L., Palomar-Giner, J., Shvets, A., Schramowski, P., Rehm, G., Villegas, M., & Kersting, K. (2024). Community OSCAR: A Community Effort for Multilingual Web Data. [Link](https://occiglot.eu/papers/Community_Oscar.pdf)
- Computer, T. (2023). RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset [Computer software]. [Link](https://github.com/togethercomputer/RedPajama-Data)
- de Gibert, O., Nail, G., Arefyev, N., Bañón, M., van der Linde, J., Ji, S., Zaragoza-Bernabeu, J., Aulamo, M., Ramírez-Sánchez, G., Kutuzov, A., Pyysalo, S., Oepen, S., & Tiedemann, J. (2024). A New Massive Multilingual Dataset for High-Performance Language Technologies (arXiv:2403.14009). arXiv. [Link](http://arxiv.org/abs/2403.14009)
- Dodge, J., Sap, M., Marasović, A., Agnew, W., Ilharco, G., Groeneveld, D., Mitchell, M., & Gardner, M. (2021). Documenting Large Webtext Corpora: A Case Study on the Colossal Clean Crawled Corpus. In M.-F. Moens, X. Huang, L. Specia, & S. W. Yih (Eds.), Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (pp. 1286–1305). Association for Computational Linguistics. [Link](https://doi.org/10.18653/v1/2021.emnlp-main.98)
- Erjavec, T., Ljubešić, N., & Logar, N. (2015). The slWaC corpus of the Slovene web. Informatica (Slovenia), 39, 35–42.
- Erjavec, T., Ogrodniczuk, M., Osenova, P., Ljubešić, N., Simov, K., Grigorova, V., Rudolf, M., Pančur, A., Kopp, M., Barkarson, S., Steingrímsson, S. hór, van der Pol, H., Depoorter, G., de Does, J., Jongejan, B., Haltrup Hansen, D., Navarretta, C., Calzada Pérez, M., de Macedo, L. D., … Rayson, P. (2021). Linguistically annotated multilingual comparable corpora of parliamentary debates ParlaMint.ana 2.1. [Link](http://hdl.handle.net/11356/1431)
- Etxaniz, J., Sainz, O., Perez, N., Aldabe, I., Rigau, G., Agirre, E., Ormazabal, A., Artetxe, M., & Soroa, A. (2024). Latxa: An Open Language Model and Evaluation Suite for Basque. [Link](https://arxiv.org/abs/2403.20266)
- Gao, L., Biderman, S., Black, S., Golding, L., Hoppe, T., Foster, C., Phang, J., He, H., Thite, A., Nabeshima, N., Presser, S., & Leahy, C. (2021). The Pile: An 800GB Dataset of Diverse Text for Language Modeling. CoRR, abs/2101.00027. [Link](https://arxiv.org/abs/2101.00027)
- Gutiérrez-Fandiño, A., Armengol-Estapé, J., Gonzalez-Agirre, A., & Villegas, M. (2021). Spanish Legalese Language Model and Corpora.
- Hansen, D. H. (2018). The Danish Parliament Corpus 2009—2017, v1. [Link](http://hdl.handle.net/20.500.12115/8)
- Henderson*, P., Krass*, M. S., Zheng, L., Guha, N., Manning, C. D., Jurafsky, D., & Ho, D. E. (2022). Pile of Law: Learning Responsible Data Filtering from the Law and a 256GB Open-Source Legal Dataset. arXiv. [Link](https://arxiv.org/abs/2207.00220)
- Hendrycks, D., Burns, C., Kadavath, S., Arora, A., Basart, S., Tang, E., Song, D., & Steinhardt, J. (2021). Measuring Mathematical Problem Solving With the MATH Dataset. NeurIPS.
- Jansen, T., Tong, Y., Zevallos, V., & Suarez, P. O. (2022). Perplexed by Quality: A Perplexity-based Method for Adult and Harmful Content Detection in Multilingual Heterogeneous Web Data.
- Koppel, K., & Kallas, J. (2022). Eesti keele ühendkorpuste sari 2013–2021: Mahukaim eestikeelsete digitekstide kogu. Eesti Rakenduslingvistika Ühingu Aastaraamat Estonian Papers in Applied Linguistics, 18, 207–228. [Link](https://doi.org/10.5128/erya18.12)
- Křen, M., Cvrček, V., Henyš, J., Hnátková, M., Jelínek, T., Kocek, J., Kováříková, D., Křivan, J., Milička, J., Petkevič, V., Procházka, P., Skoumalová, H., Šindlerová, J., & Škrabal, M. (2021). SYN v9: Large corpus of written Czech. [Link](http://hdl.handle.net/11234/1-4635)
- Kreutzer, J., Caswell, I., Wang, L., Wahab, A., van Esch, D., Ulzii-Orshikh, N., Tapo, A., Subramani, N., Sokolov, A., Sikasote, C., Setyawan, M., Sarin, S., Samb, S., Sagot, B., Rivera, C., Rios, A., Papadimitriou, I., Osei, S., Suarez, P. O., … Adeyemi, M. (2022). Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets. Transactions of the Association for Computational Linguistics, 10, 50–72. [Link](https://doi.org/10.1162/tacl_a_00447)
- Kummervold, P. E., De la Rosa, J., Wetjen, F., & Brygfjeld, S. A. (2021). Operationalizing a National Digital Library: The Case for a Norwegian Transformer Model. In S. Dobnik & L. Øvrelid (Eds.), Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa) (pp. 20–29). Linköping University Electronic Press, Sweden. [Link](https://aclanthology.org/2021.nodalida-main.3)
- Lewandowska-Tomaszczyk, B., Górski, R., Łaziński, M., & Przepiórkowski, A. (2013). The National Corpus of Polish (NKJP). Language use and data analysis. 309–319.
- Li, R., Allal, L. B., Zi, Y., Muennighoff, N., Kocetkov, D., Mou, C., Marone, M., Akiki, C., Li, J., Chim, J., Liu, Q., Zheltonozhskii, E., Zhuo, T. Y., Wang, T., Dehaene, O., Davaadorj, M., Lamy-Poirier, J., Monteiro, J., Shliazhko, O., … Vries, H. de. (2023). StarCoder: May the source be with you!
- Lison, P., & Tiedemann, J. (2016). OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In N. Calzolari, K. Choukri, T. Declerck, S. Goggi, M. Grobelnik, B. Maegaard, J. Mariani, H. Mazo, A. Moreno, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC’16) (pp. 923–929). European Language Resources Association (ELRA). [Link](https://aclanthology.org/L16-1147)
- Ljubešić, N., & Klubička, F. (2014). Bs,hr,srWaC - Web Corpora of Bosnian, Croatian and Serbian. In F. Bildhauer & R. Schäfer (Eds.), Proceedings of the 9th Web as Corpus Workshop (WaC-9) (pp. 29–35). Association for Computational Linguistics. [Link](https://doi.org/10.3115/v1/W14-0405)
- Micallef, K., Gatt, A., Tanti, M., van der Plas, L., & Borg, C. (2022). Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese. Proceedings of the Third Workshop on Deep Learning for Low-Resource Natural Language Processing, 90–101. [Link](https://doi.org/10.18653/v1/2022.deeplo-1.10)
- Ogrodniczuk, M. (2018). Polish Parliamentary Corpus. [Link](https://api.semanticscholar.org/CorpusID:235134113)
- Ostendorff, M., Blume, T., & Ostendorff, S. (2020). Towards an Open Platform for Legal Information. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2020, 385–388. [Link](https://doi.org/10.1145/3383583.3398616)
- Ostendorff, M., Suarez, P. O., Lage, L. F., & Rehm, G. (2024). LLM-Datasets: An Open Framework for Pretraining Datasets of Large Language Models. First Conference on Language Modeling. [Link](https://openreview.net/forum?id=5RdIMlGLXL)
- Outsios, S., Skianis, K., Meladianos, P., Xypolopoulos, C., & Vazirgiannis, M. (2018). Word Embeddings from Large-Scale Greek Web content. arXiv Preprint arXiv:1810.06694.
- Palomar-Giner, J., Saiz, J. J., Espuña, F., Mina, M., Da Dalt, S., Llop, J., Ostendorff, M., Ortiz Suarez, P., Rehm, G., Gonzalez-Agirre, A., & Villegas, M. (2024). A CURATEd CATalog: Rethinking the Extraction of Pretraining Corpora for Mid-Resourced Languages. In N. Calzolari, M.-Y. Kan, V. Hoste, A. Lenci, S. Sakti, & N. Xue (Eds.), Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024) (pp. 335–349). ELRA and ICCL. [Link](https://aclanthology.org/2024.lrec-main.31)
- Papaloukas, C., Chalkidis, I., Athinaios, K., Pantazi, D.-A., & Koubarakis, M. (2021). Multi-granular Legal Topic Classification on Greek Legislation. Proceedings of the Natural Legal Language Processing Workshop 2021, 63–75. [Link](https://doi.org/10.48550/arXiv.2109.15298)
- Popa-Fabre, M., Ortiz Suárez, P. J., Sagot, B., & de la Clergerie, É. (2020). French Contextualized Word-Embeddings with a sip of CaBeRnet: A New French Balanced Reference Corpus. Proceedings of the 8th Workshop on Challenges in the Management of Large Corpora, 15–23. [Link](https://aclanthology.org/2020.cmlc-1.3)
- Rae, J. W., Potapenko, A., Jayakumar, S. M., Hillier, C., & Lillicrap, T. P. (2019). Compressive Transformers for Long-Range Sequence Modelling. arXiv Preprint. [Link](https://arxiv.org/abs/1911.05507)
- Rodrigues, J., Gomes, L., Silva, J., Branco, A., Santos, R., Cardoso, H. L., & Osório, T. (2023). Advancing Neural Encoding of Portuguese with Transformer Albertina PT-\*.
- Rødven-Eide, S. (2016). The Swedish Culturomics Gigaword Corpus [Dataset]. Språkbanken Text. [Link](https://doi.org/10.23695/3WMV-1Z09)
- Sharma, E., Li, C., & Wang, L. (2019). BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization. CoRR, abs/1906.03741. [Link](http://arxiv.org/abs/1906.03741)
- Soldaini, L., & Lo, K. (2023). peS2o (Pretraining Efficiently on S2ORC) Dataset. Allen Institute for AI.
- Strømberg-Derczynski, L., Ciosici, M., Baglini, R., Christiansen, M. H., Dalsgaard, J. A., Fusaroli, R., Henrichsen, P. J., Hvingelby, R., Kirkedal, A., Kjeldsen, A. S., Ladefoged, C., Nielsen, F. Å., Madsen, J., Petersen, M. L., Rystrøm, J. H., & Varab, D. (2021). The Danish Gigaword Corpus. Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa), 413–421. [Link](https://aclanthology.org/2021.nodalida-main.46)
- Subramani, N., Luccioni, S., Dodge, J., & Mitchell, M. (2023). Detecting Personal Information in Training Corpora: An Analysis. 208–220. [Link](https://doi.org/10.18653/v1/2023.trustnlp-1.18)
- Varab, D., & Schluter, N. (2020). DaNewsroom: A Large-scale Danish Summarisation Dataset. Proceedings of The 12th Language Resources and Evaluation Conference, 6731–6739. [Link](https://www.aclweb.org/anthology/2020.lrec-1.831)
- Váradi, T., Nyéki, B., Koeva, S., Tadić, M., Štefanec, V., Ogrodniczuk, M., Nitoń, B., Pezik, P., Barbu Mititelu, V., Irimia, E., Mitrofan, M., Tufiș, D., Garabík, R., Krek, S., & Repar, A. (2022). Introducing the CURLICAT Corpora: Seven-language Domain Specific Annotated Corpora from Curated Sources. In N. Calzolari, F. Béchet, P. Blache, K. Choukri, C. Cieri, T. Declerck, S. Goggi, H. Isahara, B. Maegaard, J. Mariani, H. Mazo, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Thirteenth Language Resources and Evaluation Conference (pp. 100–108). European Language Resources Association. [Link](https://aclanthology.org/2022.lrec-1.11)
- Wagner Filho, J. A., Wilkens, R., Idiart, M., & Villavicencio, A. (2018). The brwac corpus: A new open resource for brazilian portuguese. Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018).
- Žagar, A., Kavaš, M., Robnik-Šikonja, M., Erjavec, T., Fišer, D., Ljubešić, N., Ferme, M., Borovič, M., Boškovič, B., Ojsteršek, M., & Hrovat, G. (2022). Corpus of academic Slovene KAS 2.0. [Link](http://hdl.handle.net/11356/1448)
- Parrish, A., Chen, A., Nangia, N., Padmakumar, V., Phang, J., Thompson, J., Htut, P. M., & Bowman, S. (2022). BBQ: A hand-built bias benchmark for question answering. Findings of the Association for Computational Linguistics: ACL 2022, 2086–2105.
- Sheng, E., Chang, K.-W., Natarajan, P., & Peng, N. (2019). The Woman Worked as a Babysitter: On Biases in Language Generation. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3407–3412.
- Clark, P., Cowhey, I., Etzioni, O., Khot, T., Sabharwal, A., Schoenick, C., & Tafjord, O. (2018). Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge. arXiv:1803.05457v1.
- Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C. D., Ng, A., & Potts, C. (2013). Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 1631–1642.
- Penedo, G., Kydlíček, H., allal, L. B., Lozhkov, A., Mitchell, M., Raffel, C., Von Werra, L., & Wolf, T. (2024). The FineWeb Datasets: Decanting the Web for the Finest Text Data at Scale (arXiv:2406.17557). arXiv. http://arxiv.org/abs/2406.17557
- Singh, S., Vargus, F., Dsouza, D., Karlsson, B. F., Mahendiran, A., Ko, W.-Y., Shandilya, H., Patel, J., Mataciunas, D., OMahony, L., Zhang, M., Hettiarachchi, R., Wilson, J., Machado, M., Moura, L. S., Krzemiński, D., Fadaei, H., Ergün, I., Okoh, I., … Hooker, S. (2024). Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning (arXiv:2402.06619). arXiv. http://arxiv.org/abs/2402.06619
</details>
</details>
We provide an extensive Datasheet section following the best practices defined by [(Gebru et al., 2021)](https://arxiv.org/pdf/1803.09010).
<details>
<summary>Datasheet</summary>
#### Motivation
**For what purpose was the dataset created? Was there a specific task in mind? Was there a specific gap that needed to be filled? Please provide a description.**
The purpose of creating this dataset is to pre-train the Salamandra family of multilingual models with high performance in a large number of European languages (35)
and programming languages (92). We also want to represent the co-official languages of Spain: Spanish, Catalan, Galician and Basque. For this reason, we oversample
these languages by a factor of 2.
Massive multilingual data remains scarce, especially for minority languages (Ostendorff & Rehm, 2023), so part of our effort in creating
this pre-training dataset went into contributing to large projects such as Community OSCAR (Brack et al., 2024), which includes 151 languages
and 40T words, and CATalog (Palomar-Giner et al., 2024), the largest open dataset in Catalan in the world.
**Who created the dataset (e.g., which team, research group) and on behalf of which entity (e.g., company, institution, organization)?**
The dataset has been created by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center - Centro Nacional de Supercomputación (BSC-CNS),
which aims to advance the field of natural language processing through cutting-edge research and development and the use of HPC. In particular, it was created by
the unit's data team, the main contributors being José Javier Saiz, Ferran Espuña and Jorge Palomar.
However, the creation of the dataset would not have been possible without the collaboration of a large number of collaborators, partners and public institutions,
which can be found in detail in the acknowledgements.
**Who funded the creation of the dataset? If there is an associated grant, please provide the name of the grantor and the grant name and number.**
This work has been promoted and financed by the Government of Catalonia through the [Aina project](https://projecteaina.cat/).
This work is funded by the _Ministerio para la Transformación Digital y de la Función Pública_ - Funded by EU – NextGenerationEU
within the framework of [ILENIA Project](https://proyectoilenia.es/) with reference 2022/TL22/00215337.
#### Composition
**What do the instances that comprise the dataset represent (e.g., documents, photos, people, countries)? Are there multiple types of instances (e.g., movies, users, and ratings; people and interactions between them; nodes and edges)? Please provide a description.**
The dataset consists entirely of text documents in various languages. Specifically, data was mainly sourced from the following databases and
repositories:
- **Common Crawl:** Repository that holds website data and is run by the Common Crawl non-profit organization. It is updated monthly and is
distributed under the CC0 1.0 public domain license.
- **GitHub:** Community platform that allows developers to create, store, manage, and share their code. Repositories are crawled and then
distributed with their original licenses, which may vary from permissive to non-commercial licenses.
- **Wikimedia:** Database that holds the collection of databases managed by the Wikimedia Foundation, including Wikipedia, Wikibooks, Wikinews,
Wikiquote, Wikisource, and Wikivoyage. It is updated monthly and is distributed under the Creative Commons Attribution-ShareAlike License 4.0.
- **EurLex:** Repository that holds the collection of legal documents from the European Union, available in all of the EU’s 24 official
languages and run by the Publications Office of the European Union. It is updated daily and is distributed under the Creative Commons
Attribution 4.0 International license.
- **Other repositories:** Specific repositories were crawled under permission for domain-specific corpora, which include academic, legal,
and newspaper repositories.
We provide a complete list of dataset sources at the end of this section.
**How many instances are there in total (of each type, if appropriate)?**
The dataset contains a diverse range of instances across multiple languages, with notable adjustments for certain languages. English
represents the largest portion, accounting for 39.31% of the total data. Spanish was upsampled by a factor of 2, bringing its share to 16.12%,
while Catalan (1.97%), Basque (0.24%), and Galician (0.31%) were also upsampled by 2. On the other hand, code-related data was downsampled
by half, making up 5.78% of the total. Other prominent languages include French (6.6%), Russian (5.56%), German (4.79%), and Hungarian
(4.59%), with several additional languages contributing between 1% and 2%, and smaller portions represented by a variety of others.
**Does the dataset contain all possible instances or is it a sample (not necessarily random) of instances from a larger set? If the dataset is a sample, then what is the larger set? Is the sample representative of the larger set (e.g., geographic coverage)? If so, please describe how this representativeness was validated/verified. If it is not representative of the larger set, please describe why not (e.g., to cover a more diverse range of instances, because instances were withheld or unavailable).**
The dataset is a sample from multiple sources, with different weights based on the primary language of the content: Spanish, Catalan,
Basque, and Galician content was upsampled by a factor of two, while programming languages were downsampled by a factor of half. Other
sources were sampled in proportion to their occurrence.
**What data does each instance consist of? “Raw” data (e.g., unprocessed text or images) or features? In either case, please provide a description.**
Each instance consists of a text document processed for deduplication, language identification, and source-specific filtering. Some documents required
optical character recognition (OCR) to extract text from non-text formats such as PDFs.
**Is there a label or target associated with each instance? If so, please provide a description.**
Each instance is labelled with a unique identifier, the primary language of the content, and the URL for web-sourced instances. Additional labels were
automatically assigned to detect specific types of content (harmful or toxic content) and to provide preliminary indicators of undesired qualities (very
short documents, high density of symbols, etc.), which were used for filtering instances.
**Is any information missing from individual instances? If so, please provide a description, explaining why this information is missing (e.g., because it was unavailable). This does not include intentionally removed information, but might include, e.g., redacted text.**
No significant information is missing from the instances.
**Are relationships between individual instances made explicit (e.g., users’ movie ratings, social network links)? If so, please describe how these relationships are made explicit.**
Instances are related through shared metadata, such as source and language identifiers.
**Are there recommended data splits (e.g., training, development/validation, testing)? If so, please provide a description of these splits, explaining the rationale behind them.**
The dataset is randomly divided into training, validation and test sets, where the validation and test sets are each 1% of the total corpus.
**Are there any errors, sources of noise, or redundancies in the dataset? If so, please provide a description.**
Despite removing duplicated instances within each source, redundancy remains at the paragraph and sentence levels, particularly in web-sourced
instances, where search engine optimization techniques and templates contribute to repeated textual patterns. Some instances may also be duplicated
across sources due to format variations.
**Is the dataset self-contained, or does it link to or otherwise rely on external resources (e.g., websites, tweets, other datasets)? If it links to or relies on external resources, a) are there guarantees that they will exist, and remain constant, over time; b) are there official archival versions of the complete dataset (i.e., including the external resources as they existed at the time the dataset was created); c) are there any restrictions (e.g., licenses, fees) associated with any of the external resources that might apply to a dataset consumer? Please provide descriptions of all external resources and any restrictions associated with them, as well as links or other access points, as appropriate.**
The dataset is self-contained and does not rely on external resources.
**Does the dataset contain data that might be considered confidential (e.g., data that is protected by legal privilege or by doctor–patient confidentiality, data that includes the content of individuals’ non-public communications)? If so, please provide a description.**
The dataset does not contain confidential data.
**Does the dataset contain data that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety? If so, please describe why. If the dataset does not relate to people, you may skip the remaining questions in this section.**
The dataset includes web-crawled content, which may overrepresent pornographic material across languages (Kreutzer et al., 2022). Although
pre-processing techniques were applied to mitigate offensive content, the heterogeneity and scale of web-sourced data make exhaustive
filtering challenging; it is next to impossible to identify all adult content without resorting to excessive filtering, which may
negatively affect certain demographic groups (Dodge et al., 2021).
**Does the dataset identify any subpopulations (e.g., by age, gender)? If so, please describe how these subpopulations are identified and provide a description of their respective distributions within the dataset.**
The dataset does not explicitly identify any subpopulations.
**Is it possible to identify individuals (i.e., one or more natural persons), either directly or indirectly (i.e., in combination with other data) from the dataset? If so, please describe how.**
Web-sourced instances in the dataset may contain personally identifiable information (PII) that is publicly available on the Web, such as names,
IP addresses, email addresses, and phone numbers. While it would be possible to indirectly identify individuals through the combination of multiple
data points, the nature and scale of web data make it difficult to parse such information. In any case, efforts are made to filter or anonymize
sensitive data (Mina et al., 2024), but some identifiable information may remain in the dataset.
**Does the dataset contain data that might be considered sensitive in any way? If so, please provide a description.**
Given that the dataset includes web-sourced content and other publicly available documents, instances may inadvertently reveal financial
information, health-related details, or forms of government identification, such as social security numbers (Subramani et al., 2023),
especially if the content originates from less-regulated sources or user-generated platforms.
#### Collection Process
**How was the data collected?**
This dataset was constructed by combining several sources, whose acquisition methods can be classified into three groups:
- Web-sourced datasets with some preprocessing available under permissive license.
- Domain-specific or language-specific raw crawls.
- Manually curated data obtained through collaborators, data providers (by means of legal assignment agreements) or open source projects (e.g. CATalog).
**What mechanisms or procedures were used to collect the data? How were these mechanisms or procedures validated?**
The data collection process was carried out using three different mechanisms, each corresponding to one of the groups defined in the previous answer. The specific methods used and their respective validation procedures are outlined below:
- Open Direct Download: Data were obtained directly from publicly accessible sources, such as websites or repositories that provide open data downloads. We validate the data with a data integrity check, which ensures that the downloaded files are complete, uncorrupted and in the expected format and structure.
- Ad hoc scrapers or crawlers: Custom web scraping scripts or crawlers were used to extract data from various online sources where direct downloads were not available. These scripts navigate web pages, extract relevant data and store it in a structured format. We validate this method with software unit tests to evaluate the functionality of individual components of the scraping programs, checking for errors or unexpected behaviour. In addition, data integrity tests were performed to verify that the collected data remained complete throughout the extraction and storage process.
- Direct download via FTP, SFTP, API or S3: Some datasets were acquired using secure transfer protocols such as FTP (File Transfer Protocol), SFTP (Secure File Transfer Protocol), or API (Application Programming Interface) requests from cloud storage services such as Amazon S3. As with the open direct download method, data integrity tests were used to validate the completeness of the files and to ensure that the files were not altered or corrupted during the transfer process (a minimal sketch of such an integrity check is shown below).
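The card does not publish the actual validation scripts; the following is a minimal sketch of one common form of integrity check (verifying a downloaded file against a published SHA-256 checksum), included purely for illustration.
```python
# Minimal sketch of a file-integrity check (illustrative; the data team's
# actual validation scripts are not published in this card).
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_intact(path: Path, expected_sha256: str) -> bool:
    """Return True if the downloaded file matches its published checksum."""
    return sha256_of(path) == expected_sha256
```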
**If the dataset is a sample from a larger set, what was the sampling strategy?**
The sampling strategy was to use the whole dataset resulting from the filtering explained in the 'preprocessing/cleaning/labelling' section,
with the particularity that an upsampling of 2 (i.e. twice the probability of sampling a document) was performed for the co-official languages
of Spain (Spanish, Catalan, Galician, Basque), and a downsampling of 1/2 was applied for code (half the probability of sampling a code document,
evenly distributed among all programming languages).
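As an illustration of the strategy just described, the sketch below assigns per-document sampling weights (2 for the co-official languages of Spain, 1/2 for code, 1 otherwise). The field names and the small in-memory list are illustrative assumptions; the real pipeline operates over the full corpus.
```python
# Illustrative sketch of the upsampling/downsampling weights described above.
import random

UPSAMPLED_LANGS = {"es", "ca", "gl", "eu"}  # Spanish, Catalan, Galician, Basque

def sampling_weight(doc: dict) -> float:
    """Weight used when sampling a document into the training mix."""
    if doc["lang"] in UPSAMPLED_LANGS:
        return 2.0   # twice the probability of being sampled
    if doc["lang"] == "code":
        return 0.5   # half the probability, shared evenly across programming languages
    return 1.0

documents = [
    {"id": "doc-1", "lang": "es"},
    {"id": "doc-2", "lang": "en"},
    {"id": "doc-3", "lang": "code"},
]
weights = [sampling_weight(d) for d in documents]
sampled = random.choices(documents, weights=weights, k=2)  # sampling with replacement
print(sampled)
```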
**Who was involved in the data collection process and how were they compensated?**
This data is generally extracted, filtered and sampled by automated processes. The code required to run these processes has been developed entirely
by members of the Language Technologies data team, or otherwise obtained from open-source software. Furthermore, there has been no monetary
consideration for acquiring data from suppliers.
**Over what timeframe was the data collected? Does this timeframe match the creation timeframe of the data associated with the instances? If not, please describe the timeframe in which the data associated with the instances was created.**
Data were acquired and processed from April 2023 to April 2024. However, as mentioned, much data has been obtained from open projects such
as Common Crawl, which contains data from 2014, so it is the end date (04/2024) rather than the start date that is important.
**Were any ethical review processes conducted? If so, please provide a description of these review processes, including the outcomes, as well as a link or other access point to any supporting documentation.**
No particular ethical review process has been carried out as the data is mostly open and not particularly sensitive. However, we have an
internal evaluation team and a bias team to monitor ethical issues. In addition, we work closely with ‘Observatori d'Ètica en Intel·ligència
Artificial’ (OEIAC) and ‘Agencia Española de Supervisión de la Inteligencia Artificial’ (AESIA) to audit the processes we carry out from an
ethical and legal point of view, respectively.
#### Preprocessing
**Was any preprocessing/cleaning/labeling of the data done? If so, please provide a description. If not, you may skip the remaining questions in this section.**
No changes were made to the content of individual text document instances. However, the web-sourced documents underwent a filtering process based on specific criteria along two key dimensions:
- Quality filtering: The text processing pipeline CURATE (Palomar et al., 2024) calculates a quality score for each document based on a set of filtering criteria that identify undesirable textual characteristics. Any document with a score below the 0.8 threshold was excluded from the dataset.
- Harmful or adult content filtering: To reduce the amount of harmful or inappropriate material in the dataset, documents from Colossal OSCAR were filtered using the Ungoliant pipeline (Abadji et al., 2021), which uses the 'harmful\_pp' field, a perplexity-based score generated by a language model. A minimal sketch of this two-stage filtering is shown below.
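The sketch below assumes per-document `quality_score` and `harmful_pp` fields. The 0.8 quality threshold is stated above; the direction and value of the `harmful_pp` cut-off are assumptions, since they are not given in this card.
```python
# Illustrative sketch of the two-stage document filtering described above.
# The 0.8 quality threshold comes from the card; the harmful_pp handling is an
# assumption (the actual cut-off value and direction are not specified here).
from typing import Optional

def keep_document(doc: dict, harmful_pp_cutoff: Optional[float] = None) -> bool:
    """Return True if the document passes quality and harmful-content filters."""
    # CURATE-style quality filtering: drop documents scoring below 0.8.
    if doc["quality_score"] < 0.8:
        return False
    # Ungoliant-style harmful-content filtering on the perplexity-based score.
    if harmful_pp_cutoff is not None and doc.get("harmful_pp") is not None:
        # Hypothetical rule: low perplexity under an adult-content LM suggests
        # harmful text, so such documents are dropped.
        if doc["harmful_pp"] < harmful_pp_cutoff:
            return False
    return True

corpus = [
    {"text": "…", "quality_score": 0.93, "harmful_pp": 2500.0},
    {"text": "…", "quality_score": 0.41, "harmful_pp": 1800.0},
]
filtered = [d for d in corpus if keep_document(d, harmful_pp_cutoff=500.0)]
```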
**Was the “raw” data saved in addition to the preprocessed/cleaned/labeled data? If so, please provide a link or other access point to the “raw” data.**
The original raw data was not kept.
**Is the software that was used to preprocess/clean/label the data available? If so, please provide a link or other access point.**
Yes, the preprocessing and filtering software is open-sourced. The [CURATE](https://github.com/langtech-bsc/CURATE) pipeline was used for CATalog and other curated datasets,
and the [Ungoliant](https://github.com/oscar-project/ungoliant) pipeline was used for the OSCAR project.
#### Uses
**Has the dataset been used for any tasks already? If so, please provide a description.**
Pre-train the Salamandra model family.
**What (other) tasks could the dataset be used for?**
The data can be used primarily to pre-train other language models, which can then be used for a wide range of use cases. The dataset could
also be used for other tasks such as fine-tuning language models, cross-lingual NLP tasks, machine translation, domain-specific text
generation, and language-specific data analysis.
**Is there anything about the composition of the dataset or the way it was collected and preprocessed/cleaned/labeled that might impact future uses? Is there anything a dataset consumer could do to mitigate these risks or harms?**
Web-crawled content over-represents standard language varieties, which impacts language model performance for minority languages.
Language diversity in the data is crucial to avoid bias, especially in the encoding of non-standard dialects, and to prevent the exclusion of demographic
groups. Moreover, despite legal uncertainties around web-scraped data, we prioritize permissive licenses and privacy protection measures,
acknowledging the challenges posed by personally identifiable information (PII) within large-scale datasets. Our ongoing efforts aim to
address privacy concerns and contribute to a more inclusive linguistic dataset.
**Are there tasks for which the dataset should not be used?**
-
#### Distribution
**Will the dataset be distributed to third parties outside of the entity on behalf of which the dataset was created? If so, please provide a description.**
The dataset will not be released or distributed to third parties. Any question related to distribution is therefore omitted from this section.
#### Maintenance
**Who will be supporting/hosting/maintaining the dataset?**
The dataset will be hosted by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center (BSC). The team will ensure
regular updates and monitor the dataset for any issues related to content integrity, legal compliance, and bias for the sources they are
responsible for.
**How can the owner/curator/manager of the dataset be contacted?**
The data owner may be contacted at the email address [email protected].
**Will the dataset be updated?**
The dataset will not be updated.
**If the dataset relates to people, are there applicable limits on the retention of the data associated with the instances? If so, please describe these limits and explain how they will be enforced.**
The dataset does not keep sensitive data that could allow direct identification of individuals, apart from the data that is publicly available in
web-sourced content. Due to the sheer volume and diversity of web data, it is not feasible to notify individuals or manage data retention on an
individual basis. However, efforts are made to mitigate the risks associated with sensitive information through pre-processing and filtering to
remove identifiable or harmful content. Despite these measures, vigilance is maintained to address potential privacy and ethical issues.
**Will older versions of the dataset continue to be supported/hosted/maintained? If so, please describe how. If not, please describe how its obsolescence will be communicated to dataset consumers.**
Since the dataset will not be updated, only the final version will be kept.
**If others want to extend/augment/build on/contribute to the dataset, is there a mechanism for them to do so?**
The dataset does not allow for external contributions.
</details>
---
## Evaluation
### Gold-standard benchmarks
Evaluation is done using the Language Model Evaluation Harness (Gao et al., 2024). We evaluate on a set of tasks taken from [SpanishBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/spanish_bench), [CatalanBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/catalan_bench), [BasqueBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/basque_bench) and [GalicianBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/galician_bench). We also use English tasks already available on the LM Evaluation Harness. These benchmarks include both new and existing tasks and datasets. In the tables below, we include the results on a selection of evaluation datasets that represent the model's performance across a variety of tasks within these benchmarks.
We only use tasks that are either human generated, human translated, or with a strong human-in-the-loop (i.e., machine translation followed by professional revision or machine generation followed by human revision and annotation). This is the reason behind the variety in number of tasks reported across languages. As more tasks that fulfill these requirements are published, we will update the presented results. We also intend to expand the evaluation to other languages, as long as the datasets meet our quality standards.
During the implementation of the evaluation we observed a series of issues worth considering when replicating and interpreting the results presented. These include variances of ≈1.5% in performance on some tasks, depending on the version of the `transformers` library used and on whether tensor parallelism is used when loading a model. When implementing existing tasks, we carry out a comprehensive quality evaluation of the dataset, the Harness task itself, and the kind of input models see during evaluation. Our implementation (see links above) addresses multiple existing problems such as errors in datasets and prompts, and lack of pre-processing. All this means that results will vary if using other Harness implementations, and may vary slightly depending on the replication setup.
It should be noted that these results are subject to all the drawbacks of every current gold-standard evaluation, and that the figures do not fully represent the model's capabilities and potential. We thus advise caution when reading and interpreting the results.
A full list of results compared to other baselines, a discussion of the model's performance across tasks and its implications, and details regarding problem-solving with task implementation will soon be available in the technical report.
All results reported below are on a 5-shot setting.
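As a replication aid, the sketch below shows how one of the 5-shot tasks might be run with the LM Evaluation Harness Python API. The call and its arguments assume a recent `lm-eval` release, and the checkpoint name is one of the models from the Model Index below; adjust both as needed.
```python
# Minimal sketch of a 5-shot evaluation with the LM Evaluation Harness
# (assumes a recent lm-eval release; results may differ slightly from the
# tables below depending on the transformers version and parallelism setup).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=BSC-LT/salamandra-2b",
    tasks=["xstorycloze_es"],   # one of the SpanishBench tasks reported below
    num_fewshot=5,
    batch_size=8,
)
print(results["results"]["xstorycloze_es"])
```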
#### Spanish
<table><thead>
<tr>
<th>Category</th>
<th>Task</th>
<th>Metric</th>
<th>Result</th>
</tr></thead>
<tbody>
<tr>
<td>Commonsense Reasoning</td>
<td>xstorycloze_es</td>
<td>acc</td>
<td>78.89</td>
</tr>
<tr>
<td rowspan="2">NLI</td>
<td>wnli_es</td>
<td>acc</td>
<td>60.56</td>
</tr>
<tr>
<td>xnli_es</td>
<td>acc</td>
<td>48.31</td>
</tr>
<tr>
<td>Paraphrasing</td>
<td>paws_es</td>
<td>acc</td>
<td>67.50</td>
</tr>
<tr>
<td>QA</td>
<td>xquad_es</td>
<td>acc</td>
<td>74.03</td>
</tr>
<tr>
<td>Translation</td>
<td>flores_es</td>
<td>bleu</td>
<td>25.12</td>
</tr>
</tbody>
</table>
#### Catalan
<table><thead>
<tr>
<th>Category</th>
<th>Task</th>
<th>Metric</th>
<th>Result</th>
</tr></thead>
<tbody>
<tr>
<td rowspan="2">Commonsense Reasoning</td>
<td>copa_ca</td>
<td>acc</td>
<td>85.20</td>
</tr>
<tr>
<td>xstorycloze_ca</td>
<td>acc</td>
<td>78.09</td>
</tr>
<tr>
<td rowspan="2">NLI</td>
<td>wnli_ca</td>
<td>acc</td>
<td>60.56</td>
</tr>
<tr>
<td>xnli_ca</td>
<td>acc</td>
<td>49.84</td>
</tr>
<tr>
<td rowspan="2">Paraphrasing</td>
<td>parafraseja</td>
<td>acc</td>
<td>64.33</td>
</tr>
<tr>
<td>paws_ca</td>
<td>acc</td>
<td>67.35</td>
</tr>
<tr>
<td rowspan="5">QA</td>
<td>arc_ca_easy</td>
<td>acc</td>
<td>78.87</td>
</tr>
<tr>
<td>arc_ca_challenge</td>
<td>acc</td>
<td>51.62</td>
</tr>
<tr>
<td>openbookqa_ca</td>
<td>acc</td>
<td>38.40</td>
</tr>
<tr>
<td>piqa_ca</td>
<td>acc</td>
<td>74.86</td>
</tr>
<tr>
<td>siqa_ca</td>
<td>acc</td>
<td>53.07</td>
</tr>
<tr>
<td>Translation</td>
<td>flores_ca</td>
<td>bleu</td>
<td>32.97</td>
</tr>
</tbody></table>
#### Basque
<table><thead>
<tr>
<th>Category</th>
<th>Task</th>
<th>Metric</th>
<th>Result</th>
</tr></thead>
<tbody>
<tr>
<td rowspan="2">Commonsense Reasoning</td>
<td>xcopa_eu</td>
<td>acc</td>
<td>74.20</td>
</tr>
<tr>
<td>xstorycloze_eu</td>
<td>acc</td>
<td>70.75</td>
</tr>
<tr>
<td rowspan="2">NLI</td>
<td>wnli_eu</td>
<td>acc</td>
<td>54.93</td>
</tr>
<tr>
<td>xnli_eu</td>
<td>acc</td>
<td>46.54</td>
</tr>
<tr>
<td rowspan="3">QA</td>
<td>eus_exams</td>
<td>acc</td>
<td>55.12</td>
</tr>
<tr>
<td>eus_proficiency</td>
<td>acc</td>
<td>54.25</td>
</tr>
<tr>
<td>eus_trivia</td>
<td>acc</td>
<td>63.62</td>
</tr>
<tr>
<td>Reading Comprehension</td>
<td>eus_reading</td>
<td>acc</td>
<td>52.56</td>
</tr>
<tr>
<td>Translation</td>
<td>flores_eu</td>
<td>bleu</td>
<td>19.85</td>
</tr>
</tbody></table>
#### Galician
<table><thead>
<tr>
<th>Category</th>
<th>Task</th>
<th>Metric</th>
<th>Result</th>
</tr></thead>
<tbody>
<tr>
<td rowspan="2">Paraphrasing</td>
<td>parafrases_gl</td>
<td>acc</td>
<td>60.20</td>
</tr>
<tr>
<td>paws_gl</td>
<td>acc</td>
<td>69.10</td>
</tr>
<tr>
<td>QA</td>
<td>openbookqa_gl</td>
<td>acc</td>
<td>35.00</td>
</tr>
<tr>
<td>Translation</td>
<td>flores_gl</td>
<td>bleu</td>
<td>30.19</td>
</tr>
</tbody>
</table>
#### English
<table><thead>
<tr>
<th>Category</th>
<th>Task</th>
<th>Metric</th>
<th>Result</th>
</tr></thead>
<tbody>
<tr>
<td rowspan="2">Commonsense Reasoning</td>
<td>copa</td>
<td>acc</td>
<td>91</td>
</tr>
<tr>
<td>xstorycloze_en</td>
<td>acc</td>
<td>82.20</td>
</tr>
<tr>
<td rowspan="2">NLI</td>
<td>wnli</td>
<td>acc</td>
<td>61.97</td>
</tr>
<tr>
<td>xnli_en</td>
<td>acc</td>
<td>51.77</td>
</tr>
<tr>
<td>Paraphrasing</td>
<td>paws *</td>
<td>acc</td>
<td>64.65</td>
</tr>
<tr>
<td rowspan="6">QA</td>
<td>arc_easy</td>
<td>acc</td>
<td>85.40</td>
</tr>
<tr>
<td>arc_challenge</td>
<td>acc</td>
<td>58.70</td>
</tr>
<tr>
<td>openbookqa</td>
<td>acc</td>
<td>37.80</td>
</tr>
<tr>
<td>piqa</td>
<td>acc</td>
<td>81.77</td>
</tr>
<tr>
<td>social_iqa</td>
<td>acc</td>
<td>53.48</td>
</tr>
<tr>
<td>squad_en **</td>
<td>acc</td>
<td>81.53</td>
</tr>
</tbody></table>
\* The current LM Evaluation Harness implementation lacks correct pre-processing; these results were obtained with adequate pre-processing.
\*\* This task is not yet available in the official Harness; we hope to add it soon.
---
## Ethical Considerations and Limitations
We examine the presence of undesired societal and cognitive biases in this model using different benchmarks. For societal biases, we test performance using our Spanish version of the BBQ dataset (Parrish et al., 2022). We report that while accuracy in disambiguated settings is relatively high for a base model, the model performs very poorly in ambiguous settings. Further examination of the differences in accuracy scores as described in Jin et al. (2024) reveals a low-to-moderate alignment between the model's responses and societal biases. These largely vanish in the disambiguated setting. Our analyses of societal biases show that while these biases are capable of interfering with model performance as expressed in the results on the BBQ dataset, their interference with task performance is somewhat limited given the results on the disambiguated dataset. We highlight that our analyses of these biases are by no means exhaustive and are limited by the relative scarcity of adequate resources in all languages present in the training data. We aim to gradually extend and expand our analyses in future work.
Our cognitive bias analysis focuses on positional effects in 0-shot settings and majority class bias in few-shot settings. For positional effects, we leverage the ARC Multiple Choice Question dataset (Clark et al., 2018). We observe weak primacy effects, whereby the model shows a preference for answers towards the beginning of the list of provided answers. We measure majority class effects in few-shot settings using SST-2 (Socher et al., 2013). We detect significant but extremely weak effects, implying that outputs are generally robust against variations in prompt format and order.
We highlight that these results can be expected from a pretrained model that has not yet been instruction-tuned or aligned. These tests are performed in order to show the biases the model may contain. We urge developers to take them into account and perform safety testing and tuning tailored to their specific applications of the model.
---
## Additional information
### Author
The Language Technologies Unit from Barcelona Supercomputing Center.
### Contact
For further information, please send an email to <[email protected]>.
### Copyright
Copyright(c) 2024 by Language Technologies Unit, Barcelona Supercomputing Center.
### Funding
This work is funded by the Ministerio para la Transformación Digital y de la Función Pública - Funded by EU – NextGenerationEU within the framework of the project Modelos del Lenguaje.
This work has been promoted and supported by the Government of Catalonia through the Aina Project.
### Acknowledgements
This project has benefited from the contributions of numerous teams and institutions, mainly through data contributions, knowledge transfer or technical support.
We are especially grateful to our ILENIA project partners: CENID, HiTZ and CiTIUS for their participation. We also extend our genuine gratitude to the Spanish Senate and Congress, Fundación Dialnet, and the ‘Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)’ of the University of Las Palmas de Gran Canaria. Many other institutions have been involved in the project. Our thanks to Òmnium Cultural, Parlament de Catalunya, Institut d'Estudis Aranesos, Racó Català, Vilaweb, ACN, Nació Digital, El món and Aquí Berguedà. We thank the Welsh government, DFKI, Occiglot project, especially Malte Ostendorff, and The Common Crawl Foundation, especially Pedro Ortiz, for their collaboration.
We would also like to give special thanks to the NVIDIA team, with whom we have met regularly, especially to: Ignacio Sarasua, Adam Henryk Grzywaczewski, Oleg Sudakov, Sergio Perez, Miguel Martinez, Felipes Soares and Meriem Bendris. Their constant support has been especially appreciated throughout the entire process.
Their valuable efforts have been instrumental in the development of this work.
### Disclaimer
Be aware that the model may contain biases or other unintended distortions.
When third parties deploy systems or provide services based on this model, or use the model themselves,
they bear the responsibility for mitigating any associated risks and ensuring compliance with applicable regulations,
including those governing the use of Artificial Intelligence.
The Barcelona Supercomputing Center, as the owner and creator of the model, shall not be held liable for any outcomes resulting from third-party use.
### Citation
```
@misc{gonzalezagirre2025salamandratechnicalreport,
title={Salamandra Technical Report},
author={Aitor Gonzalez-Agirre and Marc Pàmies and Joan Llop and Irene Baucells and Severino Da Dalt and Daniel Tamayo and José Javier Saiz and Ferran Espuña and Jaume Prats and Javier Aula-Blasco and Mario Mina and Adrián Rubio and Alexander Shvets and Anna Sallés and Iñaki Lacunza and Iñigo Pikabea and Jorge Palomar and Júlia Falcão and Lucía Tormo and Luis Vasquez-Reina and Montserrat Marimon and Valle Ruíz-Fernández and Marta Villegas},
year={2025},
eprint={2502.08489},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2502.08489},
}
```
### License
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
## Model Index
|Model|Base|Instruct|
|:---:|:---:|:---:|
|2B| [Link](https://huggingface.co/BSC-LT/salamandra-2b) | [Link](https://huggingface.co/BSC-LT/salamandra-2b-instruct) |
|7B| [Link](https://huggingface.co/BSC-LT/salamandra-7b) | [Link](https://huggingface.co/BSC-LT/salamandra-7b-instruct) |
|40B| [Link](https://huggingface.co/BSC-LT/ALIA-40b) | WiP | | [
"QUESTION_ANSWERING",
"TRANSLATION",
"SUMMARIZATION",
"PARAPHRASING"
] | [
"BEAR",
"SCIELO"
] |
pruas/BENT-PubMedBERT-NER-Cell-Line | pruas | token-classification | [
"transformers",
"pytorch",
"bert",
"token-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-01-14T14:25:56 | 2024-03-02T10:08:07 | 1,534 | 2 | ---
language:
- en
pipeline_tag: token-classification
---
Named Entity Recognition (NER) model to recognize cell line entities.
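A minimal usage sketch (not part of the original card) with the Hugging Face `pipeline` API; the example sentence and the aggregation strategy are illustrative choices.
```python
# Illustrative sketch: tag cell-line mentions with the token-classification pipeline.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="pruas/BENT-PubMedBERT-NER-Cell-Line",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

text = "HeLa and MCF-7 cells were cultured in DMEM supplemented with 10% FBS."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```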
Please cite our work:
```
@article{NILNKER2022,
title = {NILINKER: Attention-based approach to NIL Entity Linking},
journal = {Journal of Biomedical Informatics},
volume = {132},
pages = {104137},
year = {2022},
issn = {1532-0464},
doi = {https://doi.org/10.1016/j.jbi.2022.104137},
url = {https://www.sciencedirect.com/science/article/pii/S1532046422001526},
author = {Pedro Ruas and Francisco M. Couto},
}
```
[PubMedBERT](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) fine-tuned on the following datasets:
- [CellFinder](http://cellfinder.org/about/annotation/): entity type "CellLine"
- [JNLPBA](http://www.geniaproject.org/genia-corpus/term-corpus): entity type "cell_line" | [
"NAMED_ENTITY_RECOGNITION"
] | [
"CELLFINDER",
"JNLPBA"
] |
SeaLLMs/SeaLLMs-v3-1.5B | SeaLLMs | text-generation | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"sea",
"multilingual",
"conversational",
"en",
"zh",
"id",
"vi",
"th",
"ms",
"tl",
"ta",
"jv",
"arxiv:2407.19672",
"arxiv:2306.05179",
"arxiv:2009.03300",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-07-29T09:03:52 | 2024-07-30T04:58:05 | 1,530 | 5 | ---
language:
- en
- zh
- id
- vi
- th
- ms
- tl
- ta
- jv
license: other
license_name: seallms
license_link: https://huggingface.co/SeaLLMs/SeaLLM-13B-Chat/blob/main/LICENSE
tags:
- sea
- multilingual
---
# *SeaLLMs-v3* - Large Language Models for Southeast Asia
<p align="center">
<a href="https://damo-nlp-sg.github.io/SeaLLMs/" target="_blank" rel="noopener">Website</a>
<a href="https://huggingface.co/SeaLLMs/SeaLLMs-v3-1.5B" target="_blank" rel="noopener">Model</a>
<a href="https://huggingface.co/spaces/SeaLLMs/SeaLLM-Chat" target="_blank" rel="noopener"> 🤗 DEMO</a>
<a href="https://github.com/DAMO-NLP-SG/SeaLLMs" target="_blank" rel="noopener">Github</a>
<a href="https://arxiv.org/pdf/2407.19672" target="_blank" rel="noopener">[NEW] Technical Report</a>
</p>
We introduce **SeaLLMs-v3**, the latest series of the SeaLLMs (Large Language Models for Southeast Asian languages) family. It achieves state-of-the-art performance among models with similar sizes, excelling across a diverse array of tasks such as world knowledge, mathematical reasoning, translation, and instruction following. At the same time, it was specifically enhanced to be more trustworthy, exhibiting reduced hallucination and providing safe responses, particularly for queries closely related to Southeast Asian culture.
## 🔥 Highlights
- State-of-the-art performance compared to open-source models of similar sizes, evaluated across various dimensions such as human exam questions, instruction-following, mathematics, and translation.
- Significantly enhanced instruction-following capability, especially in multi-turn settings.
- Ensures safety in usage with significantly reduced instances of hallucination and sensitivity to local contexts.
## Uses
SeaLLMs is tailored for handling a wide range of languages spoken in the SEA region, including English, Chinese, Indonesian, Vietnamese, Thai, Tagalog, Malay, Burmese, Khmer, Lao, Tamil, and Javanese.
This page introduces the **SeaLLMs-v3-1.5B** model, which can be easily fine-tuned for your specific downstream tasks, especially in SEA languages.
Note that this is a base model; if you are looking for a model that is directly applicable to your downstream applications, you may want to check the chat version: **[SeaLLMs-v3-1.5B-Chat](https://huggingface.co/SeaLLMs/SeaLLMs-v3-1.5B-Chat)**.
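A minimal loading sketch, assuming standard `transformers` text-generation usage; since this is a base model, it simply continues the prompt rather than following chat-style instructions.
```python
# Illustrative sketch: load the base model and continue a prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SeaLLMs/SeaLLMs-v3-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Ibu kota Indonesia adalah"  # Indonesian: "The capital of Indonesia is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```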
## Evaluation
We evaluate SeaLLMs-v3-1.5B mainly using human exam questions.
#### Multilingual World Knowledge - M3Exam
[M3Exam](https://arxiv.org/abs/2306.05179) consists of local exam questions collected from each country. It reflects the model's world knowledge (e.g., with language or social science subjects) and reasoning abilities (e.g., with mathematics or natural science subjects).
| Model | en | zh | id | th | vi | avg | avg_sea |
| :------------------ | --------: | --------: | --------: | --------: | --------: | --------: | --------: |
| Gemma-2B | 0.411 | 0.267 | 0.296 | 0.283 | 0.313 | 0.314 | 0.297 |
| Sailor-1.8B | 0.270 | 0.239 | 0.250 | 0.261 | 0.260 | 0.256 | 0.257 |
| Sailor-4B | 0.387 | 0.295 | 0.275 | 0.296 | 0.311 | 0.313 | 0.294 |
| Qwen2-1.5B | 0.628 | **0.753** | 0.409 | 0.352 | 0.443 | 0.517 | 0.401 |
| **SeaLLMs-v3-1.5B** | **0.635** | 0.745 | **0.424** | **0.371** | **0.465** | **0.528** | **0.420** |
#### Multilingual World Knowledge - MMLU
[MMLU](https://arxiv.org/abs/2009.03300) questions are translated to SEA languages for evaluation, which primarily tests the cross-lingual alignment of the model as the required knowledge is still mainly Western-focused.
| Model | en | zh | id | th | vi | avg | avg_sea |
| :------------------ | --------: | --------: | --------: | --------: | --------: | --------: | --------: |
| Gemma-2B | 0.374 | 0.304 | 0.315 | 0.292 | 0.305 | 0.318 | 0.304 |
| Sailor-1.8B | 0.293 | 0.251 | 0.268 | 0.256 | 0.256 | 0.265 | 0.260 |
| Sailor-4B | 0.333 | 0.267 | 0.299 | 0.278 | 0.282 | 0.292 | 0.286 |
| Qwen2-1.5B | 0.552 | **0.491** | 0.426 | 0.366 | 0.398 | 0.447 | 0.397 |
| **SeaLLMs-v3-1.5B** | **0.553** | 0.487 | **0.443** | **0.377** | **0.423** | **0.456** | **0.414** |
## Acknowledgement to Our Linguists
We would like to express our special thanks to our professional and native linguists, Tantong Champaiboon, Nguyen Ngoc Yen Nhi and Tara Devina Putri, who helped build, evaluate, and fact-check our sampled pretraining and SFT dataset, and who evaluated our models across different aspects, especially safety.
## Citation
If you find our project useful, we hope you would kindly star our repo and cite our work as follows:
```
@article{damonlp2024seallm3,
author = {Wenxuan Zhang*, Hou Pong Chan*, Yiran Zhao*, Mahani Aljunied*,
Jianyu Wang*, Chaoqun Liu, Yue Deng, Zhiqiang Hu, Weiwen Xu,
Yew Ken Chia, Xin Li, Lidong Bing},
title = {SeaLLMs 3: Open Foundation and Chat Multilingual Large Language Models for Southeast Asian Languages},
year = {2024},
url = {https://arxiv.org/abs/2407.19672}
}
```
Corresponding Author: [email protected] | [
"TRANSLATION"
] | [
"CHIA"
] |
Mihaiii/Ivysaur | Mihaiii | sentence-similarity | [
"sentence-transformers",
"onnx",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"gte",
"mteb",
"dataset:Mihaiii/qa-assistant",
"base_model:TaylorAI/gte-tiny",
"base_model:quantized:TaylorAI/gte-tiny",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-04-27T10:10:39 | 2024-04-30T07:10:12 | 1,509 | 0 | ---
base_model: TaylorAI/gte-tiny
datasets:
- Mihaiii/qa-assistant
library_name: sentence-transformers
license: mit
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- gte
- mteb
model-index:
- name: Ivysaur
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 72.1044776119403
- type: ap
value: 35.09105788324913
- type: f1
value: 66.26967715703572
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 86.686075
- type: ap
value: 81.92716581685914
- type: f1
value: 86.65902299160209
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 42.698
- type: f1
value: 42.287785312461885
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 30.441000000000003
- type: map_at_10
value: 46.951
- type: map_at_100
value: 47.788000000000004
- type: map_at_1000
value: 47.794
- type: map_at_20
value: 47.621
- type: map_at_3
value: 42.295
- type: map_at_5
value: 45.126
- type: mrr_at_1
value: 31.65
- type: mrr_at_10
value: 47.394999999999996
- type: mrr_at_100
value: 48.238
- type: mrr_at_1000
value: 48.245
- type: mrr_at_20
value: 48.069
- type: mrr_at_3
value: 42.852000000000004
- type: mrr_at_5
value: 45.58
- type: ndcg_at_1
value: 30.441000000000003
- type: ndcg_at_10
value: 55.783
- type: ndcg_at_100
value: 59.227
- type: ndcg_at_1000
value: 59.376
- type: ndcg_at_20
value: 58.18
- type: ndcg_at_3
value: 46.291
- type: ndcg_at_5
value: 51.405
- type: precision_at_1
value: 30.441000000000003
- type: precision_at_10
value: 8.378
- type: precision_at_100
value: 0.985
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.659
- type: precision_at_3
value: 19.298000000000002
- type: precision_at_5
value: 14.068
- type: recall_at_1
value: 30.441000000000003
- type: recall_at_10
value: 83.784
- type: recall_at_100
value: 98.506
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 93.172
- type: recall_at_3
value: 57.894999999999996
- type: recall_at_5
value: 70.341
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 46.39249132731755
- type: v_measures
value:
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- 0.462627943488718
- 0.4670198046702645
- 0.4799590043041496
- 0.4769331119808875
- 0.4676232129237324
- 0.4776548131275231
- 0.4670074065859379
- 0.4796639656537766
- 0.4618481699630812
- 0.4663292111226376
- 0.5293353429909269
- 0.5398570175481274
- 0.5399074383870329
- 0.5363158656403061
- 0.5377616813701683
- 0.5375897664056992
- 0.5391811647339062
- 0.5408906197352437
- 0.5330346186210795
- 0.5333610235325786
- 0.5043600016005657
- 0.2861923615995782
- 0.42134506758129586
- 0.4019628602326345
- 0.345945272411779
- 0.2605048863591227
- 0.28469463800386774
- 0.23235682032046123
- 0.30618655352256796
- 1.0
- 0.2642226670507902
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 35.410038545643225
- type: v_measures
value:
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- 1.0
- 0.1847317096610917
- 0.33766811548231473
- 0.3734777203759399
- 0.33991212785072317
- 0.3661605677492215
- 0.36064589524249807
- 0.3656962944251887
- 0.34702091841974203
- 0.3500477383658047
- 0.35477756658493836
- 0.3624636373603448
- 0.40289427457065846
- 0.3971477930112288
- 0.40597327027674507
- 0.40596489455329327
- 0.40317124541440197
- 0.4034334047970072
- 0.4035619316327058
- 0.4021323074077349
- 0.40002234969788997
- 0.39359153564695076
- 0.3721698397439144
- 0.20022120055536463
- 0.2733292585686657
- 0.329333695822746
- 0.267015905471991
- 0.1951877019437801
- 0.21813528003614752
- 0.1428255078757563
- 0.21839826060461043
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 59.69637278242267
- type: mrr
value: 74.02948159873367
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 87.14461604689758
- type: cos_sim_spearman
value: 87.31584497244751
- type: euclidean_pearson
value: 84.78141750973201
- type: euclidean_spearman
value: 87.05017626840346
- type: manhattan_pearson
value: 84.35436632710646
- type: manhattan_spearman
value: 86.49534434907336
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 81.91558441558439
- type: f1
value: 81.88197959191479
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.97808934568377
- type: v_measures
value:
- 0.3950220690882689
- 0.38918993520470474
- 0.3874211082831238
- 0.3769994856835508
- 0.37876292165982844
- 0.3979648803949703
- 0.39019384497819176
- 0.4100620420333616
- 0.3809405025237201
- 0.3912521447186565
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 31.7412250739116
- type: v_measures
value:
- 0.31156273517579985
- 0.31497713177719505
- 0.3211720123203406
- 0.30456845682253647
- 0.3152485096373301
- 0.32328632147728803
- 0.3114059814606084
- 0.32290781970290505
- 0.31626398941398964
- 0.3327295496031667
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.7776266029616
- type: mrr
value: 32.9057970138914
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 24.78675
- type: map_at_10
value: 33.18391666666666
- type: map_at_100
value: 34.34583333333333
- type: map_at_1000
value: 34.46825
- type: map_at_20
value: 33.819
- type: map_at_3
value: 30.636500000000005
- type: map_at_5
value: 32.02091666666667
- type: mrr_at_1
value: 29.478749999999998
- type: mrr_at_10
value: 37.385
- type: mrr_at_100
value: 38.23491666666667
- type: mrr_at_1000
value: 38.298833333333334
- type: mrr_at_20
value: 37.87508333333333
- type: mrr_at_3
value: 35.089666666666666
- type: mrr_at_5
value: 36.36816666666667
- type: ndcg_at_1
value: 29.478749999999998
- type: ndcg_at_10
value: 38.2035
- type: ndcg_at_100
value: 43.301083333333324
- type: ndcg_at_1000
value: 45.758666666666656
- type: ndcg_at_20
value: 40.15116666666667
- type: ndcg_at_3
value: 33.86033333333334
- type: ndcg_at_5
value: 35.81266666666666
- type: precision_at_1
value: 29.478749999999998
- type: precision_at_10
value: 6.642833333333334
- type: precision_at_100
value: 1.08425
- type: precision_at_1000
value: 0.14850000000000002
- type: precision_at_20
value: 3.948083333333334
- type: precision_at_3
value: 15.511
- type: precision_at_5
value: 10.929833333333333
- type: recall_at_1
value: 24.78675
- type: recall_at_10
value: 48.9305
- type: recall_at_100
value: 71.49416666666666
- type: recall_at_1000
value: 88.54375
- type: recall_at_20
value: 56.06475
- type: recall_at_3
value: 36.66891666666666
- type: recall_at_5
value: 41.790499999999994
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 30.793
- type: map_at_10
value: 42.254000000000005
- type: map_at_100
value: 43.569
- type: map_at_1000
value: 43.714999999999996
- type: map_at_20
value: 42.994
- type: map_at_3
value: 39.007999999999996
- type: map_at_5
value: 40.488
- type: mrr_at_1
value: 38.34
- type: mrr_at_10
value: 48.274
- type: mrr_at_100
value: 48.946
- type: mrr_at_1000
value: 49.001
- type: mrr_at_20
value: 48.701
- type: mrr_at_3
value: 45.756
- type: mrr_at_5
value: 47.036
- type: ndcg_at_1
value: 38.34
- type: ndcg_at_10
value: 48.622
- type: ndcg_at_100
value: 53.288999999999994
- type: ndcg_at_1000
value: 55.614
- type: ndcg_at_20
value: 50.495000000000005
- type: ndcg_at_3
value: 43.852999999999994
- type: ndcg_at_5
value: 45.442
- type: precision_at_1
value: 38.34
- type: precision_at_10
value: 9.413
- type: precision_at_100
value: 1.4749999999999999
- type: precision_at_1000
value: 0.19499999999999998
- type: precision_at_20
value: 5.494000000000001
- type: precision_at_3
value: 20.935000000000002
- type: precision_at_5
value: 14.735000000000001
- type: recall_at_1
value: 30.793
- type: recall_at_10
value: 60.455000000000005
- type: recall_at_100
value: 80.061
- type: recall_at_1000
value: 95.322
- type: recall_at_20
value: 67.27
- type: recall_at_3
value: 46.296
- type: recall_at_5
value: 51.139
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 27.93
- type: map_at_10
value: 36.085
- type: map_at_100
value: 37.192
- type: map_at_1000
value: 37.324
- type: map_at_20
value: 36.614999999999995
- type: map_at_3
value: 33.452
- type: map_at_5
value: 35.088
- type: mrr_at_1
value: 34.777
- type: mrr_at_10
value: 41.865
- type: mrr_at_100
value: 42.518
- type: mrr_at_1000
value: 42.571
- type: mrr_at_20
value: 42.219
- type: mrr_at_3
value: 39.628
- type: mrr_at_5
value: 41.038999999999994
- type: ndcg_at_1
value: 34.777
- type: ndcg_at_10
value: 41.095
- type: ndcg_at_100
value: 45.286
- type: ndcg_at_1000
value: 47.656
- type: ndcg_at_20
value: 42.472
- type: ndcg_at_3
value: 37.349
- type: ndcg_at_5
value: 39.318
- type: precision_at_1
value: 34.777
- type: precision_at_10
value: 7.617999999999999
- type: precision_at_100
value: 1.242
- type: precision_at_1000
value: 0.173
- type: precision_at_20
value: 4.481
- type: precision_at_3
value: 17.771
- type: precision_at_5
value: 12.687999999999999
- type: recall_at_1
value: 27.93
- type: recall_at_10
value: 49.464000000000006
- type: recall_at_100
value: 67.64099999999999
- type: recall_at_1000
value: 83.066
- type: recall_at_20
value: 54.452999999999996
- type: recall_at_3
value: 38.157000000000004
- type: recall_at_5
value: 43.829
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 37.332
- type: map_at_10
value: 49.146
- type: map_at_100
value: 50.222
- type: map_at_1000
value: 50.281
- type: map_at_20
value: 49.802
- type: map_at_3
value: 46.264
- type: map_at_5
value: 47.912
- type: mrr_at_1
value: 43.009
- type: mrr_at_10
value: 52.586999999999996
- type: mrr_at_100
value: 53.323
- type: mrr_at_1000
value: 53.352999999999994
- type: mrr_at_20
value: 53.04299999999999
- type: mrr_at_3
value: 50.261
- type: mrr_at_5
value: 51.615
- type: ndcg_at_1
value: 43.009
- type: ndcg_at_10
value: 54.652
- type: ndcg_at_100
value: 58.918000000000006
- type: ndcg_at_1000
value: 60.172000000000004
- type: ndcg_at_20
value: 56.554
- type: ndcg_at_3
value: 49.757
- type: ndcg_at_5
value: 52.169
- type: precision_at_1
value: 43.009
- type: precision_at_10
value: 8.715
- type: precision_at_100
value: 1.1780000000000002
- type: precision_at_1000
value: 0.133
- type: precision_at_20
value: 4.931
- type: precision_at_3
value: 22.153
- type: precision_at_5
value: 15.146999999999998
- type: recall_at_1
value: 37.332
- type: recall_at_10
value: 67.55600000000001
- type: recall_at_100
value: 85.885
- type: recall_at_1000
value: 94.87400000000001
- type: recall_at_20
value: 74.568
- type: recall_at_3
value: 54.419
- type: recall_at_5
value: 60.288
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 24.09
- type: map_at_10
value: 32.608
- type: map_at_100
value: 33.571
- type: map_at_1000
value: 33.668
- type: map_at_20
value: 33.181
- type: map_at_3
value: 30.091
- type: map_at_5
value: 31.518
- type: mrr_at_1
value: 25.763
- type: mrr_at_10
value: 34.25
- type: mrr_at_100
value: 35.134
- type: mrr_at_1000
value: 35.207
- type: mrr_at_20
value: 34.78
- type: mrr_at_3
value: 31.807999999999996
- type: mrr_at_5
value: 33.198
- type: ndcg_at_1
value: 25.763
- type: ndcg_at_10
value: 37.305
- type: ndcg_at_100
value: 42.114000000000004
- type: ndcg_at_1000
value: 44.467
- type: ndcg_at_20
value: 39.272
- type: ndcg_at_3
value: 32.405
- type: ndcg_at_5
value: 34.775
- type: precision_at_1
value: 25.763
- type: precision_at_10
value: 5.729
- type: precision_at_100
value: 0.853
- type: precision_at_1000
value: 0.109
- type: precision_at_20
value: 3.3329999999999997
- type: precision_at_3
value: 13.71
- type: precision_at_5
value: 9.65
- type: recall_at_1
value: 24.09
- type: recall_at_10
value: 50.161
- type: recall_at_100
value: 72.419
- type: recall_at_1000
value: 89.983
- type: recall_at_20
value: 57.53
- type: recall_at_3
value: 36.961
- type: recall_at_5
value: 42.568
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 16.333000000000002
- type: map_at_10
value: 23.352999999999998
- type: map_at_100
value: 24.618000000000002
- type: map_at_1000
value: 24.743000000000002
- type: map_at_20
value: 24.117
- type: map_at_3
value: 21.013
- type: map_at_5
value: 22.259
- type: mrr_at_1
value: 20.398
- type: mrr_at_10
value: 28.28
- type: mrr_at_100
value: 29.307
- type: mrr_at_1000
value: 29.381
- type: mrr_at_20
value: 28.955
- type: mrr_at_3
value: 25.933
- type: mrr_at_5
value: 27.114
- type: ndcg_at_1
value: 20.398
- type: ndcg_at_10
value: 28.359
- type: ndcg_at_100
value: 34.178999999999995
- type: ndcg_at_1000
value: 37.112
- type: ndcg_at_20
value: 30.982
- type: ndcg_at_3
value: 24.104999999999997
- type: ndcg_at_5
value: 25.877
- type: precision_at_1
value: 20.398
- type: precision_at_10
value: 5.2490000000000006
- type: precision_at_100
value: 0.927
- type: precision_at_1000
value: 0.131
- type: precision_at_20
value: 3.3520000000000003
- type: precision_at_3
value: 11.733
- type: precision_at_5
value: 8.433
- type: recall_at_1
value: 16.333000000000002
- type: recall_at_10
value: 39.082
- type: recall_at_100
value: 64.269
- type: recall_at_1000
value: 85.103
- type: recall_at_20
value: 48.625
- type: recall_at_3
value: 26.740000000000002
- type: recall_at_5
value: 31.519000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 26.857999999999997
- type: map_at_10
value: 36.258
- type: map_at_100
value: 37.556
- type: map_at_1000
value: 37.669999999999995
- type: map_at_20
value: 36.937
- type: map_at_3
value: 33.306000000000004
- type: map_at_5
value: 35.004999999999995
- type: mrr_at_1
value: 33.397
- type: mrr_at_10
value: 42.089
- type: mrr_at_100
value: 42.864999999999995
- type: mrr_at_1000
value: 42.915
- type: mrr_at_20
value: 42.510999999999996
- type: mrr_at_3
value: 39.413
- type: mrr_at_5
value: 40.905
- type: ndcg_at_1
value: 33.397
- type: ndcg_at_10
value: 42.062
- type: ndcg_at_100
value: 47.620000000000005
- type: ndcg_at_1000
value: 49.816
- type: ndcg_at_20
value: 44.096999999999994
- type: ndcg_at_3
value: 37.165
- type: ndcg_at_5
value: 39.493
- type: precision_at_1
value: 33.397
- type: precision_at_10
value: 7.5649999999999995
- type: precision_at_100
value: 1.224
- type: precision_at_1000
value: 0.16
- type: precision_at_20
value: 4.495
- type: precision_at_3
value: 17.613
- type: precision_at_5
value: 12.589
- type: recall_at_1
value: 26.857999999999997
- type: recall_at_10
value: 53.900000000000006
- type: recall_at_100
value: 77.595
- type: recall_at_1000
value: 92.116
- type: recall_at_20
value: 60.962
- type: recall_at_3
value: 39.799
- type: recall_at_5
value: 45.961
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 24.131
- type: map_at_10
value: 33.016
- type: map_at_100
value: 34.263
- type: map_at_1000
value: 34.39
- type: map_at_20
value: 33.703
- type: map_at_3
value: 30.055
- type: map_at_5
value: 31.651
- type: mrr_at_1
value: 30.593999999999998
- type: mrr_at_10
value: 38.786
- type: mrr_at_100
value: 39.674
- type: mrr_at_1000
value: 39.739000000000004
- type: mrr_at_20
value: 39.322
- type: mrr_at_3
value: 36.32
- type: mrr_at_5
value: 37.787
- type: ndcg_at_1
value: 30.593999999999998
- type: ndcg_at_10
value: 38.606
- type: ndcg_at_100
value: 44.116
- type: ndcg_at_1000
value: 46.772999999999996
- type: ndcg_at_20
value: 40.775
- type: ndcg_at_3
value: 33.854
- type: ndcg_at_5
value: 35.957
- type: precision_at_1
value: 30.593999999999998
- type: precision_at_10
value: 7.112
- type: precision_at_100
value: 1.154
- type: precision_at_1000
value: 0.155
- type: precision_at_20
value: 4.2410000000000005
- type: precision_at_3
value: 16.323999999999998
- type: precision_at_5
value: 11.644
- type: recall_at_1
value: 24.131
- type: recall_at_10
value: 49.767
- type: recall_at_100
value: 73.57000000000001
- type: recall_at_1000
value: 91.842
- type: recall_at_20
value: 57.498000000000005
- type: recall_at_3
value: 35.888
- type: recall_at_5
value: 41.801
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 23.075000000000003
- type: map_at_10
value: 29.584
- type: map_at_100
value: 30.4
- type: map_at_1000
value: 30.501
- type: map_at_20
value: 30.051
- type: map_at_3
value: 27.561000000000003
- type: map_at_5
value: 28.603
- type: mrr_at_1
value: 26.227
- type: mrr_at_10
value: 32.647
- type: mrr_at_100
value: 33.391999999999996
- type: mrr_at_1000
value: 33.469
- type: mrr_at_20
value: 33.053
- type: mrr_at_3
value: 30.776999999999997
- type: mrr_at_5
value: 31.828
- type: ndcg_at_1
value: 26.227
- type: ndcg_at_10
value: 33.582
- type: ndcg_at_100
value: 37.814
- type: ndcg_at_1000
value: 40.444
- type: ndcg_at_20
value: 35.163
- type: ndcg_at_3
value: 29.874000000000002
- type: ndcg_at_5
value: 31.53
- type: precision_at_1
value: 26.227
- type: precision_at_10
value: 5.244999999999999
- type: precision_at_100
value: 0.788
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_20
value: 3.006
- type: precision_at_3
value: 12.73
- type: precision_at_5
value: 8.741999999999999
- type: recall_at_1
value: 23.075000000000003
- type: recall_at_10
value: 42.894
- type: recall_at_100
value: 62.721000000000004
- type: recall_at_1000
value: 81.858
- type: recall_at_20
value: 48.842
- type: recall_at_3
value: 32.783
- type: recall_at_5
value: 36.949
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 17.028
- type: map_at_10
value: 23.377
- type: map_at_100
value: 24.399
- type: map_at_1000
value: 24.524
- type: map_at_20
value: 23.863
- type: map_at_3
value: 21.274
- type: map_at_5
value: 22.431
- type: mrr_at_1
value: 20.578
- type: mrr_at_10
value: 27.009
- type: mrr_at_100
value: 27.889999999999997
- type: mrr_at_1000
value: 27.969
- type: mrr_at_20
value: 27.46
- type: mrr_at_3
value: 24.959999999999997
- type: mrr_at_5
value: 26.113999999999997
- type: ndcg_at_1
value: 20.578
- type: ndcg_at_10
value: 27.522999999999996
- type: ndcg_at_100
value: 32.601
- type: ndcg_at_1000
value: 35.636
- type: ndcg_at_20
value: 29.132
- type: ndcg_at_3
value: 23.771
- type: ndcg_at_5
value: 25.539
- type: precision_at_1
value: 20.578
- type: precision_at_10
value: 4.962
- type: precision_at_100
value: 0.8880000000000001
- type: precision_at_1000
value: 0.132
- type: precision_at_20
value: 2.959
- type: precision_at_3
value: 11.068999999999999
- type: precision_at_5
value: 8.052
- type: recall_at_1
value: 17.028
- type: recall_at_10
value: 36.266
- type: recall_at_100
value: 59.556
- type: recall_at_1000
value: 81.416
- type: recall_at_20
value: 42.303000000000004
- type: recall_at_3
value: 25.858999999999998
- type: recall_at_5
value: 30.422
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 25.863000000000003
- type: map_at_10
value: 33.586
- type: map_at_100
value: 34.682
- type: map_at_1000
value: 34.791
- type: map_at_20
value: 34.182
- type: map_at_3
value: 31.044
- type: map_at_5
value: 32.507000000000005
- type: mrr_at_1
value: 30.131000000000004
- type: mrr_at_10
value: 37.518
- type: mrr_at_100
value: 38.355
- type: mrr_at_1000
value: 38.425
- type: mrr_at_20
value: 37.961
- type: mrr_at_3
value: 35.059000000000005
- type: mrr_at_5
value: 36.528
- type: ndcg_at_1
value: 30.131000000000004
- type: ndcg_at_10
value: 38.387
- type: ndcg_at_100
value: 43.617
- type: ndcg_at_1000
value: 46.038000000000004
- type: ndcg_at_20
value: 40.261
- type: ndcg_at_3
value: 33.722
- type: ndcg_at_5
value: 36.013
- type: precision_at_1
value: 30.131000000000004
- type: precision_at_10
value: 6.297
- type: precision_at_100
value: 1.008
- type: precision_at_1000
value: 0.132
- type: precision_at_20
value: 3.689
- type: precision_at_3
value: 15.049999999999999
- type: precision_at_5
value: 10.634
- type: recall_at_1
value: 25.863000000000003
- type: recall_at_10
value: 49.101
- type: recall_at_100
value: 72.286
- type: recall_at_1000
value: 89.14
- type: recall_at_20
value: 55.742999999999995
- type: recall_at_3
value: 36.513
- type: recall_at_5
value: 42.204
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 24.747
- type: map_at_10
value: 32.067
- type: map_at_100
value: 33.739999999999995
- type: map_at_1000
value: 33.952
- type: map_at_20
value: 32.927
- type: map_at_3
value: 29.736
- type: map_at_5
value: 30.996000000000002
- type: mrr_at_1
value: 29.644
- type: mrr_at_10
value: 36.683
- type: mrr_at_100
value: 37.808
- type: mrr_at_1000
value: 37.858999999999995
- type: mrr_at_20
value: 37.326
- type: mrr_at_3
value: 34.42
- type: mrr_at_5
value: 35.626000000000005
- type: ndcg_at_1
value: 29.644
- type: ndcg_at_10
value: 36.989
- type: ndcg_at_100
value: 43.589
- type: ndcg_at_1000
value: 46.133
- type: ndcg_at_20
value: 39.403
- type: ndcg_at_3
value: 33.273
- type: ndcg_at_5
value: 34.853
- type: precision_at_1
value: 29.644
- type: precision_at_10
value: 6.8180000000000005
- type: precision_at_100
value: 1.4529999999999998
- type: precision_at_1000
value: 0.23500000000000001
- type: precision_at_20
value: 4.457
- type: precision_at_3
value: 15.152
- type: precision_at_5
value: 10.711
- type: recall_at_1
value: 24.747
- type: recall_at_10
value: 45.714
- type: recall_at_100
value: 75.212
- type: recall_at_1000
value: 90.884
- type: recall_at_20
value: 54.777
- type: recall_at_3
value: 34.821999999999996
- type: recall_at_5
value: 39.278999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 19.261
- type: map_at_10
value: 26.873
- type: map_at_100
value: 27.938000000000002
- type: map_at_1000
value: 28.060000000000002
- type: map_at_20
value: 27.456000000000003
- type: map_at_3
value: 24.834
- type: map_at_5
value: 25.793
- type: mrr_at_1
value: 20.887
- type: mrr_at_10
value: 28.634999999999998
- type: mrr_at_100
value: 29.609
- type: mrr_at_1000
value: 29.698999999999998
- type: mrr_at_20
value: 29.173
- type: mrr_at_3
value: 26.741
- type: mrr_at_5
value: 27.628000000000004
- type: ndcg_at_1
value: 20.887
- type: ndcg_at_10
value: 31.261
- type: ndcg_at_100
value: 36.471
- type: ndcg_at_1000
value: 39.245000000000005
- type: ndcg_at_20
value: 33.209
- type: ndcg_at_3
value: 27.195999999999998
- type: ndcg_at_5
value: 28.786
- type: precision_at_1
value: 20.887
- type: precision_at_10
value: 4.9910000000000005
- type: precision_at_100
value: 0.8210000000000001
- type: precision_at_1000
value: 0.116
- type: precision_at_20
value: 2.939
- type: precision_at_3
value: 11.892
- type: precision_at_5
value: 8.133
- type: recall_at_1
value: 19.261
- type: recall_at_10
value: 42.806
- type: recall_at_100
value: 66.715
- type: recall_at_1000
value: 86.921
- type: recall_at_20
value: 50.205999999999996
- type: recall_at_3
value: 31.790000000000003
- type: recall_at_5
value: 35.527
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 9.009
- type: map_at_10
value: 14.629
- type: map_at_100
value: 16.092000000000002
- type: map_at_1000
value: 16.267
- type: map_at_20
value: 15.384999999999998
- type: map_at_3
value: 12.280000000000001
- type: map_at_5
value: 13.442000000000002
- type: mrr_at_1
value: 20.0
- type: mrr_at_10
value: 29.298000000000002
- type: mrr_at_100
value: 30.375999999999998
- type: mrr_at_1000
value: 30.436999999999998
- type: mrr_at_20
value: 29.956
- type: mrr_at_3
value: 26.362999999999996
- type: mrr_at_5
value: 28.021
- type: ndcg_at_1
value: 20.0
- type: ndcg_at_10
value: 21.234
- type: ndcg_at_100
value: 27.687
- type: ndcg_at_1000
value: 31.325999999999997
- type: ndcg_at_20
value: 23.631
- type: ndcg_at_3
value: 17.101
- type: ndcg_at_5
value: 18.501
- type: precision_at_1
value: 20.0
- type: precision_at_10
value: 6.651
- type: precision_at_100
value: 1.347
- type: precision_at_1000
value: 0.201
- type: precision_at_20
value: 4.316
- type: precision_at_3
value: 12.53
- type: precision_at_5
value: 9.707
- type: recall_at_1
value: 9.009
- type: recall_at_10
value: 25.824
- type: recall_at_100
value: 48.535000000000004
- type: recall_at_1000
value: 69.44399999999999
- type: recall_at_20
value: 32.78
- type: recall_at_3
value: 15.693999999999999
- type: recall_at_5
value: 19.59
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 7.454
- type: map_at_10
value: 15.675
- type: map_at_100
value: 21.335
- type: map_at_1000
value: 22.639
- type: map_at_20
value: 17.822
- type: map_at_3
value: 11.609
- type: map_at_5
value: 13.342
- type: mrr_at_1
value: 56.25
- type: mrr_at_10
value: 65.30799999999999
- type: mrr_at_100
value: 65.90599999999999
- type: mrr_at_1000
value: 65.92099999999999
- type: mrr_at_20
value: 65.74600000000001
- type: mrr_at_3
value: 63.333
- type: mrr_at_5
value: 64.521
- type: ndcg_at_1
value: 44.625
- type: ndcg_at_10
value: 33.881
- type: ndcg_at_100
value: 37.775999999999996
- type: ndcg_at_1000
value: 44.956
- type: ndcg_at_20
value: 33.451
- type: ndcg_at_3
value: 37.72
- type: ndcg_at_5
value: 35.811
- type: precision_at_1
value: 56.25
- type: precision_at_10
value: 27.175
- type: precision_at_100
value: 8.448
- type: precision_at_1000
value: 1.809
- type: precision_at_20
value: 20.262
- type: precision_at_3
value: 41.333
- type: precision_at_5
value: 35.199999999999996
- type: recall_at_1
value: 7.454
- type: recall_at_10
value: 20.355999999999998
- type: recall_at_100
value: 43.168
- type: recall_at_1000
value: 66.559
- type: recall_at_20
value: 26.785999999999998
- type: recall_at_3
value: 13.052
- type: recall_at_5
value: 15.733
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 45.44499999999999
- type: f1
value: 40.581418056070994
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 46.339000000000006
- type: map_at_10
value: 57.87
- type: map_at_100
value: 58.447
- type: map_at_1000
value: 58.474000000000004
- type: map_at_20
value: 58.241
- type: map_at_3
value: 55.336
- type: map_at_5
value: 56.879000000000005
- type: mrr_at_1
value: 49.91
- type: mrr_at_10
value: 61.55199999999999
- type: mrr_at_100
value: 62.07
- type: mrr_at_1000
value: 62.086
- type: mrr_at_20
value: 61.899
- type: mrr_at_3
value: 59.108000000000004
- type: mrr_at_5
value: 60.622
- type: ndcg_at_1
value: 49.91
- type: ndcg_at_10
value: 63.970000000000006
- type: ndcg_at_100
value: 66.625
- type: ndcg_at_1000
value: 67.221
- type: ndcg_at_20
value: 65.261
- type: ndcg_at_3
value: 59.059
- type: ndcg_at_5
value: 61.68900000000001
- type: precision_at_1
value: 49.91
- type: precision_at_10
value: 8.699
- type: precision_at_100
value: 1.015
- type: precision_at_1000
value: 0.108
- type: precision_at_20
value: 4.6370000000000005
- type: precision_at_3
value: 23.942
- type: precision_at_5
value: 15.815000000000001
- type: recall_at_1
value: 46.339000000000006
- type: recall_at_10
value: 79.28
- type: recall_at_100
value: 91.148
- type: recall_at_1000
value: 95.438
- type: recall_at_20
value: 84.187
- type: recall_at_3
value: 66.019
- type: recall_at_5
value: 72.394
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 14.504
- type: map_at_10
value: 24.099999999999998
- type: map_at_100
value: 25.820999999999998
- type: map_at_1000
value: 25.997999999999998
- type: map_at_20
value: 25.003999999999998
- type: map_at_3
value: 21.218999999999998
- type: map_at_5
value: 22.744
- type: mrr_at_1
value: 29.475
- type: mrr_at_10
value: 38.072
- type: mrr_at_100
value: 39.196999999999996
- type: mrr_at_1000
value: 39.249
- type: mrr_at_20
value: 38.757999999999996
- type: mrr_at_3
value: 36.214
- type: mrr_at_5
value: 37.094
- type: ndcg_at_1
value: 29.475
- type: ndcg_at_10
value: 30.708999999999996
- type: ndcg_at_100
value: 37.744
- type: ndcg_at_1000
value: 41.215
- type: ndcg_at_20
value: 33.336
- type: ndcg_at_3
value: 28.243000000000002
- type: ndcg_at_5
value: 28.62
- type: precision_at_1
value: 29.475
- type: precision_at_10
value: 8.596
- type: precision_at_100
value: 1.562
- type: precision_at_1000
value: 0.219
- type: precision_at_20
value: 5.394
- type: precision_at_3
value: 19.084
- type: precision_at_5
value: 13.672999999999998
- type: recall_at_1
value: 14.504
- type: recall_at_10
value: 36.232
- type: recall_at_100
value: 62.712
- type: recall_at_1000
value: 83.864
- type: recall_at_20
value: 44.357
- type: recall_at_3
value: 26.029000000000003
- type: recall_at_5
value: 29.909000000000002
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 31.634
- type: map_at_10
value: 45.007000000000005
- type: map_at_100
value: 45.963
- type: map_at_1000
value: 46.052
- type: map_at_20
value: 45.550000000000004
- type: map_at_3
value: 42.092
- type: map_at_5
value: 43.832
- type: mrr_at_1
value: 63.268
- type: mrr_at_10
value: 70.691
- type: mrr_at_100
value: 71.063
- type: mrr_at_1000
value: 71.082
- type: mrr_at_20
value: 70.917
- type: mrr_at_3
value: 69.176
- type: mrr_at_5
value: 70.132
- type: ndcg_at_1
value: 63.268
- type: ndcg_at_10
value: 54.205000000000005
- type: ndcg_at_100
value: 57.847
- type: ndcg_at_1000
value: 59.64
- type: ndcg_at_20
value: 55.663
- type: ndcg_at_3
value: 49.613
- type: ndcg_at_5
value: 52.054
- type: precision_at_1
value: 63.268
- type: precision_at_10
value: 11.357000000000001
- type: precision_at_100
value: 1.423
- type: precision_at_1000
value: 0.166
- type: precision_at_20
value: 6.148
- type: precision_at_3
value: 31.041999999999998
- type: precision_at_5
value: 20.551
- type: recall_at_1
value: 31.634
- type: recall_at_10
value: 56.786
- type: recall_at_100
value: 71.128
- type: recall_at_1000
value: 82.97099999999999
- type: recall_at_20
value: 61.47899999999999
- type: recall_at_3
value: 46.563
- type: recall_at_5
value: 51.376999999999995
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 80.7996
- type: ap
value: 74.98592172204835
- type: f1
value: 80.77161545117626
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 16.637
- type: map_at_10
value: 27.331
- type: map_at_100
value: 28.518
- type: map_at_1000
value: 28.583
- type: map_at_20
value: 28.031
- type: map_at_3
value: 23.715
- type: map_at_5
value: 25.758
- type: mrr_at_1
value: 17.077
- type: mrr_at_10
value: 27.807
- type: mrr_at_100
value: 28.965999999999998
- type: mrr_at_1000
value: 29.025000000000002
- type: mrr_at_20
value: 28.499999999999996
- type: mrr_at_3
value: 24.234
- type: mrr_at_5
value: 26.257
- type: ndcg_at_1
value: 17.077
- type: ndcg_at_10
value: 33.607
- type: ndcg_at_100
value: 39.593
- type: ndcg_at_1000
value: 41.317
- type: ndcg_at_20
value: 36.118
- type: ndcg_at_3
value: 26.204
- type: ndcg_at_5
value: 29.862
- type: precision_at_1
value: 17.077
- type: precision_at_10
value: 5.54
- type: precision_at_100
value: 0.857
- type: precision_at_1000
value: 0.101
- type: precision_at_20
value: 3.2870000000000004
- type: precision_at_3
value: 11.361
- type: precision_at_5
value: 8.673
- type: recall_at_1
value: 16.637
- type: recall_at_10
value: 53.077
- type: recall_at_100
value: 81.306
- type: recall_at_1000
value: 94.72699999999999
- type: recall_at_20
value: 62.855000000000004
- type: recall_at_3
value: 32.897999999999996
- type: recall_at_5
value: 41.697
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 92.12494300045599
- type: f1
value: 91.6522604757574
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 71.86046511627907
- type: f1
value: 53.8926541769729
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.34633490248824
- type: f1
value: 67.94196699295675
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.88903833221251
- type: f1
value: 74.54991713265153
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.129785771060526
- type: v_measures
value:
- 0.3116408980465631
- 0.31900622847630045
- 0.31934151231927727
- 0.3186791563176499
- 0.32750328333726775
- 0.3510627418495332
- 0.33347506212887845
- 0.35025343435496104
- 0.3417862644568677
- 0.3402299958187535
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.29367725266166
- type: v_measures
value:
- 0.2892892644106019
- 0.2904909862243706
- 0.29717543408443786
- 0.28841424958079537
- 0.2946040279701031
- 0.3071795420433026
- 0.30471220279454575
- 0.31753537687383027
- 0.318823343042763
- 0.32114329824141535
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 5.542
- type: map_at_10
value: 11.734
- type: map_at_100
value: 14.812
- type: map_at_1000
value: 16.184
- type: map_at_20
value: 13.045000000000002
- type: map_at_3
value: 8.859
- type: map_at_5
value: 10.162
- type: mrr_at_1
value: 43.963
- type: mrr_at_10
value: 51.914
- type: mrr_at_100
value: 52.422000000000004
- type: mrr_at_1000
value: 52.479
- type: mrr_at_20
value: 52.215
- type: mrr_at_3
value: 49.897000000000006
- type: mrr_at_5
value: 50.965
- type: ndcg_at_1
value: 42.105
- type: ndcg_at_10
value: 32.035000000000004
- type: ndcg_at_100
value: 29.487999999999996
- type: ndcg_at_1000
value: 38.316
- type: ndcg_at_20
value: 30.255
- type: ndcg_at_3
value: 37.098
- type: ndcg_at_5
value: 34.98
- type: precision_at_1
value: 43.344
- type: precision_at_10
value: 23.313
- type: precision_at_100
value: 7.591
- type: precision_at_1000
value: 2.023
- type: precision_at_20
value: 17.755000000000003
- type: precision_at_3
value: 33.745999999999995
- type: precision_at_5
value: 29.474
- type: recall_at_1
value: 5.542
- type: recall_at_10
value: 15.61
- type: recall_at_100
value: 29.413
- type: recall_at_1000
value: 61.926
- type: recall_at_20
value: 19.517
- type: recall_at_3
value: 9.669
- type: recall_at_5
value: 11.772
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 21.590999999999998
- type: map_at_10
value: 35.088
- type: map_at_100
value: 36.386
- type: map_at_1000
value: 36.439
- type: map_at_20
value: 35.93
- type: map_at_3
value: 30.985000000000003
- type: map_at_5
value: 33.322
- type: mrr_at_1
value: 24.189
- type: mrr_at_10
value: 37.395
- type: mrr_at_100
value: 38.449
- type: mrr_at_1000
value: 38.486
- type: mrr_at_20
value: 38.092999999999996
- type: mrr_at_3
value: 33.686
- type: mrr_at_5
value: 35.861
- type: ndcg_at_1
value: 24.189
- type: ndcg_at_10
value: 42.471
- type: ndcg_at_100
value: 48.150999999999996
- type: ndcg_at_1000
value: 49.342000000000006
- type: ndcg_at_20
value: 45.245000000000005
- type: ndcg_at_3
value: 34.483000000000004
- type: ndcg_at_5
value: 38.505
- type: precision_at_1
value: 24.189
- type: precision_at_10
value: 7.3870000000000005
- type: precision_at_100
value: 1.056
- type: precision_at_1000
value: 0.117
- type: precision_at_20
value: 4.35
- type: precision_at_3
value: 16.009999999999998
- type: precision_at_5
value: 11.883000000000001
- type: recall_at_1
value: 21.590999999999998
- type: recall_at_10
value: 62.79
- type: recall_at_100
value: 87.71
- type: recall_at_1000
value: 96.418
- type: recall_at_20
value: 73.042
- type: recall_at_3
value: 41.876999999999995
- type: recall_at_5
value: 51.205
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 68.31099999999999
- type: map_at_10
value: 81.845
- type: map_at_100
value: 82.518
- type: map_at_1000
value: 82.541
- type: map_at_20
value: 82.292
- type: map_at_3
value: 78.827
- type: map_at_5
value: 80.715
- type: mrr_at_1
value: 78.62
- type: mrr_at_10
value: 85.42
- type: mrr_at_100
value: 85.54899999999999
- type: mrr_at_1000
value: 85.55
- type: mrr_at_20
value: 85.516
- type: mrr_at_3
value: 84.265
- type: mrr_at_5
value: 85.021
- type: ndcg_at_1
value: 78.63
- type: ndcg_at_10
value: 86.032
- type: ndcg_at_100
value: 87.50099999999999
- type: ndcg_at_1000
value: 87.67200000000001
- type: ndcg_at_20
value: 86.822
- type: ndcg_at_3
value: 82.813
- type: ndcg_at_5
value: 84.555
- type: precision_at_1
value: 78.63
- type: precision_at_10
value: 13.025999999999998
- type: precision_at_100
value: 1.504
- type: precision_at_1000
value: 0.156
- type: precision_at_20
value: 6.944999999999999
- type: precision_at_3
value: 36.013
- type: precision_at_5
value: 23.788
- type: recall_at_1
value: 68.31099999999999
- type: recall_at_10
value: 94.003
- type: recall_at_100
value: 99.11999999999999
- type: recall_at_1000
value: 99.923
- type: recall_at_20
value: 96.55799999999999
- type: recall_at_3
value: 84.836
- type: recall_at_5
value: 89.655
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 47.52530454226057
- type: v_measures
value:
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- 0.42565458494862596
- 0.4278942125559693
- 0.450724893645709
- 0.6135871494667406
- 0.4720579979931778
- 0.44289391670014056
- 0.47757401852125586
- 0.5247425354540537
- 0.4204113161707625
- 0.46730199475875295
- 0.44060686916417374
- 0.40965236253971965
- 0.5406478376242424
- 0.4258020776189897
- 0.45263355666588695
- 0.4485852520776176
- 0.45776058545875725
- 0.5163652480866036
- 0.4839337312350155
- 0.4787997358105262
- 0.5744729237665975
- 0.4250543347829616
- 0.49829072714687295
- 0.5853438771525417
- 0.4205343962194473
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 56.028612066452
- type: v_measures
value:
- 0.616850986362034
- 0.6156955011870908
- 0.5889048703354965
- 0.3132434489631298
- 0.6351476398732859
- 0.5618708165569017
- 0.2892441818894155
- 0.678005863237291
- 0.6308488746145553
- 0.6730490236260003
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 4.108
- type: map_at_10
value: 10.953
- type: map_at_100
value: 13.004
- type: map_at_1000
value: 13.303
- type: map_at_20
value: 12.004
- type: map_at_3
value: 7.754999999999999
- type: map_at_5
value: 9.19
- type: mrr_at_1
value: 20.200000000000003
- type: mrr_at_10
value: 31.069999999999997
- type: mrr_at_100
value: 32.222
- type: mrr_at_1000
value: 32.277
- type: mrr_at_20
value: 31.761
- type: mrr_at_3
value: 27.717000000000002
- type: mrr_at_5
value: 29.416999999999998
- type: ndcg_at_1
value: 20.200000000000003
- type: ndcg_at_10
value: 18.636
- type: ndcg_at_100
value: 26.442
- type: ndcg_at_1000
value: 31.828
- type: ndcg_at_20
value: 21.441
- type: ndcg_at_3
value: 17.323
- type: ndcg_at_5
value: 15.010000000000002
- type: precision_at_1
value: 20.200000000000003
- type: precision_at_10
value: 9.9
- type: precision_at_100
value: 2.106
- type: precision_at_1000
value: 0.33999999999999997
- type: precision_at_20
value: 6.575
- type: precision_at_3
value: 16.367
- type: precision_at_5
value: 13.200000000000001
- type: recall_at_1
value: 4.108
- type: recall_at_10
value: 20.052
- type: recall_at_100
value: 42.723
- type: recall_at_1000
value: 69.118
- type: recall_at_20
value: 26.662999999999997
- type: recall_at_3
value: 9.963
- type: recall_at_5
value: 13.377
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 81.73133871784073
- type: cos_sim_spearman
value: 75.63155962642634
- type: euclidean_pearson
value: 78.84721858652286
- type: euclidean_spearman
value: 75.52150847464515
- type: manhattan_pearson
value: 78.65433033180727
- type: manhattan_spearman
value: 75.30995832884881
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 75.66063073145264
- type: cos_sim_spearman
value: 68.58158236004101
- type: euclidean_pearson
value: 72.54019756825143
- type: euclidean_spearman
value: 69.05526621955067
- type: manhattan_pearson
value: 72.69442494173272
- type: manhattan_spearman
value: 69.24310689645435
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 79.93061145846976
- type: cos_sim_spearman
value: 80.54473705232682
- type: euclidean_pearson
value: 80.25598213392439
- type: euclidean_spearman
value: 80.57639468906437
- type: manhattan_pearson
value: 80.04739474388745
- type: manhattan_spearman
value: 80.35672978503159
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 80.63106651366024
- type: cos_sim_spearman
value: 77.628680514703
- type: euclidean_pearson
value: 79.88625241187461
- type: euclidean_spearman
value: 77.80535399731345
- type: manhattan_pearson
value: 79.78810133011544
- type: manhattan_spearman
value: 77.73028091841451
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 85.30832602658512
- type: cos_sim_spearman
value: 86.15687211744392
- type: euclidean_pearson
value: 85.94586990553746
- type: euclidean_spearman
value: 86.48157226860724
- type: manhattan_pearson
value: 85.88233798668581
- type: manhattan_spearman
value: 86.42359889540302
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 81.48207305822743
- type: cos_sim_spearman
value: 82.8229306585227
- type: euclidean_pearson
value: 82.3912454156615
- type: euclidean_spearman
value: 83.09865476559257
- type: manhattan_pearson
value: 82.30053520575876
- type: manhattan_spearman
value: 83.00392320200139
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.83517082969622
- type: cos_sim_spearman
value: 88.5704237984555
- type: euclidean_pearson
value: 88.15443024833176
- type: euclidean_spearman
value: 88.60313594495189
- type: manhattan_pearson
value: 87.99012996276818
- type: manhattan_spearman
value: 88.39306322978999
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 67.62856734038614
- type: cos_sim_spearman
value: 67.38775280429276
- type: euclidean_pearson
value: 68.09416503472238
- type: euclidean_spearman
value: 67.45221088834498
- type: manhattan_pearson
value: 68.31811474137709
- type: manhattan_spearman
value: 67.75846817406287
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.13302836216701
- type: cos_sim_spearman
value: 84.24952159575491
- type: euclidean_pearson
value: 84.65017899273384
- type: euclidean_spearman
value: 84.43303793097236
- type: manhattan_pearson
value: 84.55589549879238
- type: manhattan_spearman
value: 84.42827667887977
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 80.03616790601166
- type: mrr
value: 94.31135132115524
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 51.678000000000004
- type: map_at_10
value: 62.011
- type: map_at_100
value: 62.443000000000005
- type: map_at_1000
value: 62.468999999999994
- type: map_at_20
value: 62.226000000000006
- type: map_at_3
value: 58.443999999999996
- type: map_at_5
value: 60.550000000000004
- type: mrr_at_1
value: 54.0
- type: mrr_at_10
value: 63.27199999999999
- type: mrr_at_100
value: 63.596
- type: mrr_at_1000
value: 63.619
- type: mrr_at_20
value: 63.416
- type: mrr_at_3
value: 60.5
- type: mrr_at_5
value: 62.283
- type: ndcg_at_1
value: 54.0
- type: ndcg_at_10
value: 67.315
- type: ndcg_at_100
value: 69.372
- type: ndcg_at_1000
value: 70.15400000000001
- type: ndcg_at_20
value: 67.943
- type: ndcg_at_3
value: 61.121
- type: ndcg_at_5
value: 64.399
- type: precision_at_1
value: 54.0
- type: precision_at_10
value: 9.232999999999999
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_20
value: 4.7829999999999995
- type: precision_at_3
value: 23.666999999999998
- type: precision_at_5
value: 16.2
- type: recall_at_1
value: 51.678000000000004
- type: recall_at_10
value: 82.389
- type: recall_at_100
value: 92.0
- type: recall_at_1000
value: 98.333
- type: recall_at_20
value: 84.63300000000001
- type: recall_at_3
value: 66.05
- type: recall_at_5
value: 74.006
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.82673267326733
- type: cos_sim_ap
value: 95.11999931294784
- type: cos_sim_f1
value: 91.0941475826972
- type: cos_sim_precision
value: 92.74611398963731
- type: cos_sim_recall
value: 89.5
- type: dot_accuracy
value: 99.73861386138614
- type: dot_ap
value: 92.76208671816435
- type: dot_f1
value: 86.5055387713998
- type: dot_precision
value: 87.11967545638946
- type: dot_recall
value: 85.9
- type: euclidean_accuracy
value: 99.82376237623762
- type: euclidean_ap
value: 95.02471241011084
- type: euclidean_f1
value: 90.97363083164299
- type: euclidean_precision
value: 92.28395061728395
- type: euclidean_recall
value: 89.7
- type: manhattan_accuracy
value: 99.82574257425742
- type: manhattan_ap
value: 95.08424842231868
- type: manhattan_f1
value: 91.10212335692619
- type: manhattan_precision
value: 92.12678936605317
- type: manhattan_recall
value: 90.10000000000001
- type: max_accuracy
value: 99.82673267326733
- type: max_ap
value: 95.11999931294784
- type: max_f1
value: 91.10212335692619
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 53.870949746768424
- type: v_measures
value:
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- 0.53571634076978
- 0.5884760755274984
- 0.46493825119779986
- 0.5647097615749553
- 0.5050495849120543
- 0.491061219994023
- 0.4819622731542588
- 0.5685868012607284
- 0.5540760555292195
- 0.531322826771169
- 0.5932274601787088
- 0.6261393631444355
- 0.6353921700607754
- 0.6018599887005625
- 0.5217064752780205
- 0.5317605881853373
- 0.5257201882718268
- 0.5260835662200616
- 0.5003275253721006
- 0.5110511254674243
- 0.5261695936445681
- 0.5091730883971124
- 0.48910042016546806
- 0.5422967369475379
- 0.5418299559666825
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.56703823226784
- type: v_measures
value:
- 0.320494817263046
- 0.3250723341694729
- 0.32168615316198984
- 0.31328349679632345
- 0.31938046148819477
- 0.36421160408518477
- 0.3463076518950044
- 0.35187389429456556
- 0.3507929680626984
- 0.3436004420103039
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.82873266157383
- type: mrr
value: 50.652096065699006
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.35739606124227
- type: cos_sim_spearman
value: 31.26775311472305
- type: dot_pearson
value: 29.421400993418278
- type: dot_spearman
value: 30.180472594773534
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.184
- type: map_at_10
value: 1.398
- type: map_at_100
value: 7.2090000000000005
- type: map_at_1000
value: 18.414
- type: map_at_20
value: 2.414
- type: map_at_3
value: 0.509
- type: map_at_5
value: 0.767
- type: mrr_at_1
value: 72.0
- type: mrr_at_10
value: 80.467
- type: mrr_at_100
value: 80.735
- type: mrr_at_1000
value: 80.735
- type: mrr_at_20
value: 80.735
- type: mrr_at_3
value: 79.0
- type: mrr_at_5
value: 79.80000000000001
- type: ndcg_at_1
value: 68.0
- type: ndcg_at_10
value: 60.324
- type: ndcg_at_100
value: 43.866
- type: ndcg_at_1000
value: 41.932
- type: ndcg_at_20
value: 56.013999999999996
- type: ndcg_at_3
value: 66.458
- type: ndcg_at_5
value: 63.048
- type: precision_at_1
value: 72.0
- type: precision_at_10
value: 64.2
- type: precision_at_100
value: 44.56
- type: precision_at_1000
value: 18.736
- type: precision_at_20
value: 59.0
- type: precision_at_3
value: 72.0
- type: precision_at_5
value: 67.2
- type: recall_at_1
value: 0.184
- type: recall_at_10
value: 1.649
- type: recall_at_100
value: 10.659
- type: recall_at_1000
value: 40.424
- type: recall_at_20
value: 3.0349999999999997
- type: recall_at_3
value: 0.5519999999999999
- type: recall_at_5
value: 0.852
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 1.252
- type: map_at_10
value: 8.029
- type: map_at_100
value: 13.504
- type: map_at_1000
value: 15.013000000000002
- type: map_at_20
value: 10.306
- type: map_at_3
value: 3.372
- type: map_at_5
value: 4.923
- type: mrr_at_1
value: 18.367
- type: mrr_at_10
value: 36.612
- type: mrr_at_100
value: 37.345
- type: mrr_at_1000
value: 37.345
- type: mrr_at_20
value: 36.955
- type: mrr_at_3
value: 32.993
- type: mrr_at_5
value: 33.912
- type: ndcg_at_1
value: 16.326999999999998
- type: ndcg_at_10
value: 21.124000000000002
- type: ndcg_at_100
value: 32.635
- type: ndcg_at_1000
value: 43.993
- type: ndcg_at_20
value: 22.429
- type: ndcg_at_3
value: 20.836
- type: ndcg_at_5
value: 20.437
- type: precision_at_1
value: 18.367
- type: precision_at_10
value: 21.02
- type: precision_at_100
value: 7.245
- type: precision_at_1000
value: 1.473
- type: precision_at_20
value: 15.714
- type: precision_at_3
value: 23.128999999999998
- type: precision_at_5
value: 22.448999999999998
- type: recall_at_1
value: 1.252
- type: recall_at_10
value: 15.312999999999999
- type: recall_at_100
value: 44.908
- type: recall_at_1000
value: 79.396
- type: recall_at_20
value: 22.647000000000002
- type: recall_at_3
value: 4.883
- type: recall_at_5
value: 7.917000000000001
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 65.458984375
- type: ap
value: 12.013147326225168
- type: f1
value: 50.30981581053394
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 58.658743633276735
- type: f1
value: 59.01001910848807
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 40.7719980016582
- type: v_measures
value:
- 0.43398618769240316
- 0.411071419600849
- 0.4084167708216848
- 0.4309144066998439
- 0.3937926057303082
- 0.41327169334332636
- 0.4194895558089149
- 0.3732114423385808
- 0.4053128667752613
- 0.3877328513546471
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 84.71717231924659
- type: cos_sim_ap
value: 69.78325722226528
- type: cos_sim_f1
value: 66.23786691615015
- type: cos_sim_precision
value: 59.483301827347205
- type: cos_sim_recall
value: 74.72295514511873
- type: dot_accuracy
value: 81.95148119449246
- type: dot_ap
value: 60.71125646179137
- type: dot_f1
value: 58.44781026182928
- type: dot_precision
value: 52.65496086312672
- type: dot_recall
value: 65.67282321899735
- type: euclidean_accuracy
value: 84.84830422602371
- type: euclidean_ap
value: 69.97192936786296
- type: euclidean_f1
value: 66.53649011471808
- type: euclidean_precision
value: 61.898274296094456
- type: euclidean_recall
value: 71.92612137203166
- type: manhattan_accuracy
value: 84.75889610776659
- type: manhattan_ap
value: 69.75691180376053
- type: manhattan_f1
value: 66.32788868723533
- type: manhattan_precision
value: 61.2513966480447
- type: manhattan_recall
value: 72.32189973614776
- type: max_accuracy
value: 84.84830422602371
- type: max_ap
value: 69.97192936786296
- type: max_f1
value: 66.53649011471808
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.43287926417511
- type: cos_sim_ap
value: 85.07378179191598
- type: cos_sim_f1
value: 77.50230244980658
- type: cos_sim_precision
value: 74.30246521155613
- type: cos_sim_recall
value: 80.99014474899907
- type: dot_accuracy
value: 86.946481934257
- type: dot_ap
value: 80.90485630835825
- type: dot_f1
value: 74.43342263413221
- type: dot_precision
value: 70.24736914035807
- type: dot_recall
value: 79.1499846011703
- type: euclidean_accuracy
value: 88.49303372530757
- type: euclidean_ap
value: 85.08920672765427
- type: euclidean_f1
value: 77.53514807059526
- type: euclidean_precision
value: 75.3707473102646
- type: euclidean_recall
value: 79.82753310748383
- type: manhattan_accuracy
value: 88.47168859393798
- type: manhattan_ap
value: 85.01816084029292
- type: manhattan_f1
value: 77.36513181524315
- type: manhattan_precision
value: 72.5057223643463
- type: manhattan_recall
value: 82.9226978749615
- type: max_accuracy
value: 88.49303372530757
- type: max_ap
value: 85.08920672765427
- type: max_f1
value: 77.53514807059526
---
# Ivysaur
This is a fine-tune of [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny) using [qa-assistant](https://huggingface.co/datasets/Mihaiii/qa-assistant).
## Intended purpose
<span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for demo](https://mihaiii.github.io/semantic-autocomplete/)).</span>
## Usage (Sentence-Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny))
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('Mihaiii/Ivysaur')
embeddings = model.encode(sentences)
print(embeddings)
```
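Since the model targets semantic similarity and autocomplete-style ranking, a minimal follow-up sketch is to score candidate completions against a query with cosine similarity. The query and candidate strings below are made-up examples, not part of the original card:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('Mihaiii/Ivysaur')

# Hypothetical query and candidate completions
query = "How do I reset my password"
candidates = ["How do I reset my password on Android?", "Weather forecast for tomorrow"]

query_emb = model.encode(query, convert_to_tensor=True)
cand_embs = model.encode(candidates, convert_to_tensor=True)

# Cosine similarity between the query and each candidate (higher = more related)
scores = util.cos_sim(query_emb, cand_embs)
print(scores)
```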
## Usage (HuggingFace Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny))
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('Mihaiii/Ivysaur')
model = AutoModel.from_pretrained('Mihaiii/Ivysaur')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
### Limitation (same as [gte-small](https://huggingface.co/thenlper/gte-small))
This model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens. | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
Mozilla/Phi-3-mini-4k-instruct-llamafile | Mozilla | text-generation | [
"llamafile",
"text-generation",
"en",
"base_model:microsoft/Phi-3-mini-4k-instruct",
"base_model:finetune:microsoft/Phi-3-mini-4k-instruct",
"license:apache-2.0",
"region:us"
] | 2024-04-26T20:47:56 | 2024-07-01T20:28:54 | 1,508 | 16 | ---
base_model: microsoft/Phi-3-mini-4k-instruct
language:
- en
license: apache-2.0
pipeline_tag: text-generation
tags:
- llamafile
prompt_template: '<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
{{prompt}}<|end|>
<|assistant|>'
---
# Phi-3-mini-4k-instruct - llamafile
This repository contains executable weights (which we call
[llamafiles](https://github.com/Mozilla-Ocho/llamafile)) that run on
Linux, MacOS, Windows, FreeBSD, OpenBSD, and NetBSD for AMD64 and ARM64.
- Model creator: [Microsoft](https://huggingface.co/microsoft)
- Original model: [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct)
## Quickstart
Assuming your system has at least 32GB of RAM, you can try running the
following commands, which download and execute the model.
```
wget https://huggingface.co/jartine/Phi-3-mini-4k-instruct-llamafile/resolve/main/Phi-3-mini-4k-instruct.F16.llamafile
chmod +x Phi-3-mini-4k-instruct.F16.llamafile
./Phi-3-mini-4k-instruct.F16.llamafile --help # view manual
./Phi-3-mini-4k-instruct.F16.llamafile # launch web gui + oai api
./Phi-3-mini-4k-instruct.F16.llamafile -p ... # cli interface (scriptable)
```
Alternatively, you may download an official `llamafile` executable from
Mozilla Ocho on GitHub, in which case you can use the Phi-3 llamafiles
as a simple weights data file.
```
llamafile -m ./Phi-3-mini-4k-instruct.F16.llamafile ...
```
For further information, please see the [llamafile
README](https://github.com/mozilla-ocho/llamafile/).
Having **trouble?** See the ["Gotchas"
section](https://github.com/mozilla-ocho/llamafile/?tab=readme-ov-file#gotchas)
of the README.
## Prompting
Prompt template:
```
<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
How to explain Internet for a medieval knight?<|end|>
<|assistant|>
```
Command template:
```
./Phi-3-mini-4k-instruct.F16.llamafile -e -p "<|user|>\n{{prompt}}<|end|>\n<|assistant|>"
```
## About llamafile
llamafile is a new format introduced by Mozilla Ocho on Nov 20th 2023.
It uses Cosmopolitan Libc to turn LLM weights into runnable llama.cpp
binaries that run on the stock installs of six OSes for both ARM64 and
AMD64.
In addition to being executables, llamafiles are also zip archives. Each
llamafile contains a GGUF file, which you can extract using the `unzip`
command. If you want to change or add files to your llamafiles, then the
`zipalign` command (distributed on the llamafile github) should be used
instead of the traditional `zip` command.
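Because a llamafile is simultaneously an executable and a zip archive, the embedded GGUF can be inspected with ordinary zip tooling. The sketch below is illustrative only; the archive member names are assumptions, so check the listing before extracting:
```python
import zipfile

# A llamafile's zip directory sits at the end of the file, so the standard
# zipfile module can read it even though the file starts with executable code.
path = "Phi-3-mini-4k-instruct.F16.llamafile"

with zipfile.ZipFile(path) as zf:
    names = zf.namelist()
    print(names)  # inspect the archive; one member should be the .gguf weights file

    # Extract the first GGUF member found (assumed naming; verify against the listing)
    gguf_members = [n for n in names if n.endswith(".gguf")]
    if gguf_members:
        zf.extract(gguf_members[0], path=".")
```
Note that for modifying a llamafile you should still use `zipalign` as described above; the snippet is only for read-only inspection and extraction.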
## Licensing (Phi-3 Specific)
The Phi-3 llamafiles are licensed Apache 2.0 because some of the
software that went into creating these llamafiles uses that as its
license. The Phi-3 weights themselves were published by Microsoft under
the even more permissive MIT license. You can use the `unzip` command to
extract the MIT-licensed GGUF file from each llamafile, which contains
only the Microsoft Phi-3 weights.
For further details on the complete picture, read our `LICENSE` file,
since it documents the copyright notice of every transitive dependency.
## About Quantization Formats (General Advice)
Your choice of quantization format depends on three things:
1. Will it fit in RAM or VRAM?
2. Is your use case reading (e.g. summarization) or writing (e.g. chatbot)?
3. llamafiles bigger than 4.30 GB are hard to run on Windows (see [gotchas](https://github.com/mozilla-ocho/llamafile/?tab=readme-ov-file#gotchas))
Good quants for writing (prediction speed) are Q5\_K\_M and Q4\_0. Text
generation is bounded by memory speed, so smaller quants help, but they
cause the LLM to hallucinate more. However, that doesn't mean they can't
think correctly. A highly degraded quant like `Q2_K` may not make a
great encyclopedia, but it's still capable of logical reasoning and
still exhibits the emergent capabilities of LLMs.
Good quants for reading (evaluation speed) are BF16, F16, Q8\_0, and
Q4\_0 (ordered from fastest to slowest). Prompt evaluation is bounded by
flop count, which means perf can be improved through software
engineering alone, e.g. BLAS algorithms, in which case quantization
starts hurting more than it helps, since it competes for CPU resources
and makes it harder for the compiler to parallelize instructions. You
ideally want to use the simplest, smallest floating-point format that's
natively implemented by your hardware. In most cases, that's BF16 or
FP16. However, llamafile is able to still offer respectable tinyBLAS
speedups for llama.cpp's simplest quants: Q8\_0 and Q4\_0.
--
## Model Summary
The Phi-3-Mini-4K-Instruct is a 3.8B parameters, lightweight, state-of-the-art open model trained with the Phi-3 datasets that includes both synthetic data and the filtered publicly available websites data with a focus on high-quality and reasoning dense properties.
The model belongs to the Phi-3 family with the Mini version in two variants [4K](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) and [128K](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) which is the context length (in tokens) that it can support.
The model has undergone a post-training process that incorporates both supervised fine-tuning and direct preference optimization for instruction following and safety measures.
When assessed against benchmarks testing common sense, language understanding, math, code, long context and logical reasoning, Phi-3 Mini-4K-Instruct showcased a robust and state-of-the-art performance among models with less than 13 billion parameters.
Resources and Technical Documentation:
+ [Phi-3 Microsoft Blog](https://aka.ms/phi3blog-april)
+ [Phi-3 Technical Report](https://aka.ms/phi3-tech-report)
+ [Phi-3 on Azure AI Studio](https://aka.ms/phi3-azure-ai)
+ Phi-3 GGUF: [4K](https://aka.ms/Phi3-mini-4k-instruct-gguf)
+ Phi-3 ONNX: [4K](https://aka.ms/Phi3-mini-4k-instruct-onnx)
## Intended Uses
**Primary use cases**
The model is intended for commercial and research use in English. It is suited for applications which require:
1) Memory/compute constrained environments
2) Latency bound scenarios
3) Strong reasoning (especially code, math and logic)
Our model is designed to accelerate research on language and multimodal models, for use as a building block for generative AI powered features.
**Use case considerations**
Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high risk scenarios. Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case.
Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.
## How to Use
Phi-3 Mini-4K-Instruct has been integrated in the development version (4.40.0) of `transformers`. Until the official version is released through `pip`, ensure that you are doing one of the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
* Update your local `transformers` to the development version: `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`. The previous command is an alternative to cloning and installing from the source.
The current `transformers` version can be verified with: `pip list | grep transformers`.
Phi-3 Mini-4K-Instruct is also available in [HuggingChat](https://aka.ms/try-phi3-hf-chat).
### Tokenizer
Phi-3 Mini-4K-Instruct supports a vocabulary size of up to `32064` tokens. The [tokenizer files](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/added_tokens.json) already provide placeholder tokens that can be used for downstream fine-tuning, but they can also be extended up to the model's vocabulary size.
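For instance, a hedged sketch of inspecting the tokenizer and registering extra placeholder tokens looks like the following; the new token strings are hypothetical and only illustrate the mechanism:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
print(len(tokenizer))  # current vocabulary size (the model reserves up to 32064 slots)

# Hypothetical domain-specific placeholder tokens for downstream fine-tuning
new_tokens = ["<|customer|>", "<|agent|>"]
num_added = tokenizer.add_tokens(new_tokens)
print(f"Added {num_added} tokens; new size: {len(tokenizer)}")

# If the vocabulary grows, keep it within the 32064-token budget or resize the
# model's embedding matrix with model.resize_token_embeddings(len(tokenizer)).
```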
### Chat Format
Given the nature of the training data, the Phi-3 Mini-4K-Instruct model is best suited for prompts using the chat format as follows.
You can provide the prompt as a question with a generic template as follows:
```markdown
<|user|>\nQuestion <|end|>\n<|assistant|>
```
For example:
```markdown
<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
How to explain Internet for a medieval knight?<|end|>
<|assistant|>
```
where the model generates the text after `<|assistant|>`. In the case of a few-shot prompt, the prompt can be formatted as follows:
```markdown
<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
I am going to Paris, what should I see?<|end|>
<|assistant|>
Paris, the capital of France, is known for its stunning architecture, art museums, historical landmarks, and romantic atmosphere. Here are some of the top attractions to see in Paris:\n\n1. The Eiffel Tower: The iconic Eiffel Tower is one of the most recognizable landmarks in the world and offers breathtaking views of the city.\n2. The Louvre Museum: The Louvre is one of the world's largest and most famous museums, housing an impressive collection of art and artifacts, including the Mona Lisa.\n3. Notre-Dame Cathedral: This beautiful cathedral is one of the most famous landmarks in Paris and is known for its Gothic architecture and stunning stained glass windows.\n\nThese are just a few of the many attractions that Paris has to offer. With so much to see and do, it's no wonder that Paris is one of the most popular tourist destinations in the world."<|end|>
<|user|>
What is so great about #1?<|end|>
<|assistant|>
```
### Sample inference code
This code snippet shows how to quickly get started with running the model on a GPU:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
torch.random.manual_seed(0)
model = AutoModelForCausalLM.from_pretrained(
"microsoft/Phi-3-mini-4k-instruct",
device_map="cuda",
torch_dtype="auto",
trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
messages = [
{"role": "system", "content": "You are a helpful digital assistant. Please provide safe, ethical and accurate information to the user."},
{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
{"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
{"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
]
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
)
generation_args = {
"max_new_tokens": 500,
"return_full_text": False,
"temperature": 0.0,
"do_sample": False,
}
output = pipe(messages, **generation_args)
print(output[0]['generated_text'])
```
*Some applications/frameworks might not include a BOS token (`<s>`) at the start of the conversation. Please ensure that it is included since it provides more reliable results.*
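A small, illustrative way to verify this in `transformers` (not part of the original card) is to check that the first input id is the BOS id:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")

text = "<|user|>\nHello<|end|>\n<|assistant|>"
ids = tokenizer(text).input_ids

# The tokenizer is expected to prepend the BOS token (<s>); if your framework
# bypasses it, prepend tokenizer.bos_token to the prompt manually.
print(ids[0] == tokenizer.bos_token_id)
```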
## Responsible AI Considerations
Like other language models, the Phi series models can potentially behave in ways that are unfair, unreliable, or offensive. Some of the limiting behaviors to be aware of include:
+ Quality of Service: the Phi models are trained primarily on English text. Languages other than English will experience worse performance. English language varieties with less representation in the training data might experience worse performance than standard American English.
+ Representation of Harms & Perpetuation of Stereotypes: These models can over- or under-represent groups of people, erase representation of some groups, or reinforce demeaning or negative stereotypes. Despite safety post-training, these limitations may still be present due to differing levels of representation of different groups or prevalence of examples of negative stereotypes in training data that reflect real-world patterns and societal biases.
+ Inappropriate or Offensive Content: these models may produce other types of inappropriate or offensive content, which may make it inappropriate to deploy for sensitive contexts without additional mitigations that are specific to the use case.
+ Information Reliability: Language models can generate nonsensical content or fabricate content that might sound reasonable but is inaccurate or outdated.
+ Limited Scope for Code: Majority of Phi-3 training data is based in Python and use common packages such as "typing, math, random, collections, datetime, itertools". If the model generates Python scripts that utilize other packages or scripts in other languages, we strongly recommend users manually verify all API uses.
Developers should apply responsible AI best practices and are responsible for ensuring that a specific use case complies with relevant laws and regulations (e.g. privacy, trade, etc.). Important areas for consideration include:
+ Allocation: Models may not be suitable for scenarios that could have consequential impact on legal status or the allocation of resources or life opportunities (ex: housing, employment, credit, etc.) without further assessments and additional debiasing techniques.
+ High-Risk Scenarios: Developers should assess suitability of using models in high-risk scenarios where unfair, unreliable or offensive outputs might be extremely costly or lead to harm. This includes providing advice in sensitive or expert domains where accuracy and reliability are critical (ex: legal or health advice). Additional safeguards should be implemented at the application level according to the deployment context.
+ Misinformation: Models may produce inaccurate information. Developers should follow transparency best practices and inform end-users they are interacting with an AI system. At the application level, developers can build feedback mechanisms and pipelines to ground responses in use-case specific, contextual information, a technique known as Retrieval Augmented Generation (RAG).
+ Generation of Harmful Content: Developers should assess outputs for their context and use available safety classifiers or custom solutions appropriate for their use case.
+ Misuse: Other forms of misuse such as fraud, spam, or malware production may be possible, and developers should ensure that their applications do not violate applicable laws and regulations.
## Training
### Model
* Architecture: Phi-3 Mini-4K-Instruct has 3.8B parameters and is a dense decoder-only Transformer model. The model is fine-tuned with supervised fine-tuning (SFT) and Direct Preference Optimization (DPO) to ensure alignment with human preferences and safety guidelines.
* Inputs: Text. It is best suited for prompts using chat format.
* Context length: 4K tokens
* GPUs: 512 H100-80G
* Training time: 7 days
* Training data: 3.3T tokens
* Outputs: Generated text in response to the input
* Dates: Our models were trained between February and April 2024
* Status: This is a static model trained on an offline dataset with cutoff date October 2023. Future versions of the tuned models may be released as we improve models.
### Datasets
Our training data includes a wide variety of sources, totaling 3.3 trillion tokens, and is a combination of
1) Publicly available documents filtered rigorously for quality, selected high-quality educational data, and code;
2) Newly created synthetic, “textbook-like” data for the purpose of teaching math, coding, common sense reasoning, general knowledge of the world (science, daily activities, theory of mind, etc.);
3) High quality chat format supervised data covering various topics to reflect human preferences on different aspects such as instruct-following, truthfulness, honesty and helpfulness.
### Fine-tuning
A basic example of multi-GPUs supervised fine-tuning (SFT) with TRL and Accelerate modules is provided [here](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/sample_finetune.py).
## Benchmarks
We report the results for Phi-3-Mini-4K-Instruct on standard open-source benchmarks measuring the model's reasoning ability (both common sense reasoning and logical reasoning). We compare to Phi-2, Mistral-7b-v0.1, Mixtral-8x7b, Gemma 7B, Llama-3-8B-Instruct, and GPT-3.5.
All the reported numbers are produced with the exact same pipeline to ensure that the numbers are comparable. These numbers might differ from other published numbers due to slightly different choices in the evaluation.
As is now standard, we use few-shot prompts to evaluate the models, at temperature 0.
The prompts and number of shots are part of a Microsoft internal tool to evaluate language models, and in particular we did no optimization to the pipeline for Phi-3.
More specifically, we do not change prompts, pick different few-shot examples, change prompt format, or do any other form of optimization for the model.
The number of k-shot examples is listed per benchmark.
| | Phi-3-Mini-4K-In<br>3.8b | Phi-3-Small<br>7b (preview) | Phi-3-Medium<br>14b (preview) | Phi-2<br>2.7b | Mistral<br>7b | Gemma<br>7b | Llama-3-In<br>8b | Mixtral<br>8x7b | GPT-3.5<br>version 1106 |
|---|---|---|---|---|---|---|---|---|---|
| MMLU <br>5-Shot | 68.8 | 75.3 | 78.2 | 56.3 | 61.7 | 63.6 | 66.5 | 68.4 | 71.4 |
| HellaSwag <br> 5-Shot | 76.7 | 78.7 | 83.2 | 53.6 | 58.5 | 49.8 | 71.1 | 70.4 | 78.8 |
| ANLI <br> 7-Shot | 52.8 | 55.0 | 58.7 | 42.5 | 47.1 | 48.7 | 57.3 | 55.2 | 58.1 |
| GSM-8K <br> 0-Shot; CoT | 82.5 | 86.4 | 90.8 | 61.1 | 46.4 | 59.8 | 77.4 | 64.7 | 78.1 |
| MedQA <br> 2-Shot | 53.8 | 58.2 | 69.8 | 40.9 | 49.6 | 50.0 | 60.5 | 62.2 | 63.4 |
| AGIEval <br> 0-Shot | 37.5 | 45.0 | 49.7 | 29.8 | 35.1 | 42.1 | 42.0 | 45.2 | 48.4 |
| TriviaQA <br> 5-Shot | 64.0 | 59.1 | 73.3 | 45.2 | 72.3 | 75.2 | 67.7 | 82.2 | 85.8 |
| Arc-C <br> 10-Shot | 84.9 | 90.7 | 91.9 | 75.9 | 78.6 | 78.3 | 82.8 | 87.3 | 87.4 |
| Arc-E <br> 10-Shot | 94.6 | 97.1 | 98.0 | 88.5 | 90.6 | 91.4 | 93.4 | 95.6 | 96.3 |
| PIQA <br> 5-Shot | 84.2 | 87.8 | 88.2 | 60.2 | 77.7 | 78.1 | 75.7 | 86.0 | 86.6 |
| SociQA <br> 5-Shot | 76.6 | 79.0 | 79.4 | 68.3 | 74.6 | 65.5 | 73.9 | 75.9 | 68.3 |
| BigBench-Hard <br> 0-Shot | 71.7 | 75.0 | 82.5 | 59.4 | 57.3 | 59.6 | 51.5 | 69.7 | 68.32 |
| WinoGrande <br> 5-Shot | 70.8 | 82.5 | 81.2 | 54.7 | 54.2 | 55.6 | 65 | 62.0 | 68.8 |
| OpenBookQA <br> 10-Shot | 83.2 | 88.4 | 86.6 | 73.6 | 79.8 | 78.6 | 82.6 | 85.8 | 86.0 |
| BoolQ <br> 0-Shot | 77.6 | 82.9 | 86.5 | -- | 72.2 | 66.0 | 80.9 | 77.6 | 79.1 |
| CommonSenseQA <br> 10-Shot | 80.2 | 80.3 | 82.6 | 69.3 | 72.6 | 76.2 | 79 | 78.1 | 79.6 |
| TruthfulQA <br> 10-Shot | 65.0 | 68.1 | 74.8 | -- | 52.1 | 53.0 | 63.2 | 60.1 | 85.8 |
| HumanEval <br> 0-Shot | 59.1 | 59.1 | 54.7 | 47.0 | 28.0 | 34.1 | 60.4 | 37.8 | 62.2 |
| MBPP <br> 3-Shot | 53.8 | 71.4 | 73.7 | 60.6 | 50.8 | 51.5 | 67.7 | 60.2 | 77.8 |
## Software
* [PyTorch](https://github.com/pytorch/pytorch)
* [DeepSpeed](https://github.com/microsoft/DeepSpeed)
* [Transformers](https://github.com/huggingface/transformers)
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)
## Hardware
Note that by default, the Phi-3-mini model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:
* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100
If you want to run the model on:
* NVIDIA V100 or earlier generation GPUs: call AutoModelForCausalLM.from_pretrained() with attn_implementation="eager" (see the sketch after this list)
* CPU: use the **GGUF** quantized models [4K](https://aka.ms/Phi3-mini-4k-instruct-gguf)
+ Optimized inference on GPU, CPU, and Mobile: use the **ONNX** models [4K](https://aka.ms/Phi3-mini-4k-instruct-onnx)
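As an illustrative sketch of the eager-attention fallback mentioned above for GPUs without flash-attention support (the `torch_dtype` choice here is an assumption, not a requirement from the card):
```python
import torch
from transformers import AutoModelForCausalLM

# Fall back to the standard ("eager") attention implementation on GPUs
# such as the V100 that do not support flash attention.
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",
    torch_dtype=torch.float16,
    attn_implementation="eager",
    trust_remote_code=True,
)
```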
## Cross Platform Support
ONNX runtime ecosystem now supports Phi-3 Mini models across platforms and hardware. You can find the optimized Phi-3 Mini-4K-Instruct ONNX model [here](https://aka.ms/phi3-mini-4k-instruct-onnx).
Optimized Phi-3 models are also published here in ONNX format, to run with ONNX Runtime on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets. DirectML support lets developers bring hardware acceleration to Windows devices at scale across AMD, Intel, and NVIDIA GPUs.
Along with DirectML, ONNX Runtime provides cross platform support for Phi-3 across a range of devices CPU, GPU, and mobile.
Here are some of the optimized configurations we have added:
1. ONNX models for int4 DML: Quantized to int4 via AWQ
2. ONNX model for fp16 CUDA
3. ONNX model for int4 CUDA: Quantized to int4 via RTN
4. ONNX model for int4 CPU and Mobile: Quantized to int4 via RTN
## License
The model is licensed under the [MIT license](https://huggingface.co/microsoft/Phi-3-mini-4k/resolve/main/LICENSE).
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft’s Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party’s policies.
| [
"SUMMARIZATION"
] | [
"MEDQA"
] |
SeaLLMs/SeaLLMs-v3-1.5B-Chat | SeaLLMs | text-generation | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"sea",
"multilingual",
"conversational",
"en",
"zh",
"id",
"vi",
"th",
"ms",
"tl",
"ta",
"jv",
"arxiv:2407.19672",
"arxiv:2306.05179",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-07-17T09:04:00 | 2024-07-30T05:00:21 | 1,507 | 12 | ---
language:
- en
- zh
- id
- vi
- th
- ms
- tl
- ta
- jv
license: other
license_name: seallms
license_link: https://huggingface.co/SeaLLMs/SeaLLM-13B-Chat/blob/main/LICENSE
tags:
- sea
- multilingual
---
# *SeaLLMs-v3* - Large Language Models for Southeast Asia
<p align="center">
<a href="https://damo-nlp-sg.github.io/SeaLLMs/" target="_blank" rel="noopener">Website</a>
<a href="https://huggingface.co/SeaLLMs/SeaLLMs-v3-1.5B-Chat" target="_blank" rel="noopener">Model</a>
<a href="https://huggingface.co/spaces/SeaLLMs/SeaLLM-Chat" target="_blank" rel="noopener"> 🤗 DEMO</a>
<a href="https://github.com/DAMO-NLP-SG/SeaLLMs" target="_blank" rel="noopener">Github</a>
<a href="https://arxiv.org/pdf/2407.19672" target="_blank" rel="noopener">[NEW] Technical Report</a>
</p>
We introduce **SeaLLMs-v3**, the latest series of the SeaLLMs (Large Language Models for Southeast Asian languages) family. It achieves state-of-the-art performance among models with similar sizes, excelling across a diverse array of tasks such as world knowledge, mathematical reasoning, translation, and instruction following. At the same time, it was specifically enhanced to be more trustworthy, exhibiting reduced hallucination and providing safe responses, particularly in queries closely related to Southeast Asian culture.
## 🔥 Highlights
- State-of-the-art performance compared to open-source models of similar sizes, evaluated across various dimensions such as human exam questions, instruction-following, mathematics, and translation.
- Significantly enhanced instruction-following capability, especially in multi-turn settings.
- Ensures safety in usage with significantly reduced instances of hallucination and sensitivity to local contexts.
## Uses
SeaLLMs is tailored for handling a wide range of languages spoken in the SEA region, including English, Chinese, Indonesian, Vietnamese, Thai, Tagalog, Malay, Burmese, Khmer, Lao, Tamil, and Javanese.
This page introduces the **SeaLLMs-v3-1.5B-Chat** model, specifically fine-tuned to follow human instructions effectively for task completion, making it directly applicable to your applications.
You may also refer to the [SeaLLMs-v3-7B-Chat](https://huggingface.co/SeaLLMs/SeaLLM3-7B-Chat) model for enhanced performance, although it requires higher computational resources.
### Get started with `Transformers`
To quickly try the model, we show how to conduct inference with `transformers` below. Make sure you have installed the latest transformers version (>4.40).
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
device = "cuda" # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained(
"SeaLLMs/SeaLLMs-v3-1.5B-Chat",
torch_dtype=torch.bfloat16,
device_map=device
)
tokenizer = AutoTokenizer.from_pretrained("SeaLLMs/SeaLLMs-v3-1.5B-Chat")
# prepare messages to model
prompt = "Hiii How are you?"
messages = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
model_inputs = tokenizer([text], return_tensors="pt").to(device)
print(f"Formatted text:\n {text}")
print(f"Model input:\n {model_inputs}")
generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512, do_sample=True)
generated_ids = [
output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
print(f"Response:\n {response[0]}")
```
You can also utilize the following code snippet, which uses the streamer `TextStreamer` to enable the model to continue conversing with you:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers import TextStreamer
device = "cuda" # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained(
"SeaLLMs/SeaLLMs-v3-1.5B-Chat",
torch_dtype=torch.bfloat16,
device_map=device
)
tokenizer = AutoTokenizer.from_pretrained("SeaLLMs/SeaLLMs-v3-1.5B-Chat")
# prepare messages to model
messages = [
{"role": "system", "content": "You are a helpful assistant."},
]
while True:
prompt = input("User:")
messages.append({"role": "user", "content": prompt})
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
model_inputs = tokenizer([text], return_tensors="pt").to(device)
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512, streamer=streamer)
generated_ids = [
output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
messages.append({"role": "assistant", "content": response})
```
### Inference with `vllm`
You can also conduct inference with [vllm](https://docs.vllm.ai/en/stable/index.html), which is a fast and easy-to-use library for LLM inference and serving. To use vllm, first install the latest version via `pip install vllm`.
```python
from vllm import LLM, SamplingParams
prompts = [
"Who is the president of US?",
"Can you speak Indonesian?"
]
ckpt_path = "SeaLLMs/SeaLLMs-v3-1.5B-Chat"  # checkpoint to load
llm = LLM(ckpt_path, dtype="bfloat16")
sparams = SamplingParams(temperature=0.1, max_tokens=512)
outputs = llm.generate(prompts, sparams)
# print out the model response
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt}\nResponse: {generated_text}\n\n")
```
### Bias, Risks, and Limitations
<blockquote style="color:red">
<p><strong style="color: red">Terms of Use and License</strong>:
By using our released weights, codes, and demos, you agree to and comply with the terms and conditions specified in our <a href="https://huggingface.co/SeaLLMs/SeaLLM-Chat-13b/edit/main/LICENSE" target="_blank" rel="noopener">SeaLLMs Terms Of Use</a>.
</blockquote>
> **Disclaimer**:
> We must note that even though the weights, codes, and demos are released in an open manner, similar to other pre-trained language models, and despite our best efforts in red teaming and safety fine-tuning and enforcement, our models come with potential risks, including but not limited to inaccurate, misleading or potentially harmful generation.
> Developers and stakeholders should perform their own red teaming and provide related security measures before deployment, and they must abide by and comply with local governance and regulations.
> In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights, codes, or demos.
## Evaluation
We briefly compare SeaLLMs-v3-1.5B-Chat with models of similar sizes with the M3Exam benchmark.
[M3Exam](https://arxiv.org/abs/2306.05179) consists of local exam questions collected from each country. It reflects the model's world knowledge (e.g., with language or social science subjects) and reasoning abilities (e.g., with mathematics or natural science subjects).
| Model | en | zh | id | th | vi | avg | avg_sea |
|--------------------------|------|------|------|------|------|------|---------|
| gemma-2b-it | 44.1 | 37.4 | 31.5 | 28.2 | 35.8 | 35.4 | 31.8 |
| Sailor-1.8B-Chat | 43.8 | 35.9 | 34.2 | 32.3 | 37.5 | 36.7 | 34.7 |
| Sailor-4B-Chat | 54.1 | 48.1 | 40.7 | 35.6 | 42.5 | 44.2 | 39.6 |
| Qwen2-1.5B-Instruct | 63.4 | 75.3 | 41.2 | 41.2 | 47.2 | 53.7 | 43.2 |
| **SeaLLMs-v3-1.5B-Chat** | 61.9 | 74.2 | 43.2 | 42.4 | 48.7 | 54.1 | 44.7 |
## Acknowledgement to Our Linguists
We would like to express our special thanks to our professional and native linguists, Tantong Champaiboon, Nguyen Ngoc Yen Nhi and Tara Devina Putri, who helped build, evaluate, and fact-check our sampled pretraining and SFT dataset as well as evaluating our models across different aspects, especially safety.
## Citation
If you find our project useful, we hope you would kindly star our repo and cite our work as follows:
```
@article{damonlp2024seallm3,
author = {Wenxuan Zhang*, Hou Pong Chan*, Yiran Zhao*, Mahani Aljunied*,
Jianyu Wang*, Chaoqun Liu, Yue Deng, Zhiqiang Hu, Weiwen Xu,
Yew Ken Chia, Xin Li, Lidong Bing},
title = {SeaLLMs 3: Open Foundation and Chat Multilingual Large Language Models for Southeast Asian Languages},
year = {2024},
url = {https://arxiv.org/abs/2407.19672}
}
```
Corresponding Author: [email protected] | [
"TRANSLATION"
] | [
"CHIA"
] |
narainp/jina-embeddings-GGUF | narainp | feature-extraction | [
"sentence-transformers",
"gguf",
"feature-extraction",
"sentence-similarity",
"mteb",
"llama-cpp",
"gguf-my-repo",
"en",
"dataset:allenai/c4",
"base_model:jinaai/jina-embeddings-v2-base-en",
"base_model:quantized:jinaai/jina-embeddings-v2-base-en",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"region:us"
] | 2025-01-08T05:27:50 | 2025-01-08T09:45:10 | 1,487 | 1 | ---
base_model: jinaai/jina-embeddings-v2-base-en
datasets:
- allenai/c4
language: en
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
- llama-cpp
- gguf-my-repo
inference: false
model-index:
- name: jina-embedding-b-en-v2
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 74.73134328358209
- type: ap
value: 37.765427081831035
- type: f1
value: 68.79367444339518
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 88.544275
- type: ap
value: 84.61328675662887
- type: f1
value: 88.51879035862375
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 45.263999999999996
- type: f1
value: 43.778759656699435
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.693
- type: map_at_10
value: 35.487
- type: map_at_100
value: 36.862
- type: map_at_1000
value: 36.872
- type: map_at_3
value: 30.049999999999997
- type: map_at_5
value: 32.966
- type: mrr_at_1
value: 21.977
- type: mrr_at_10
value: 35.565999999999995
- type: mrr_at_100
value: 36.948
- type: mrr_at_1000
value: 36.958
- type: mrr_at_3
value: 30.121
- type: mrr_at_5
value: 33.051
- type: ndcg_at_1
value: 21.693
- type: ndcg_at_10
value: 44.181
- type: ndcg_at_100
value: 49.982
- type: ndcg_at_1000
value: 50.233000000000004
- type: ndcg_at_3
value: 32.830999999999996
- type: ndcg_at_5
value: 38.080000000000005
- type: precision_at_1
value: 21.693
- type: precision_at_10
value: 7.248
- type: precision_at_100
value: 0.9769999999999999
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 13.632
- type: precision_at_5
value: 10.725
- type: recall_at_1
value: 21.693
- type: recall_at_10
value: 72.475
- type: recall_at_100
value: 97.653
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 40.896
- type: recall_at_5
value: 53.627
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.39242428696777
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.675626784714
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.247725694904034
- type: mrr
value: 74.91359978894604
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.68003802970496
- type: cos_sim_spearman
value: 81.23438110096286
- type: euclidean_pearson
value: 81.87462986142582
- type: euclidean_spearman
value: 81.23438110096286
- type: manhattan_pearson
value: 81.61162566600755
- type: manhattan_spearman
value: 81.11329400456184
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.01298701298701
- type: f1
value: 83.31690714969382
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.050108150972086
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 30.15731442819715
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.391999999999996
- type: map_at_10
value: 42.597
- type: map_at_100
value: 44.07
- type: map_at_1000
value: 44.198
- type: map_at_3
value: 38.957
- type: map_at_5
value: 40.961
- type: mrr_at_1
value: 37.196
- type: mrr_at_10
value: 48.152
- type: mrr_at_100
value: 48.928
- type: mrr_at_1000
value: 48.964999999999996
- type: mrr_at_3
value: 45.446
- type: mrr_at_5
value: 47.205999999999996
- type: ndcg_at_1
value: 37.196
- type: ndcg_at_10
value: 49.089
- type: ndcg_at_100
value: 54.471000000000004
- type: ndcg_at_1000
value: 56.385
- type: ndcg_at_3
value: 43.699
- type: ndcg_at_5
value: 46.22
- type: precision_at_1
value: 37.196
- type: precision_at_10
value: 9.313
- type: precision_at_100
value: 1.478
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 20.839
- type: precision_at_5
value: 14.936
- type: recall_at_1
value: 31.391999999999996
- type: recall_at_10
value: 61.876
- type: recall_at_100
value: 84.214
- type: recall_at_1000
value: 95.985
- type: recall_at_3
value: 46.6
- type: recall_at_5
value: 53.588
- type: map_at_1
value: 29.083
- type: map_at_10
value: 38.812999999999995
- type: map_at_100
value: 40.053
- type: map_at_1000
value: 40.188
- type: map_at_3
value: 36.111
- type: map_at_5
value: 37.519000000000005
- type: mrr_at_1
value: 36.497
- type: mrr_at_10
value: 44.85
- type: mrr_at_100
value: 45.546
- type: mrr_at_1000
value: 45.593
- type: mrr_at_3
value: 42.686
- type: mrr_at_5
value: 43.909
- type: ndcg_at_1
value: 36.497
- type: ndcg_at_10
value: 44.443
- type: ndcg_at_100
value: 48.979
- type: ndcg_at_1000
value: 51.154999999999994
- type: ndcg_at_3
value: 40.660000000000004
- type: ndcg_at_5
value: 42.193000000000005
- type: precision_at_1
value: 36.497
- type: precision_at_10
value: 8.433
- type: precision_at_100
value: 1.369
- type: precision_at_1000
value: 0.185
- type: precision_at_3
value: 19.894000000000002
- type: precision_at_5
value: 13.873
- type: recall_at_1
value: 29.083
- type: recall_at_10
value: 54.313
- type: recall_at_100
value: 73.792
- type: recall_at_1000
value: 87.629
- type: recall_at_3
value: 42.257
- type: recall_at_5
value: 47.066
- type: map_at_1
value: 38.556000000000004
- type: map_at_10
value: 50.698
- type: map_at_100
value: 51.705
- type: map_at_1000
value: 51.768
- type: map_at_3
value: 47.848
- type: map_at_5
value: 49.358000000000004
- type: mrr_at_1
value: 43.95
- type: mrr_at_10
value: 54.191
- type: mrr_at_100
value: 54.852999999999994
- type: mrr_at_1000
value: 54.885
- type: mrr_at_3
value: 51.954
- type: mrr_at_5
value: 53.13
- type: ndcg_at_1
value: 43.95
- type: ndcg_at_10
value: 56.516
- type: ndcg_at_100
value: 60.477000000000004
- type: ndcg_at_1000
value: 61.746
- type: ndcg_at_3
value: 51.601
- type: ndcg_at_5
value: 53.795
- type: precision_at_1
value: 43.95
- type: precision_at_10
value: 9.009
- type: precision_at_100
value: 1.189
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 22.989
- type: precision_at_5
value: 15.473
- type: recall_at_1
value: 38.556000000000004
- type: recall_at_10
value: 70.159
- type: recall_at_100
value: 87.132
- type: recall_at_1000
value: 96.16
- type: recall_at_3
value: 56.906
- type: recall_at_5
value: 62.332
- type: map_at_1
value: 24.238
- type: map_at_10
value: 32.5
- type: map_at_100
value: 33.637
- type: map_at_1000
value: 33.719
- type: map_at_3
value: 30.026999999999997
- type: map_at_5
value: 31.555
- type: mrr_at_1
value: 26.328000000000003
- type: mrr_at_10
value: 34.44
- type: mrr_at_100
value: 35.455999999999996
- type: mrr_at_1000
value: 35.521
- type: mrr_at_3
value: 32.034
- type: mrr_at_5
value: 33.565
- type: ndcg_at_1
value: 26.328000000000003
- type: ndcg_at_10
value: 37.202
- type: ndcg_at_100
value: 42.728
- type: ndcg_at_1000
value: 44.792
- type: ndcg_at_3
value: 32.368
- type: ndcg_at_5
value: 35.008
- type: precision_at_1
value: 26.328000000000003
- type: precision_at_10
value: 5.7059999999999995
- type: precision_at_100
value: 0.8880000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 13.672
- type: precision_at_5
value: 9.74
- type: recall_at_1
value: 24.238
- type: recall_at_10
value: 49.829
- type: recall_at_100
value: 75.21
- type: recall_at_1000
value: 90.521
- type: recall_at_3
value: 36.867
- type: recall_at_5
value: 43.241
- type: map_at_1
value: 15.378
- type: map_at_10
value: 22.817999999999998
- type: map_at_100
value: 23.977999999999998
- type: map_at_1000
value: 24.108
- type: map_at_3
value: 20.719
- type: map_at_5
value: 21.889
- type: mrr_at_1
value: 19.03
- type: mrr_at_10
value: 27.022000000000002
- type: mrr_at_100
value: 28.011999999999997
- type: mrr_at_1000
value: 28.096
- type: mrr_at_3
value: 24.855
- type: mrr_at_5
value: 26.029999999999998
- type: ndcg_at_1
value: 19.03
- type: ndcg_at_10
value: 27.526
- type: ndcg_at_100
value: 33.040000000000006
- type: ndcg_at_1000
value: 36.187000000000005
- type: ndcg_at_3
value: 23.497
- type: ndcg_at_5
value: 25.334
- type: precision_at_1
value: 19.03
- type: precision_at_10
value: 4.963
- type: precision_at_100
value: 0.893
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 11.360000000000001
- type: precision_at_5
value: 8.134
- type: recall_at_1
value: 15.378
- type: recall_at_10
value: 38.061
- type: recall_at_100
value: 61.754
- type: recall_at_1000
value: 84.259
- type: recall_at_3
value: 26.788
- type: recall_at_5
value: 31.326999999999998
- type: map_at_1
value: 27.511999999999997
- type: map_at_10
value: 37.429
- type: map_at_100
value: 38.818000000000005
- type: map_at_1000
value: 38.924
- type: map_at_3
value: 34.625
- type: map_at_5
value: 36.064
- type: mrr_at_1
value: 33.300999999999995
- type: mrr_at_10
value: 43.036
- type: mrr_at_100
value: 43.894
- type: mrr_at_1000
value: 43.936
- type: mrr_at_3
value: 40.825
- type: mrr_at_5
value: 42.028
- type: ndcg_at_1
value: 33.300999999999995
- type: ndcg_at_10
value: 43.229
- type: ndcg_at_100
value: 48.992000000000004
- type: ndcg_at_1000
value: 51.02100000000001
- type: ndcg_at_3
value: 38.794000000000004
- type: ndcg_at_5
value: 40.65
- type: precision_at_1
value: 33.300999999999995
- type: precision_at_10
value: 7.777000000000001
- type: precision_at_100
value: 1.269
- type: precision_at_1000
value: 0.163
- type: precision_at_3
value: 18.351
- type: precision_at_5
value: 12.762
- type: recall_at_1
value: 27.511999999999997
- type: recall_at_10
value: 54.788000000000004
- type: recall_at_100
value: 79.105
- type: recall_at_1000
value: 92.49199999999999
- type: recall_at_3
value: 41.924
- type: recall_at_5
value: 47.026
- type: map_at_1
value: 24.117
- type: map_at_10
value: 33.32
- type: map_at_100
value: 34.677
- type: map_at_1000
value: 34.78
- type: map_at_3
value: 30.233999999999998
- type: map_at_5
value: 31.668000000000003
- type: mrr_at_1
value: 29.566
- type: mrr_at_10
value: 38.244
- type: mrr_at_100
value: 39.245000000000005
- type: mrr_at_1000
value: 39.296
- type: mrr_at_3
value: 35.864000000000004
- type: mrr_at_5
value: 36.919999999999995
- type: ndcg_at_1
value: 29.566
- type: ndcg_at_10
value: 39.127
- type: ndcg_at_100
value: 44.989000000000004
- type: ndcg_at_1000
value: 47.189
- type: ndcg_at_3
value: 34.039
- type: ndcg_at_5
value: 35.744
- type: precision_at_1
value: 29.566
- type: precision_at_10
value: 7.385999999999999
- type: precision_at_100
value: 1.204
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 16.286
- type: precision_at_5
value: 11.484
- type: recall_at_1
value: 24.117
- type: recall_at_10
value: 51.559999999999995
- type: recall_at_100
value: 77.104
- type: recall_at_1000
value: 91.79899999999999
- type: recall_at_3
value: 36.82
- type: recall_at_5
value: 41.453
- type: map_at_1
value: 25.17625
- type: map_at_10
value: 34.063916666666664
- type: map_at_100
value: 35.255500000000005
- type: map_at_1000
value: 35.37275
- type: map_at_3
value: 31.351666666666667
- type: map_at_5
value: 32.80608333333333
- type: mrr_at_1
value: 29.59783333333333
- type: mrr_at_10
value: 38.0925
- type: mrr_at_100
value: 38.957249999999995
- type: mrr_at_1000
value: 39.01608333333333
- type: mrr_at_3
value: 35.77625
- type: mrr_at_5
value: 37.04991666666667
- type: ndcg_at_1
value: 29.59783333333333
- type: ndcg_at_10
value: 39.343666666666664
- type: ndcg_at_100
value: 44.488249999999994
- type: ndcg_at_1000
value: 46.83358333333334
- type: ndcg_at_3
value: 34.69708333333333
- type: ndcg_at_5
value: 36.75075
- type: precision_at_1
value: 29.59783333333333
- type: precision_at_10
value: 6.884083333333332
- type: precision_at_100
value: 1.114
- type: precision_at_1000
value: 0.15108333333333332
- type: precision_at_3
value: 15.965250000000003
- type: precision_at_5
value: 11.246500000000001
- type: recall_at_1
value: 25.17625
- type: recall_at_10
value: 51.015999999999984
- type: recall_at_100
value: 73.60174999999998
- type: recall_at_1000
value: 89.849
- type: recall_at_3
value: 37.88399999999999
- type: recall_at_5
value: 43.24541666666666
- type: map_at_1
value: 24.537
- type: map_at_10
value: 31.081999999999997
- type: map_at_100
value: 32.042
- type: map_at_1000
value: 32.141
- type: map_at_3
value: 29.137
- type: map_at_5
value: 30.079
- type: mrr_at_1
value: 27.454
- type: mrr_at_10
value: 33.694
- type: mrr_at_100
value: 34.579
- type: mrr_at_1000
value: 34.649
- type: mrr_at_3
value: 32.004
- type: mrr_at_5
value: 32.794000000000004
- type: ndcg_at_1
value: 27.454
- type: ndcg_at_10
value: 34.915
- type: ndcg_at_100
value: 39.641
- type: ndcg_at_1000
value: 42.105
- type: ndcg_at_3
value: 31.276
- type: ndcg_at_5
value: 32.65
- type: precision_at_1
value: 27.454
- type: precision_at_10
value: 5.337
- type: precision_at_100
value: 0.8250000000000001
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 13.241
- type: precision_at_5
value: 8.895999999999999
- type: recall_at_1
value: 24.537
- type: recall_at_10
value: 44.324999999999996
- type: recall_at_100
value: 65.949
- type: recall_at_1000
value: 84.017
- type: recall_at_3
value: 33.857
- type: recall_at_5
value: 37.316
- type: map_at_1
value: 17.122
- type: map_at_10
value: 24.32
- type: map_at_100
value: 25.338
- type: map_at_1000
value: 25.462
- type: map_at_3
value: 22.064
- type: map_at_5
value: 23.322000000000003
- type: mrr_at_1
value: 20.647
- type: mrr_at_10
value: 27.858
- type: mrr_at_100
value: 28.743999999999996
- type: mrr_at_1000
value: 28.819
- type: mrr_at_3
value: 25.769
- type: mrr_at_5
value: 26.964
- type: ndcg_at_1
value: 20.647
- type: ndcg_at_10
value: 28.849999999999998
- type: ndcg_at_100
value: 33.849000000000004
- type: ndcg_at_1000
value: 36.802
- type: ndcg_at_3
value: 24.799
- type: ndcg_at_5
value: 26.682
- type: precision_at_1
value: 20.647
- type: precision_at_10
value: 5.2170000000000005
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 11.769
- type: precision_at_5
value: 8.486
- type: recall_at_1
value: 17.122
- type: recall_at_10
value: 38.999
- type: recall_at_100
value: 61.467000000000006
- type: recall_at_1000
value: 82.716
- type: recall_at_3
value: 27.601
- type: recall_at_5
value: 32.471
- type: map_at_1
value: 24.396
- type: map_at_10
value: 33.415
- type: map_at_100
value: 34.521
- type: map_at_1000
value: 34.631
- type: map_at_3
value: 30.703999999999997
- type: map_at_5
value: 32.166
- type: mrr_at_1
value: 28.825
- type: mrr_at_10
value: 37.397000000000006
- type: mrr_at_100
value: 38.286
- type: mrr_at_1000
value: 38.346000000000004
- type: mrr_at_3
value: 35.028
- type: mrr_at_5
value: 36.32
- type: ndcg_at_1
value: 28.825
- type: ndcg_at_10
value: 38.656
- type: ndcg_at_100
value: 43.856
- type: ndcg_at_1000
value: 46.31
- type: ndcg_at_3
value: 33.793
- type: ndcg_at_5
value: 35.909
- type: precision_at_1
value: 28.825
- type: precision_at_10
value: 6.567
- type: precision_at_100
value: 1.0330000000000001
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 15.516
- type: precision_at_5
value: 10.914
- type: recall_at_1
value: 24.396
- type: recall_at_10
value: 50.747
- type: recall_at_100
value: 73.477
- type: recall_at_1000
value: 90.801
- type: recall_at_3
value: 37.1
- type: recall_at_5
value: 42.589
- type: map_at_1
value: 25.072
- type: map_at_10
value: 34.307
- type: map_at_100
value: 35.725
- type: map_at_1000
value: 35.943999999999996
- type: map_at_3
value: 30.906
- type: map_at_5
value: 32.818000000000005
- type: mrr_at_1
value: 29.644
- type: mrr_at_10
value: 38.673
- type: mrr_at_100
value: 39.459
- type: mrr_at_1000
value: 39.527
- type: mrr_at_3
value: 35.771
- type: mrr_at_5
value: 37.332
- type: ndcg_at_1
value: 29.644
- type: ndcg_at_10
value: 40.548
- type: ndcg_at_100
value: 45.678999999999995
- type: ndcg_at_1000
value: 48.488
- type: ndcg_at_3
value: 34.887
- type: ndcg_at_5
value: 37.543
- type: precision_at_1
value: 29.644
- type: precision_at_10
value: 7.688000000000001
- type: precision_at_100
value: 1.482
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 16.206
- type: precision_at_5
value: 12.016
- type: recall_at_1
value: 25.072
- type: recall_at_10
value: 53.478
- type: recall_at_100
value: 76.07300000000001
- type: recall_at_1000
value: 93.884
- type: recall_at_3
value: 37.583
- type: recall_at_5
value: 44.464
- type: map_at_1
value: 20.712
- type: map_at_10
value: 27.467999999999996
- type: map_at_100
value: 28.502
- type: map_at_1000
value: 28.610000000000003
- type: map_at_3
value: 24.887999999999998
- type: map_at_5
value: 26.273999999999997
- type: mrr_at_1
value: 22.736
- type: mrr_at_10
value: 29.553
- type: mrr_at_100
value: 30.485
- type: mrr_at_1000
value: 30.56
- type: mrr_at_3
value: 27.078999999999997
- type: mrr_at_5
value: 28.401
- type: ndcg_at_1
value: 22.736
- type: ndcg_at_10
value: 32.023
- type: ndcg_at_100
value: 37.158
- type: ndcg_at_1000
value: 39.823
- type: ndcg_at_3
value: 26.951999999999998
- type: ndcg_at_5
value: 29.281000000000002
- type: precision_at_1
value: 22.736
- type: precision_at_10
value: 5.213
- type: precision_at_100
value: 0.832
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 11.459999999999999
- type: precision_at_5
value: 8.244
- type: recall_at_1
value: 20.712
- type: recall_at_10
value: 44.057
- type: recall_at_100
value: 67.944
- type: recall_at_1000
value: 87.925
- type: recall_at_3
value: 30.305
- type: recall_at_5
value: 36.071999999999996
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.181999999999999
- type: map_at_10
value: 16.66
- type: map_at_100
value: 18.273
- type: map_at_1000
value: 18.45
- type: map_at_3
value: 14.141
- type: map_at_5
value: 15.455
- type: mrr_at_1
value: 22.15
- type: mrr_at_10
value: 32.062000000000005
- type: mrr_at_100
value: 33.116
- type: mrr_at_1000
value: 33.168
- type: mrr_at_3
value: 28.827
- type: mrr_at_5
value: 30.892999999999997
- type: ndcg_at_1
value: 22.15
- type: ndcg_at_10
value: 23.532
- type: ndcg_at_100
value: 30.358
- type: ndcg_at_1000
value: 33.783
- type: ndcg_at_3
value: 19.222
- type: ndcg_at_5
value: 20.919999999999998
- type: precision_at_1
value: 22.15
- type: precision_at_10
value: 7.185999999999999
- type: precision_at_100
value: 1.433
- type: precision_at_1000
value: 0.207
- type: precision_at_3
value: 13.941
- type: precision_at_5
value: 10.906
- type: recall_at_1
value: 10.181999999999999
- type: recall_at_10
value: 28.104000000000003
- type: recall_at_100
value: 51.998999999999995
- type: recall_at_1000
value: 71.311
- type: recall_at_3
value: 17.698
- type: recall_at_5
value: 22.262999999999998
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.669
- type: map_at_10
value: 15.552
- type: map_at_100
value: 21.865000000000002
- type: map_at_1000
value: 23.268
- type: map_at_3
value: 11.309
- type: map_at_5
value: 13.084000000000001
- type: mrr_at_1
value: 55.50000000000001
- type: mrr_at_10
value: 66.46600000000001
- type: mrr_at_100
value: 66.944
- type: mrr_at_1000
value: 66.956
- type: mrr_at_3
value: 64.542
- type: mrr_at_5
value: 65.717
- type: ndcg_at_1
value: 44.75
- type: ndcg_at_10
value: 35.049
- type: ndcg_at_100
value: 39.073
- type: ndcg_at_1000
value: 46.208
- type: ndcg_at_3
value: 39.525
- type: ndcg_at_5
value: 37.156
- type: precision_at_1
value: 55.50000000000001
- type: precision_at_10
value: 27.800000000000004
- type: precision_at_100
value: 9.013
- type: precision_at_1000
value: 1.8800000000000001
- type: precision_at_3
value: 42.667
- type: precision_at_5
value: 36.0
- type: recall_at_1
value: 6.669
- type: recall_at_10
value: 21.811
- type: recall_at_100
value: 45.112
- type: recall_at_1000
value: 67.806
- type: recall_at_3
value: 13.373
- type: recall_at_5
value: 16.615
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.769999999999996
- type: f1
value: 42.91448356376592
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 54.013
- type: map_at_10
value: 66.239
- type: map_at_100
value: 66.62599999999999
- type: map_at_1000
value: 66.644
- type: map_at_3
value: 63.965
- type: map_at_5
value: 65.45400000000001
- type: mrr_at_1
value: 58.221000000000004
- type: mrr_at_10
value: 70.43700000000001
- type: mrr_at_100
value: 70.744
- type: mrr_at_1000
value: 70.75099999999999
- type: mrr_at_3
value: 68.284
- type: mrr_at_5
value: 69.721
- type: ndcg_at_1
value: 58.221000000000004
- type: ndcg_at_10
value: 72.327
- type: ndcg_at_100
value: 73.953
- type: ndcg_at_1000
value: 74.312
- type: ndcg_at_3
value: 68.062
- type: ndcg_at_5
value: 70.56400000000001
- type: precision_at_1
value: 58.221000000000004
- type: precision_at_10
value: 9.521
- type: precision_at_100
value: 1.045
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 27.348
- type: precision_at_5
value: 17.794999999999998
- type: recall_at_1
value: 54.013
- type: recall_at_10
value: 86.957
- type: recall_at_100
value: 93.911
- type: recall_at_1000
value: 96.38
- type: recall_at_3
value: 75.555
- type: recall_at_5
value: 81.671
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.254
- type: map_at_10
value: 33.723
- type: map_at_100
value: 35.574
- type: map_at_1000
value: 35.730000000000004
- type: map_at_3
value: 29.473
- type: map_at_5
value: 31.543
- type: mrr_at_1
value: 41.358
- type: mrr_at_10
value: 49.498
- type: mrr_at_100
value: 50.275999999999996
- type: mrr_at_1000
value: 50.308
- type: mrr_at_3
value: 47.016000000000005
- type: mrr_at_5
value: 48.336
- type: ndcg_at_1
value: 41.358
- type: ndcg_at_10
value: 41.579
- type: ndcg_at_100
value: 48.455
- type: ndcg_at_1000
value: 51.165000000000006
- type: ndcg_at_3
value: 37.681
- type: ndcg_at_5
value: 38.49
- type: precision_at_1
value: 41.358
- type: precision_at_10
value: 11.543000000000001
- type: precision_at_100
value: 1.87
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 24.743000000000002
- type: precision_at_5
value: 17.994
- type: recall_at_1
value: 21.254
- type: recall_at_10
value: 48.698
- type: recall_at_100
value: 74.588
- type: recall_at_1000
value: 91.00200000000001
- type: recall_at_3
value: 33.939
- type: recall_at_5
value: 39.367000000000004
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 35.922
- type: map_at_10
value: 52.32599999999999
- type: map_at_100
value: 53.18000000000001
- type: map_at_1000
value: 53.245
- type: map_at_3
value: 49.294
- type: map_at_5
value: 51.202999999999996
- type: mrr_at_1
value: 71.843
- type: mrr_at_10
value: 78.24600000000001
- type: mrr_at_100
value: 78.515
- type: mrr_at_1000
value: 78.527
- type: mrr_at_3
value: 77.17500000000001
- type: mrr_at_5
value: 77.852
- type: ndcg_at_1
value: 71.843
- type: ndcg_at_10
value: 61.379
- type: ndcg_at_100
value: 64.535
- type: ndcg_at_1000
value: 65.888
- type: ndcg_at_3
value: 56.958
- type: ndcg_at_5
value: 59.434
- type: precision_at_1
value: 71.843
- type: precision_at_10
value: 12.686
- type: precision_at_100
value: 1.517
- type: precision_at_1000
value: 0.16999999999999998
- type: precision_at_3
value: 35.778
- type: precision_at_5
value: 23.422
- type: recall_at_1
value: 35.922
- type: recall_at_10
value: 63.43
- type: recall_at_100
value: 75.868
- type: recall_at_1000
value: 84.88900000000001
- type: recall_at_3
value: 53.666000000000004
- type: recall_at_5
value: 58.555
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 79.4408
- type: ap
value: 73.52820871620366
- type: f1
value: 79.36240238685001
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.826999999999998
- type: map_at_10
value: 34.04
- type: map_at_100
value: 35.226
- type: map_at_1000
value: 35.275
- type: map_at_3
value: 30.165999999999997
- type: map_at_5
value: 32.318000000000005
- type: mrr_at_1
value: 22.464000000000002
- type: mrr_at_10
value: 34.631
- type: mrr_at_100
value: 35.752
- type: mrr_at_1000
value: 35.795
- type: mrr_at_3
value: 30.798
- type: mrr_at_5
value: 32.946999999999996
- type: ndcg_at_1
value: 22.464000000000002
- type: ndcg_at_10
value: 40.919
- type: ndcg_at_100
value: 46.632
- type: ndcg_at_1000
value: 47.833
- type: ndcg_at_3
value: 32.992
- type: ndcg_at_5
value: 36.834
- type: precision_at_1
value: 22.464000000000002
- type: precision_at_10
value: 6.494
- type: precision_at_100
value: 0.9369999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.021
- type: precision_at_5
value: 10.347000000000001
- type: recall_at_1
value: 21.826999999999998
- type: recall_at_10
value: 62.132
- type: recall_at_100
value: 88.55199999999999
- type: recall_at_1000
value: 97.707
- type: recall_at_3
value: 40.541
- type: recall_at_5
value: 49.739
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.68399452804377
- type: f1
value: 95.25490609832268
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 83.15321477428182
- type: f1
value: 60.35476439087966
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.92669804976462
- type: f1
value: 69.22815107207565
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.4855413584398
- type: f1
value: 72.92107516103387
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.412679360205544
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.09211869875204
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.540919056982545
- type: mrr
value: 31.529904607063536
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.745
- type: map_at_10
value: 12.013
- type: map_at_100
value: 15.040000000000001
- type: map_at_1000
value: 16.427
- type: map_at_3
value: 8.841000000000001
- type: map_at_5
value: 10.289
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 53.483999999999995
- type: mrr_at_100
value: 54.20700000000001
- type: mrr_at_1000
value: 54.252
- type: mrr_at_3
value: 51.29
- type: mrr_at_5
value: 52.73
- type: ndcg_at_1
value: 43.808
- type: ndcg_at_10
value: 32.445
- type: ndcg_at_100
value: 30.031000000000002
- type: ndcg_at_1000
value: 39.007
- type: ndcg_at_3
value: 37.204
- type: ndcg_at_5
value: 35.07
- type: precision_at_1
value: 45.201
- type: precision_at_10
value: 23.684
- type: precision_at_100
value: 7.600999999999999
- type: precision_at_1000
value: 2.043
- type: precision_at_3
value: 33.953
- type: precision_at_5
value: 29.412
- type: recall_at_1
value: 5.745
- type: recall_at_10
value: 16.168
- type: recall_at_100
value: 30.875999999999998
- type: recall_at_1000
value: 62.686
- type: recall_at_3
value: 9.75
- type: recall_at_5
value: 12.413
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 37.828
- type: map_at_10
value: 53.239000000000004
- type: map_at_100
value: 54.035999999999994
- type: map_at_1000
value: 54.067
- type: map_at_3
value: 49.289
- type: map_at_5
value: 51.784
- type: mrr_at_1
value: 42.497
- type: mrr_at_10
value: 55.916999999999994
- type: mrr_at_100
value: 56.495
- type: mrr_at_1000
value: 56.516999999999996
- type: mrr_at_3
value: 52.800000000000004
- type: mrr_at_5
value: 54.722
- type: ndcg_at_1
value: 42.468
- type: ndcg_at_10
value: 60.437
- type: ndcg_at_100
value: 63.731
- type: ndcg_at_1000
value: 64.41799999999999
- type: ndcg_at_3
value: 53.230999999999995
- type: ndcg_at_5
value: 57.26
- type: precision_at_1
value: 42.468
- type: precision_at_10
value: 9.47
- type: precision_at_100
value: 1.1360000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.724999999999998
- type: precision_at_5
value: 16.593
- type: recall_at_1
value: 37.828
- type: recall_at_10
value: 79.538
- type: recall_at_100
value: 93.646
- type: recall_at_1000
value: 98.72999999999999
- type: recall_at_3
value: 61.134
- type: recall_at_5
value: 70.377
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.548
- type: map_at_10
value: 84.466
- type: map_at_100
value: 85.10600000000001
- type: map_at_1000
value: 85.123
- type: map_at_3
value: 81.57600000000001
- type: map_at_5
value: 83.399
- type: mrr_at_1
value: 81.24
- type: mrr_at_10
value: 87.457
- type: mrr_at_100
value: 87.574
- type: mrr_at_1000
value: 87.575
- type: mrr_at_3
value: 86.507
- type: mrr_at_5
value: 87.205
- type: ndcg_at_1
value: 81.25
- type: ndcg_at_10
value: 88.203
- type: ndcg_at_100
value: 89.457
- type: ndcg_at_1000
value: 89.563
- type: ndcg_at_3
value: 85.465
- type: ndcg_at_5
value: 87.007
- type: precision_at_1
value: 81.25
- type: precision_at_10
value: 13.373
- type: precision_at_100
value: 1.5270000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.417
- type: precision_at_5
value: 24.556
- type: recall_at_1
value: 70.548
- type: recall_at_10
value: 95.208
- type: recall_at_100
value: 99.514
- type: recall_at_1000
value: 99.988
- type: recall_at_3
value: 87.214
- type: recall_at_5
value: 91.696
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 53.04822095496839
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 60.30778476474675
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.692
- type: map_at_10
value: 11.766
- type: map_at_100
value: 13.904
- type: map_at_1000
value: 14.216999999999999
- type: map_at_3
value: 8.245
- type: map_at_5
value: 9.92
- type: mrr_at_1
value: 23.0
- type: mrr_at_10
value: 33.78
- type: mrr_at_100
value: 34.922
- type: mrr_at_1000
value: 34.973
- type: mrr_at_3
value: 30.2
- type: mrr_at_5
value: 32.565
- type: ndcg_at_1
value: 23.0
- type: ndcg_at_10
value: 19.863
- type: ndcg_at_100
value: 28.141
- type: ndcg_at_1000
value: 33.549
- type: ndcg_at_3
value: 18.434
- type: ndcg_at_5
value: 16.384
- type: precision_at_1
value: 23.0
- type: precision_at_10
value: 10.39
- type: precision_at_100
value: 2.235
- type: precision_at_1000
value: 0.35300000000000004
- type: precision_at_3
value: 17.133000000000003
- type: precision_at_5
value: 14.44
- type: recall_at_1
value: 4.692
- type: recall_at_10
value: 21.025
- type: recall_at_100
value: 45.324999999999996
- type: recall_at_1000
value: 71.675
- type: recall_at_3
value: 10.440000000000001
- type: recall_at_5
value: 14.64
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.96178184892842
- type: cos_sim_spearman
value: 79.6487740813199
- type: euclidean_pearson
value: 82.06661161625023
- type: euclidean_spearman
value: 79.64876769031183
- type: manhattan_pearson
value: 82.07061164575131
- type: manhattan_spearman
value: 79.65197039464537
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.15305604100027
- type: cos_sim_spearman
value: 74.27447427941591
- type: euclidean_pearson
value: 80.52737337565307
- type: euclidean_spearman
value: 74.27416077132192
- type: manhattan_pearson
value: 80.53728571140387
- type: manhattan_spearman
value: 74.28853605753457
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 83.44386080639279
- type: cos_sim_spearman
value: 84.17947648159536
- type: euclidean_pearson
value: 83.34145388129387
- type: euclidean_spearman
value: 84.17947648159536
- type: manhattan_pearson
value: 83.30699061927966
- type: manhattan_spearman
value: 84.18125737380451
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 81.57392220985612
- type: cos_sim_spearman
value: 78.80745014464101
- type: euclidean_pearson
value: 80.01660371487199
- type: euclidean_spearman
value: 78.80741240102256
- type: manhattan_pearson
value: 79.96810779507953
- type: manhattan_spearman
value: 78.75600400119448
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.85421063026625
- type: cos_sim_spearman
value: 87.55320285299192
- type: euclidean_pearson
value: 86.69750143323517
- type: euclidean_spearman
value: 87.55320284326378
- type: manhattan_pearson
value: 86.63379169960379
- type: manhattan_spearman
value: 87.4815029877984
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 84.31314130411842
- type: cos_sim_spearman
value: 85.3489588181433
- type: euclidean_pearson
value: 84.13240933463535
- type: euclidean_spearman
value: 85.34902871403281
- type: manhattan_pearson
value: 84.01183086503559
- type: manhattan_spearman
value: 85.19316703166102
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.09979781689536
- type: cos_sim_spearman
value: 88.87813323759015
- type: euclidean_pearson
value: 88.65413031123792
- type: euclidean_spearman
value: 88.87813323759015
- type: manhattan_pearson
value: 88.61818758256024
- type: manhattan_spearman
value: 88.81044100494604
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 62.30693258111531
- type: cos_sim_spearman
value: 62.195516523251946
- type: euclidean_pearson
value: 62.951283701049476
- type: euclidean_spearman
value: 62.195516523251946
- type: manhattan_pearson
value: 63.068322281439535
- type: manhattan_spearman
value: 62.10621171028406
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.27092833763909
- type: cos_sim_spearman
value: 84.84429717949759
- type: euclidean_pearson
value: 84.8516966060792
- type: euclidean_spearman
value: 84.84429717949759
- type: manhattan_pearson
value: 84.82203139242881
- type: manhattan_spearman
value: 84.8358503952945
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 83.10290863981409
- type: mrr
value: 95.31168450286097
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 52.161
- type: map_at_10
value: 62.138000000000005
- type: map_at_100
value: 62.769
- type: map_at_1000
value: 62.812
- type: map_at_3
value: 59.111000000000004
- type: map_at_5
value: 60.995999999999995
- type: mrr_at_1
value: 55.333
- type: mrr_at_10
value: 63.504000000000005
- type: mrr_at_100
value: 64.036
- type: mrr_at_1000
value: 64.08
- type: mrr_at_3
value: 61.278
- type: mrr_at_5
value: 62.778
- type: ndcg_at_1
value: 55.333
- type: ndcg_at_10
value: 66.678
- type: ndcg_at_100
value: 69.415
- type: ndcg_at_1000
value: 70.453
- type: ndcg_at_3
value: 61.755
- type: ndcg_at_5
value: 64.546
- type: precision_at_1
value: 55.333
- type: precision_at_10
value: 9.033
- type: precision_at_100
value: 1.043
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 24.221999999999998
- type: precision_at_5
value: 16.333000000000002
- type: recall_at_1
value: 52.161
- type: recall_at_10
value: 79.156
- type: recall_at_100
value: 91.333
- type: recall_at_1000
value: 99.333
- type: recall_at_3
value: 66.43299999999999
- type: recall_at_5
value: 73.272
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81287128712871
- type: cos_sim_ap
value: 95.30034785910676
- type: cos_sim_f1
value: 90.28629856850716
- type: cos_sim_precision
value: 92.36401673640168
- type: cos_sim_recall
value: 88.3
- type: dot_accuracy
value: 99.81287128712871
- type: dot_ap
value: 95.30034785910676
- type: dot_f1
value: 90.28629856850716
- type: dot_precision
value: 92.36401673640168
- type: dot_recall
value: 88.3
- type: euclidean_accuracy
value: 99.81287128712871
- type: euclidean_ap
value: 95.30034785910676
- type: euclidean_f1
value: 90.28629856850716
- type: euclidean_precision
value: 92.36401673640168
- type: euclidean_recall
value: 88.3
- type: manhattan_accuracy
value: 99.80990099009901
- type: manhattan_ap
value: 95.26880751950654
- type: manhattan_f1
value: 90.22177419354838
- type: manhattan_precision
value: 90.95528455284553
- type: manhattan_recall
value: 89.5
- type: max_accuracy
value: 99.81287128712871
- type: max_ap
value: 95.30034785910676
- type: max_f1
value: 90.28629856850716
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 58.518662504351184
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.96168178378587
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.04862593471896
- type: mrr
value: 52.97238402936932
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.092545236479946
- type: cos_sim_spearman
value: 31.599851000175498
- type: dot_pearson
value: 30.092542723901676
- type: dot_spearman
value: 31.599851000175498
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.189
- type: map_at_10
value: 1.662
- type: map_at_100
value: 9.384
- type: map_at_1000
value: 22.669
- type: map_at_3
value: 0.5559999999999999
- type: map_at_5
value: 0.9039999999999999
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 81.01899999999999
- type: mrr_at_100
value: 81.01899999999999
- type: mrr_at_1000
value: 81.01899999999999
- type: mrr_at_3
value: 79.333
- type: mrr_at_5
value: 80.733
- type: ndcg_at_1
value: 63.0
- type: ndcg_at_10
value: 65.913
- type: ndcg_at_100
value: 51.895
- type: ndcg_at_1000
value: 46.967
- type: ndcg_at_3
value: 65.49199999999999
- type: ndcg_at_5
value: 66.69699999999999
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 71.6
- type: precision_at_100
value: 53.66
- type: precision_at_1000
value: 21.124000000000002
- type: precision_at_3
value: 72.667
- type: precision_at_5
value: 74.0
- type: recall_at_1
value: 0.189
- type: recall_at_10
value: 1.913
- type: recall_at_100
value: 12.601999999999999
- type: recall_at_1000
value: 44.296
- type: recall_at_3
value: 0.605
- type: recall_at_5
value: 1.018
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.701
- type: map_at_10
value: 10.445
- type: map_at_100
value: 17.324
- type: map_at_1000
value: 19.161
- type: map_at_3
value: 5.497
- type: map_at_5
value: 7.278
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 45.534
- type: mrr_at_100
value: 45.792
- type: mrr_at_1000
value: 45.806999999999995
- type: mrr_at_3
value: 37.755
- type: mrr_at_5
value: 43.469
- type: ndcg_at_1
value: 26.531
- type: ndcg_at_10
value: 26.235000000000003
- type: ndcg_at_100
value: 39.17
- type: ndcg_at_1000
value: 51.038
- type: ndcg_at_3
value: 23.625
- type: ndcg_at_5
value: 24.338
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 24.285999999999998
- type: precision_at_100
value: 8.224
- type: precision_at_1000
value: 1.6179999999999999
- type: precision_at_3
value: 24.490000000000002
- type: precision_at_5
value: 24.898
- type: recall_at_1
value: 2.701
- type: recall_at_10
value: 17.997
- type: recall_at_100
value: 51.766999999999996
- type: recall_at_1000
value: 87.863
- type: recall_at_3
value: 6.295000000000001
- type: recall_at_5
value: 9.993
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 73.3474
- type: ap
value: 15.393431414459924
- type: f1
value: 56.466681887882416
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.062818336163
- type: f1
value: 62.11230840463252
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 42.464892820845115
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.15962329379508
- type: cos_sim_ap
value: 74.73674057919256
- type: cos_sim_f1
value: 68.81245642574947
- type: cos_sim_precision
value: 61.48255813953488
- type: cos_sim_recall
value: 78.12664907651715
- type: dot_accuracy
value: 86.15962329379508
- type: dot_ap
value: 74.7367634988281
- type: dot_f1
value: 68.81245642574947
- type: dot_precision
value: 61.48255813953488
- type: dot_recall
value: 78.12664907651715
- type: euclidean_accuracy
value: 86.15962329379508
- type: euclidean_ap
value: 74.7367761466634
- type: euclidean_f1
value: 68.81245642574947
- type: euclidean_precision
value: 61.48255813953488
- type: euclidean_recall
value: 78.12664907651715
- type: manhattan_accuracy
value: 86.21326816474935
- type: manhattan_ap
value: 74.64416473733951
- type: manhattan_f1
value: 68.80924855491331
- type: manhattan_precision
value: 61.23456790123457
- type: manhattan_recall
value: 78.52242744063325
- type: max_accuracy
value: 86.21326816474935
- type: max_ap
value: 74.7367761466634
- type: max_f1
value: 68.81245642574947
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.97620988085536
- type: cos_sim_ap
value: 86.08680845745758
- type: cos_sim_f1
value: 78.02793637114438
- type: cos_sim_precision
value: 73.11082699683736
- type: cos_sim_recall
value: 83.65414228518632
- type: dot_accuracy
value: 88.97620988085536
- type: dot_ap
value: 86.08681149437946
- type: dot_f1
value: 78.02793637114438
- type: dot_precision
value: 73.11082699683736
- type: dot_recall
value: 83.65414228518632
- type: euclidean_accuracy
value: 88.97620988085536
- type: euclidean_ap
value: 86.08681215460771
- type: euclidean_f1
value: 78.02793637114438
- type: euclidean_precision
value: 73.11082699683736
- type: euclidean_recall
value: 83.65414228518632
- type: manhattan_accuracy
value: 88.88888888888889
- type: manhattan_ap
value: 86.02916327562438
- type: manhattan_f1
value: 78.02063045516843
- type: manhattan_precision
value: 73.38851947346994
- type: manhattan_recall
value: 83.2768709578072
- type: max_accuracy
value: 88.97620988085536
- type: max_ap
value: 86.08681215460771
- type: max_f1
value: 78.02793637114438
---
# narainp/jina-embeddings-v2-base-en-Q8_0-GGUF
This model was converted to GGUF format from [`jinaai/jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/jinaai/jina-embeddings-v2-base-en) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo narainp/jina-embeddings-v2-base-en-Q8_0-GGUF --hf-file jina-embeddings-v2-base-en-q8_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo narainp/jina-embeddings-v2-base-en-Q8_0-GGUF --hf-file jina-embeddings-v2-base-en-q8_0.gguf -c 2048
```
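Since the underlying model is an embedding model, you will usually want sentence vectors rather than generated text. The snippet below is a minimal sketch (not part of the upstream card) of requesting embeddings from a running `llama-server` instance over HTTP. It assumes the server was started with an embeddings flag (`--embeddings` on recent builds) and that your llama.cpp version exposes the OpenAI-compatible `/v1/embeddings` route on the default port 8080; both the flag name and the route have varied between releases, so check `llama-server --help` for your build.
```python
# Minimal sketch: fetch embeddings from a local llama-server over HTTP.
# Assumptions (verify against your llama.cpp build): the server was started with an
# embeddings flag, e.g.
#   llama-server --hf-repo narainp/jina-embeddings-v2-base-en-Q8_0-GGUF \
#     --hf-file jina-embeddings-v2-base-en-q8_0.gguf --embeddings -c 2048
# and it serves the OpenAI-compatible /v1/embeddings route on port 8080.
import requests

def embed(texts, url="http://localhost:8080/v1/embeddings"):
    """Return one embedding vector per input string."""
    response = requests.post(url, json={"input": texts})
    response.raise_for_status()
    return [item["embedding"] for item in response.json()["data"]]

if __name__ == "__main__":
    vectors = embed(["How is the weather today?", "What is the current weather like today?"])
    print(f"{len(vectors)} vectors of dimension {len(vectors[0])}")
```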
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo narainp/jina-embeddings-v2-base-en-Q8_0-GGUF --hf-file jina-embeddings-v2-base-en-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo narainp/jina-embeddings-v2-base-en-Q8_0-GGUF --hf-file jina-embeddings-v2-base-en-q8_0.gguf -c 2048
```
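If you prefer a Python API over the CLI or server, the same GGUF file can also be loaded with the `llama-cpp-python` bindings. This is a hedged sketch rather than an officially documented path for this checkpoint: it assumes `llama-cpp-python` is installed (`pip install llama-cpp-python`) and that the GGUF file has already been downloaded locally, for example with `huggingface-cli download`.
```python
# Hedged sketch using llama-cpp-python (not covered by the original card).
# Assumes: pip install llama-cpp-python, and the GGUF file downloaded locally, e.g.
#   huggingface-cli download narainp/jina-embeddings-v2-base-en-Q8_0-GGUF \
#     jina-embeddings-v2-base-en-q8_0.gguf --local-dir .
from llama_cpp import Llama

# embedding=True switches the model into embedding mode instead of text generation.
llm = Llama(model_path="jina-embeddings-v2-base-en-q8_0.gguf", embedding=True)

sentences = ["A happy dog runs in the park.", "The dog is playing outside."]
vectors = [llm.embed(s) for s in sentences]  # one float vector per sentence
print(len(vectors[0]), "dimensions")
```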
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct | GoToCompany | null | [
"safetensors",
"llama",
"en",
"id",
"jv",
"su",
"arxiv:2309.06085",
"arxiv:2310.04928",
"arxiv:2311.07911",
"base_model:GoToCompany/llama3-8b-cpt-sahabatai-v1-base",
"base_model:finetune:GoToCompany/llama3-8b-cpt-sahabatai-v1-base",
"license:llama3",
"region:us"
] | 2024-11-06T08:08:58 | 2024-11-06T08:09:00 | 1,464 | 11 | ---
base_model:
- GoToCompany/llama3-8b-cpt-sahabatai-v1-base
language:
- en
- id
- jv
- su
license: llama3
---
# Llama3 8B CPT Sahabat-AI v1 Instruct
**Sahabat-AI** (Indonesian for “close friends”) is a collection of Large Language Models (LLMs) that have been pretrained and instruct-tuned for the Indonesian language and its various dialects. The Sahabat-AI ecosystem is co-initiated by the Indonesian tech and telecommunication companies GoTo Group and Indosat Ooredoo Hutchison.
Llama3 8B CPT Sahabat-AI v1 Instruct is an Indonesian-focused model which has been fine-tuned with around **448,000 Indonesian instruction-completion pairs** alongside an Indonesian-dialect pool consisting of **96,000 instruction-completion pairs in Javanese** and **98,000 instruction-completion pairs in Sundanese**. Additionally, we added a pool of **129,000 instruction-completion pairs in English**.
- **Co-initiated by:** PT GoTo Gojek Tokopedia Tbk, Indosat Ooredoo Hutchison
- **Developed by:** PT GoTo Gojek Tokopedia Tbk, AI Singapore
- **Model type:** Decoder
- **Languages:** English, Indonesian, Javanese, Sundanese
- **License:** [Llama3 Community License](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE)
## Model Details
### Model Description
We performed instruction tuning in Indonesian, Javanese, and Sundanese, as well as English, on our [continued pre-trained Llama3 8B CPT Sahabat-AI v1 base](https://huggingface.co/GoToCompany/llama3-8b-cpt-sahabatai-v1-base), a decoder model using the Llama3 architecture, to create Llama3 8B CPT Sahabat-AI v1 Instruct.
For tokenisation, the model employs the default tokenizer used in Llama-3-8B. The model has a context length of 8192 tokens.
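As an illustration only (this snippet is not from the original card), the tokenizer can be inspected with 🤗 Transformers using the same repository id as the usage example later in this card:
```python
# Illustrative sketch: inspect the Llama-3 tokenizer shipped with this checkpoint.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct")

ids = tokenizer("Selamat pagi, apa kabar?")["input_ids"]
print(len(ids), "tokens:", tokenizer.convert_ids_to_tokens(ids))
```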
### Benchmark Performance
We evaluated Llama3 8B CPT Sahabat-AI v1 Instruct on both general language capabilities and instruction-following capabilities.
#### General Language Capabilities
For the evaluation of general language capabilities, we employed the
- [SEA HELM (also known as BHASA) evaluation benchmark](https://arxiv.org/abs/2309.06085v2) across a variety of tasks.
- These tasks include Question Answering (QA), Sentiment Analysis (Sentiment), Toxicity Detection (Toxicity), Translation in both directions (Eng>Lang & Lang>Eng), Abstractive Summarization (Summ), Causal Reasoning (Causal) and Natural Language Inference (NLI).
- We also added support for Javanese and Sundanese for the BHASA tasks whenever applicable.
- [IndoMMLU](https://arxiv.org/pdf/2310.04928)
- These tasks include examination questions on Humanities, Indonesian language, Local languages and cultures, Social science and STEM across primary, middle, and high school levels.
- and the common English tasks from the [HuggingFace LLM Leaderboard](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard).
- These tasks consist of [IFEval, BBH, Math Lvl 5, GPQA, MuSR, and MMLU-PRO.](https://huggingface.co/docs/leaderboards/open_llm_leaderboard/about)
- **Caveat**: Our results differ from the HuggingFace LLM Leaderboard because we have used [VLLM](https://docs.vllm.ai/en/latest/) as our inference platform. VLLM caps the context size at **4096 tokens** while HuggingFace was set to **8192 tokens**.
Note: SEA HELM is implemented using prompts to elicit answers in a strict format. For all tasks, the model is expected to provide an answer tag from which the answer is automatically extracted. For tasks where options are provided, the answer should comprise one of the pre-defined options. The scores for each task are normalised to account for baseline performance due to random chance.
The evaluation was done **zero-shot** with native prompts on a sample of 100-1000 instances for each dataset.
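For intuition, chance-normalisation of this kind typically rescales a raw accuracy so that random-guessing performance maps to 0 and a perfect score maps to 100. The sketch below illustrates that idea, but the exact formula and constants used by SEA HELM are an assumption here, not the benchmark's published specification.
```python
# Illustrative only: one common way to normalise accuracy against a random-chance baseline.
# SEA HELM's exact formula may differ; treat this as an assumption.
def normalise(raw_accuracy: float, random_baseline: float) -> float:
    """Map random-chance performance to 0 and a perfect score to 100."""
    return 100.0 * (raw_accuracy - random_baseline) / (100.0 - random_baseline)

print(normalise(75.0, 25.0))  # 66.67 for a 4-option multiple-choice task
```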
#### Instruction-following Capabilities
Since Llama3 8B CPT Sahabat-AI v1 Instruct is an instruction-following model, we also evaluated it on instruction-following capabilities with the [IFEval](https://arxiv.org/abs/2311.07911) dataset.
As this dataset was in English, the linguists and native speakers in the team worked together to filter, localize and translate the dataset into the respective target languages to ensure that the examples remained reasonable, meaningful and natural.
**IFEval**
IFEval evaluates a model's ability to adhere to constraints provided in the prompt, for example beginning a response with a specific word/phrase or answering with a certain number of sections. Additionally, accuracy is normalized by the proportion of responses in the correct language (if the model performs the task correctly but responds in the wrong language, it is judged to have failed the task).
*Note*: IFEval was only used on Bahasa Indonesia. We are currently working on adding it for Javanese and Sundanese for our upcoming releases.
#### Results
#### Indonesian Results
#### SEA HELM (also known as BHASA)
<table style="border-collapse: collapse; width: 100%; font-size: 10px">
<tr>
<th style="border: 2px solid black; padding: 8px; font-weight: bold;">Language / Model Name [Instruct]</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2.5-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Llama-3-8B</th>
<th style="border: 1px solid gray; padding: 8px;">Llama-3.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sea-lionv2.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">gemma-2-9B</th>
<th style="border: 2px solid black; padding: 8px;">sahabatai-v1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sahabatai-v1-9B</th>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Overall (Bahasa Indonesia + Javanese + Sundanese)</td>
<td style="border: 1px solid gray; padding: 8px;">36.963</td>
<td style="border: 1px solid gray; padding: 8px;">42.988</td>
<td style="border: 1px solid gray; padding: 8px;">37.805</td>
<td style="border: 1px solid gray; padding: 8px;">45.866</td>
<td style="border: 1px solid gray; padding: 8px;">46.880</td>
<td style="border: 1px solid gray; padding: 8px;">56.359</td>
<td style="border: 2px solid black; padding: 8px;">53.725</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">61.169</td>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Bahasa Indonesia</td>
<td style="border: 1px solid gray; padding: 8px;">46.760</td>
<td style="border: 1px solid gray; padding: 8px;">60.372</td>
<td style="border: 1px solid gray; padding: 8px;">42.022</td>
<td style="border: 1px solid gray; padding: 8px;">51.944</td>
<td style="border: 1px solid gray; padding: 8px;">54.579</td>
<td style="border: 1px solid gray; padding: 8px;">63.394</td>
<td style="border: 2px solid black; padding: 8px;">57.221</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">64.154</td>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Javanese</td>
<td style="border: 1px solid gray; padding: 8px;">33.956</td>
<td style="border: 1px solid gray; padding: 8px;">40.625</td>
<td style="border: 1px solid gray; padding: 8px;">41.739</td>
<td style="border: 1px solid gray; padding: 8px;">47.587</td>
<td style="border: 1px solid gray; padding: 8px;">48.012</td>
<td style="border: 1px solid gray; padding: 8px;">56.468</td>
<td style="border: 2px solid black; padding: 8px;">56.460</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">64.439</td>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Sundanese</td>
<td style="border: 1px solid gray; padding: 8px;">30.173</td>
<td style="border: 1px solid gray; padding: 8px;">27.969</td>
<td style="border: 1px solid gray; padding: 8px;">29.654</td>
<td style="border: 1px solid gray; padding: 8px;">38.068</td>
<td style="border: 1px solid gray; padding: 8px;">38.050</td>
<td style="border: 1px solid gray; padding: 8px;">49.216</td>
<td style="border: 2px solid black; padding: 8px;">47.495</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">54.913</td>
</tr>
</table>
#### IndoMMLU
<table style="border-collapse: collapse; width: 100%; font-size: 10px">
<tr>
<th style="border: 2px solid black; padding: 8px; font-weight: bold;">Model Name [Instruct]</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2.5-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Meta-Llama-3-8B</th>
<th style="border: 1px solid gray; padding: 8px;">Llama-3.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sea-lionv2.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">gemma-2-9B</th>
<th style="border: 2px solid black; padding: 8px;">sahabatai-v1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sahabatai-v1-9B</th>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Overall Results</td>
<td style="border: 1px solid gray; padding: 8px;">53.0%</td>
<td style="border: 1px solid gray; padding: 8px;">56.0%</td>
<td style="border: 1px solid gray; padding: 8px;">51.9%</td>
<td style="border: 1px solid gray; padding: 8px;">53.8%</td>
<td style="border: 1px solid gray; padding: 8px;">54.4%</td>
<td style="border: 1px solid gray; padding: 8px;">61.4%</td>
<td style="border: 2px solid black; padding: 8px;">55.6%</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">62.6%</td>
</tr>
</table>
#### English Results
<table style="border-collapse: collapse; width: 100%; font-size: 10px">
<tr>
<th style="border: 2px solid black; padding: 8px;">Model Name [Instruct]</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Qwen2.5-7B</th>
<th style="border: 1px solid gray; padding: 8px;">Llama-3-8B</th>
<th style="border: 1px solid gray; padding: 8px;">Llama-3.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sea-lionv2.1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">gemma-2-9B</th>
<th style="border: 2px solid black; padding: 8px;">sahabatai-v1-8B</th>
<th style="border: 1px solid gray; padding: 8px;">sahabatai-v1-9B</th>
</tr>
<tr>
<td style="border: 2px solid black; padding: 8px; font-weight: bold;">Average</td>
<td style="border: 1px solid gray; padding: 8px;">24.48</td>
<td style="border: 1px solid gray; padding: 8px;">27.75</td>
<td style="border: 1px solid gray; padding: 8px;">23.91</td>
<td style="border: 1px solid gray; padding: 8px;">27.98</td>
<td style="border: 1px solid gray; padding: 8px;">24.52</td>
<td style="border: 1px solid gray; padding: 8px;">26.44</td>
<td style="border: 2px solid black; padding: 8px;">24.43</td>
<td style="border: 1px solid gray; padding: 8px; background-color: lightgreen;">33.67</td>
</tr>
</table>
Llama3 8B CPT Sahabat-AI v1 Instruct can be run using the 🤗 Transformers library:
```python
# Please use transformers==4.45.0
import torch
import transformers

model_id = "GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

# Stop generation at either the tokenizer's EOS token or Llama-3's end-of-turn token.
terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

# Javanese
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Sopo wae sing ana ing Punakawan?"}
]

outputs = pipeline(
    messages,
    max_new_tokens=256,
    eos_token_id=terminators,
)
print(outputs[0]["generated_text"][-1])

# Sundanese
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Kumaha caritana si Kabayan?"},
]

outputs = pipeline(
    messages,
    max_new_tokens=256,
    eos_token_id=terminators,
)
print(outputs[0]["generated_text"][-1])
```
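The same pipeline can also be prompted in Bahasa Indonesia; the snippet below continues from the code above (the question is an arbitrary illustrative example).

```python
# Bahasa Indonesia (reuses the pipeline and terminators defined above)
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Apa saja makanan khas dari Yogyakarta?"},
]

outputs = pipeline(
    messages,
    max_new_tokens=256,
    eos_token_id=terminators,
)
print(outputs[0]["generated_text"][-1])
```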
### Caveats
Users should be aware that our model has certain limitations. Like many LLMs, it can hallucinate, occasionally generating irrelevant content or introducing fictional elements that are not grounded in the provided context. Users should therefore interpret and validate the model's responses with caution, as its reasoning may be inconsistent.
## Limitations
### Safety
Current Sahabat-AI models, including this commercially permissive release, have not been aligned for safety. Developers and users should perform their own safety fine-tuning and related security measures. In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights and codes.
## Technical Specifications
### Fine-Tuning Details
Llama3 8B CPT Sahabat-AI v1 Instruct was built using a combination of a full-parameter fine-tune, on-policy alignment, and model merges of the best-performing checkpoints. Fine-tuning took approximately 4 hours and alignment approximately 2 hours, both on 8x H100-80GB GPUs.
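To illustrate the merging step, the sketch below averages the weights of two hypothetical checkpoints. The checkpoint paths and the uniform-averaging recipe are assumptions made for illustration; the exact merge procedure used for Sahabat-AI is not described here.

```python
# Hypothetical sketch of merging checkpoints by uniform weight averaging.
# "ckpt_sft" and "ckpt_aligned" are placeholder local paths, and uniform averaging
# is only one possible merge recipe -- not necessarily the one used for Sahabat-AI.
import torch
from transformers import AutoModelForCausalLM

checkpoint_paths = ["ckpt_sft", "ckpt_aligned"]
models = [
    AutoModelForCausalLM.from_pretrained(p, torch_dtype=torch.bfloat16)
    for p in checkpoint_paths
]

# Average each parameter tensor across the checkpoints.
merged_state = {}
for name, tensor in models[0].state_dict().items():
    stacked = torch.stack([m.state_dict()[name].float() for m in models], dim=0)
    merged_state[name] = stacked.mean(dim=0).to(tensor.dtype)

merged = AutoModelForCausalLM.from_pretrained(checkpoint_paths[0], torch_dtype=torch.bfloat16)
merged.load_state_dict(merged_state)
merged.save_pretrained("llama3-8b-merged-checkpoint")
```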
## Data
Llama3 8B CPT Sahabat-AI v1 Instruct was trained on a wide range of synthetic instructions, alongside publicly available instructions hand-curated by the team with the assistance of native speakers. In addition, special care was taken to ensure that the datasets used had commercially permissive licenses through verification with the original data source.
## Call for Collaboration
Sahabat-AI (Indonesian for “close friends”) is a **local open-source Large Language Model (LLM) ecosystem for the Indonesian language**, co-initiated by the Indonesian tech and telecommunication companies GoTo Group and Indosat Ooredoo Hutchison.
The Sahabat-AI ecosystem aims to empower Indonesians who want to develop AI-based services and applications using Bahasa Indonesia and its various local dialects.
We are supported by research centers and global technology experts such as AI Singapore and Tech Mahindra in training the model to gain general language understanding.
We also collaborate with leading Indonesian universities such as the University of Indonesia, Gadjah Mada University, Bogor Institute of Agriculture, and Bandung Institute of Technology, as well as leading Indonesian media groups such as Kompas Gramedia Group and Republika, to train and enrich the model in Bahasa Indonesia, ensuring optimal coverage of local context and cultural relevance.
We would like to invite **researchers, developers, and language enthusiasts** to actively contribute to the enhancement and expansion of Sahabat-AI.
Your collaboration can involve:
- Identifying and reporting technical issues
- Sharing pre-training, instruction, and preference data
- Improving documentation usability
- Proposing and implementing new model evaluation tasks and metrics
Join us in shaping the future of Sahabat-AI by sharing your expertise and insights to make these models more accessible, accurate, and versatile.
You can contribute your ideas through [this form.](https://docs.google.com/forms/d/1_us969eQtEooYOn4XkvGkdP5VHOyCbO6L_sd9kTMnaA/edit)
## The Development Team (in ascending alphabetical order)
### AI Singapore
Chan Adwin<br>
Cheng Nicholas<br>
Choa Esther<br>
Huang Yuli<br>
Lau Wayne<br>
Lee Chwan Ren<br>
Leong Wai Yi<br>
Leong Wei Qi<br>
Limkonchotiwat Peerat<br>
Liu Bing Jie Darius<br>
Montalan Jann Railey<br>
Ng Boon Cheong Raymond<br>
Ngui Jian Gang<br>
Nguyen Thanh Ngan<br>
Ong Brandon<br>
Ong Tat-Wee David<br>
Ong Zhi Hao<br>
Rengarajan Hamsawardhini<br>
Siow Bryan<br>
Susanto Yosephine<br>
Tai Ngee Chia<br>
Tan Choon Meng<br>
Teng Walter<br>
Teo Eng Sipp Leslie<br>
Teo Wei Yi<br>
Tjhi William<br>
Yeo Yeow Tong<br>
Yong Xianbin<br>
### PT GoTo Gojek Tokopedia Tbk
Anissa Dininta<br>
Chau Shiau Ching<br>
Choiri Hendra Hadhil<br>
Goel Priyank<br>
Saini Ajay Kumar<br>
Shalev Ofir<br>
Tan Daryl<br>
Tep Kilian Rithi<br>
Tiwari Anupam<br>
Widjojo Daniel<br>
## Acknowledgements
[AI Singapore](https://aisingapore.org/) is a national programme supported by the National Research Foundation, Singapore and hosted by the National University of Singapore.
Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation or the National University of Singapore.
## Contact
For more info, please contact us using this [Sahabat-AI Inquiry Form.](https://docs.google.com/forms/d/1_us969eQtEooYOn4XkvGkdP5VHOyCbO6L_sd9kTMnaA/edit)
## Disclaimer
This is the repository for the Instruct model.
The model has _not_ been aligned for safety.
Developers and users should perform their own safety fine-tuning and related security measures.
In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights and codes.
## References
### IndoMMLU Reference
```bibtex
@inproceedings{koto-etal-2023-indommlu,
    title = "Large Language Models Only Pass Primary School Exams in {I}ndonesia: A Comprehensive Test on {I}ndo{MMLU}",
    author = "Fajri Koto and Nurul Aisyah and Haonan Li and Timothy Baldwin",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
}
``` | [
"QUESTION_ANSWERING",
"TRANSLATION",
"SUMMARIZATION"
] | [
"CHIA"
] |
andersonbcdefg/bge-small-4096 | andersonbcdefg | feature-extraction | [
"transformers",
"pytorch",
"onnx",
"bert",
"feature-extraction",
"mteb",
"model-index",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2023-10-29T00:52:52 | 2023-11-02T05:58:37 | 1,441 | 10 | ---
tags:
- mteb
model-index:
- name: andersonbcdefg/bge-small-4096
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 68.74626865671641
- type: ap
value: 31.113961861085855
- type: f1
value: 62.628656720790275
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 81.30347499999999
- type: ap
value: 76.05639977935193
- type: f1
value: 81.23180016825499
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 38.566
- type: f1
value: 38.014543974125615
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.445
- type: map_at_10
value: 44.157999999999994
- type: map_at_100
value: 45.169
- type: map_at_1000
value: 45.178000000000004
- type: map_at_3
value: 39.545
- type: map_at_5
value: 42.233
- type: mrr_at_1
value: 29.445
- type: mrr_at_10
value: 44.157999999999994
- type: mrr_at_100
value: 45.169
- type: mrr_at_1000
value: 45.178000000000004
- type: mrr_at_3
value: 39.545
- type: mrr_at_5
value: 42.233
- type: ndcg_at_1
value: 29.445
- type: ndcg_at_10
value: 52.446000000000005
- type: ndcg_at_100
value: 56.782
- type: ndcg_at_1000
value: 56.989999999999995
- type: ndcg_at_3
value: 42.935
- type: ndcg_at_5
value: 47.833999999999996
- type: precision_at_1
value: 29.445
- type: precision_at_10
value: 7.8950000000000005
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 17.591
- type: precision_at_5
value: 12.959000000000001
- type: recall_at_1
value: 29.445
- type: recall_at_10
value: 78.947
- type: recall_at_100
value: 97.937
- type: recall_at_1000
value: 99.502
- type: recall_at_3
value: 52.774
- type: recall_at_5
value: 64.794
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 43.85187820924144
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 29.5939502757938
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 58.539409343284674
- type: mrr
value: 71.58982983775228
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.31440765254087
- type: cos_sim_spearman
value: 81.59884723689632
- type: euclidean_pearson
value: 80.65818473893147
- type: euclidean_spearman
value: 81.40004752638717
- type: manhattan_pearson
value: 80.52256901536644
- type: manhattan_spearman
value: 80.57292024599603
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 79.98376623376623
- type: f1
value: 79.91981901371503
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.79541356345093
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 26.760513681350375
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.794
- type: map_at_10
value: 33.361000000000004
- type: map_at_100
value: 34.86
- type: map_at_1000
value: 35.0
- type: map_at_3
value: 30.579
- type: map_at_5
value: 31.996000000000002
- type: mrr_at_1
value: 30.186
- type: mrr_at_10
value: 39.681
- type: mrr_at_100
value: 40.616
- type: mrr_at_1000
value: 40.669
- type: mrr_at_3
value: 37.244
- type: mrr_at_5
value: 38.588
- type: ndcg_at_1
value: 30.186
- type: ndcg_at_10
value: 39.34
- type: ndcg_at_100
value: 45.266
- type: ndcg_at_1000
value: 47.9
- type: ndcg_at_3
value: 35.164
- type: ndcg_at_5
value: 36.854
- type: precision_at_1
value: 30.186
- type: precision_at_10
value: 7.639
- type: precision_at_100
value: 1.328
- type: precision_at_1000
value: 0.183
- type: precision_at_3
value: 17.31
- type: precision_at_5
value: 12.275
- type: recall_at_1
value: 23.794
- type: recall_at_10
value: 50.463
- type: recall_at_100
value: 75.268
- type: recall_at_1000
value: 93.138
- type: recall_at_3
value: 37.797
- type: recall_at_5
value: 42.985
- type: map_at_1
value: 17.968999999999998
- type: map_at_10
value: 23.846999999999998
- type: map_at_100
value: 24.712999999999997
- type: map_at_1000
value: 24.833
- type: map_at_3
value: 22.024
- type: map_at_5
value: 23.087
- type: mrr_at_1
value: 22.038
- type: mrr_at_10
value: 27.808
- type: mrr_at_100
value: 28.532999999999998
- type: mrr_at_1000
value: 28.604000000000003
- type: mrr_at_3
value: 26.029999999999998
- type: mrr_at_5
value: 27.122
- type: ndcg_at_1
value: 22.038
- type: ndcg_at_10
value: 27.559
- type: ndcg_at_100
value: 31.541999999999998
- type: ndcg_at_1000
value: 34.343
- type: ndcg_at_3
value: 24.585
- type: ndcg_at_5
value: 26.026
- type: precision_at_1
value: 22.038
- type: precision_at_10
value: 5.019
- type: precision_at_100
value: 0.8920000000000001
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 11.423
- type: precision_at_5
value: 8.28
- type: recall_at_1
value: 17.968999999999998
- type: recall_at_10
value: 34.583000000000006
- type: recall_at_100
value: 51.849000000000004
- type: recall_at_1000
value: 70.832
- type: recall_at_3
value: 26.057000000000002
- type: recall_at_5
value: 29.816
- type: map_at_1
value: 29.183999999999997
- type: map_at_10
value: 40.245
- type: map_at_100
value: 41.324
- type: map_at_1000
value: 41.402
- type: map_at_3
value: 37.395
- type: map_at_5
value: 38.964999999999996
- type: mrr_at_1
value: 33.981
- type: mrr_at_10
value: 43.471
- type: mrr_at_100
value: 44.303
- type: mrr_at_1000
value: 44.352999999999994
- type: mrr_at_3
value: 41.149
- type: mrr_at_5
value: 42.466
- type: ndcg_at_1
value: 33.981
- type: ndcg_at_10
value: 45.776
- type: ndcg_at_100
value: 50.441
- type: ndcg_at_1000
value: 52.16
- type: ndcg_at_3
value: 40.756
- type: ndcg_at_5
value: 43.132
- type: precision_at_1
value: 33.981
- type: precision_at_10
value: 7.617999999999999
- type: precision_at_100
value: 1.083
- type: precision_at_1000
value: 0.129
- type: precision_at_3
value: 18.558
- type: precision_at_5
value: 12.915
- type: recall_at_1
value: 29.183999999999997
- type: recall_at_10
value: 59.114
- type: recall_at_100
value: 79.549
- type: recall_at_1000
value: 91.925
- type: recall_at_3
value: 45.551
- type: recall_at_5
value: 51.38399999999999
- type: map_at_1
value: 20.286
- type: map_at_10
value: 27.143
- type: map_at_100
value: 28.107
- type: map_at_1000
value: 28.212
- type: map_at_3
value: 25.149
- type: map_at_5
value: 26.179999999999996
- type: mrr_at_1
value: 22.034000000000002
- type: mrr_at_10
value: 28.875
- type: mrr_at_100
value: 29.785
- type: mrr_at_1000
value: 29.876
- type: mrr_at_3
value: 27.023999999999997
- type: mrr_at_5
value: 28.058
- type: ndcg_at_1
value: 22.034000000000002
- type: ndcg_at_10
value: 31.148999999999997
- type: ndcg_at_100
value: 35.936
- type: ndcg_at_1000
value: 38.682
- type: ndcg_at_3
value: 27.230999999999998
- type: ndcg_at_5
value: 29.034
- type: precision_at_1
value: 22.034000000000002
- type: precision_at_10
value: 4.836
- type: precision_at_100
value: 0.754
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 11.562999999999999
- type: precision_at_5
value: 8.068
- type: recall_at_1
value: 20.286
- type: recall_at_10
value: 41.827999999999996
- type: recall_at_100
value: 63.922000000000004
- type: recall_at_1000
value: 84.639
- type: recall_at_3
value: 31.227
- type: recall_at_5
value: 35.546
- type: map_at_1
value: 13.488
- type: map_at_10
value: 18.595
- type: map_at_100
value: 19.783
- type: map_at_1000
value: 19.918
- type: map_at_3
value: 16.274
- type: map_at_5
value: 17.558
- type: mrr_at_1
value: 16.791
- type: mrr_at_10
value: 22.53
- type: mrr_at_100
value: 23.651
- type: mrr_at_1000
value: 23.738999999999997
- type: mrr_at_3
value: 20.232
- type: mrr_at_5
value: 21.644
- type: ndcg_at_1
value: 16.791
- type: ndcg_at_10
value: 22.672
- type: ndcg_at_100
value: 28.663
- type: ndcg_at_1000
value: 31.954
- type: ndcg_at_3
value: 18.372
- type: ndcg_at_5
value: 20.47
- type: precision_at_1
value: 16.791
- type: precision_at_10
value: 4.2540000000000004
- type: precision_at_100
value: 0.8370000000000001
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 8.706
- type: precision_at_5
value: 6.666999999999999
- type: recall_at_1
value: 13.488
- type: recall_at_10
value: 31.451
- type: recall_at_100
value: 58.085
- type: recall_at_1000
value: 81.792
- type: recall_at_3
value: 19.811
- type: recall_at_5
value: 24.973
- type: map_at_1
value: 21.436
- type: map_at_10
value: 29.105999999999998
- type: map_at_100
value: 30.442000000000004
- type: map_at_1000
value: 30.567
- type: map_at_3
value: 26.430999999999997
- type: map_at_5
value: 27.866000000000003
- type: mrr_at_1
value: 26.083000000000002
- type: mrr_at_10
value: 33.975
- type: mrr_at_100
value: 35.014
- type: mrr_at_1000
value: 35.07
- type: mrr_at_3
value: 31.649
- type: mrr_at_5
value: 32.944
- type: ndcg_at_1
value: 26.083000000000002
- type: ndcg_at_10
value: 34.229
- type: ndcg_at_100
value: 40.439
- type: ndcg_at_1000
value: 43.081
- type: ndcg_at_3
value: 29.64
- type: ndcg_at_5
value: 31.704
- type: precision_at_1
value: 26.083000000000002
- type: precision_at_10
value: 6.246
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 13.858999999999998
- type: precision_at_5
value: 10.01
- type: recall_at_1
value: 21.436
- type: recall_at_10
value: 44.938
- type: recall_at_100
value: 72.029
- type: recall_at_1000
value: 90.009
- type: recall_at_3
value: 31.954
- type: recall_at_5
value: 37.303
- type: map_at_1
value: 18.217
- type: map_at_10
value: 25.16
- type: map_at_100
value: 26.490000000000002
- type: map_at_1000
value: 26.619
- type: map_at_3
value: 22.926
- type: map_at_5
value: 24.251
- type: mrr_at_1
value: 22.831000000000003
- type: mrr_at_10
value: 30.009000000000004
- type: mrr_at_100
value: 31.045
- type: mrr_at_1000
value: 31.122
- type: mrr_at_3
value: 28.025
- type: mrr_at_5
value: 29.07
- type: ndcg_at_1
value: 22.831000000000003
- type: ndcg_at_10
value: 29.664
- type: ndcg_at_100
value: 35.900999999999996
- type: ndcg_at_1000
value: 38.932
- type: ndcg_at_3
value: 26.051000000000002
- type: ndcg_at_5
value: 27.741
- type: precision_at_1
value: 22.831000000000003
- type: precision_at_10
value: 5.479
- type: precision_at_100
value: 1.027
- type: precision_at_1000
value: 0.146
- type: precision_at_3
value: 12.481
- type: precision_at_5
value: 8.973
- type: recall_at_1
value: 18.217
- type: recall_at_10
value: 38.336
- type: recall_at_100
value: 65.854
- type: recall_at_1000
value: 87.498
- type: recall_at_3
value: 28.158
- type: recall_at_5
value: 32.841
- type: map_at_1
value: 19.100666666666665
- type: map_at_10
value: 26.22883333333333
- type: map_at_100
value: 27.34241666666667
- type: map_at_1000
value: 27.468416666666666
- type: map_at_3
value: 23.953916666666668
- type: map_at_5
value: 25.20125
- type: mrr_at_1
value: 22.729249999999997
- type: mrr_at_10
value: 29.86491666666667
- type: mrr_at_100
value: 30.76925
- type: mrr_at_1000
value: 30.846333333333337
- type: mrr_at_3
value: 27.733999999999998
- type: mrr_at_5
value: 28.94058333333333
- type: ndcg_at_1
value: 22.729249999999997
- type: ndcg_at_10
value: 30.708250000000003
- type: ndcg_at_100
value: 35.89083333333333
- type: ndcg_at_1000
value: 38.75891666666666
- type: ndcg_at_3
value: 26.661083333333334
- type: ndcg_at_5
value: 28.54
- type: precision_at_1
value: 22.729249999999997
- type: precision_at_10
value: 5.433833333333333
- type: precision_at_100
value: 0.9486666666666665
- type: precision_at_1000
value: 0.13808333333333334
- type: precision_at_3
value: 12.292166666666668
- type: precision_at_5
value: 8.825
- type: recall_at_1
value: 19.100666666666665
- type: recall_at_10
value: 40.54208333333334
- type: recall_at_100
value: 63.67975
- type: recall_at_1000
value: 84.13574999999999
- type: recall_at_3
value: 29.311000000000003
- type: recall_at_5
value: 34.1105
- type: map_at_1
value: 17.762
- type: map_at_10
value: 23.905
- type: map_at_100
value: 24.663
- type: map_at_1000
value: 24.765
- type: map_at_3
value: 22.032
- type: map_at_5
value: 23.025000000000002
- type: mrr_at_1
value: 20.244999999999997
- type: mrr_at_10
value: 26.162999999999997
- type: mrr_at_100
value: 26.907999999999998
- type: mrr_at_1000
value: 26.987
- type: mrr_at_3
value: 24.361
- type: mrr_at_5
value: 25.326999999999998
- type: ndcg_at_1
value: 20.244999999999997
- type: ndcg_at_10
value: 27.577
- type: ndcg_at_100
value: 31.473000000000003
- type: ndcg_at_1000
value: 34.217999999999996
- type: ndcg_at_3
value: 24.092
- type: ndcg_at_5
value: 25.657000000000004
- type: precision_at_1
value: 20.244999999999997
- type: precision_at_10
value: 4.433
- type: precision_at_100
value: 0.692
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 10.634
- type: precision_at_5
value: 7.362
- type: recall_at_1
value: 17.762
- type: recall_at_10
value: 36.661
- type: recall_at_100
value: 54.581999999999994
- type: recall_at_1000
value: 75.28099999999999
- type: recall_at_3
value: 27.084999999999997
- type: recall_at_5
value: 31.064999999999998
- type: map_at_1
value: 12.998000000000001
- type: map_at_10
value: 18.926000000000002
- type: map_at_100
value: 19.836000000000002
- type: map_at_1000
value: 19.96
- type: map_at_3
value: 16.932
- type: map_at_5
value: 17.963
- type: mrr_at_1
value: 15.692
- type: mrr_at_10
value: 22.206
- type: mrr_at_100
value: 23.021
- type: mrr_at_1000
value: 23.108999999999998
- type: mrr_at_3
value: 20.114
- type: mrr_at_5
value: 21.241
- type: ndcg_at_1
value: 15.692
- type: ndcg_at_10
value: 22.997999999999998
- type: ndcg_at_100
value: 27.541
- type: ndcg_at_1000
value: 30.758000000000003
- type: ndcg_at_3
value: 19.117
- type: ndcg_at_5
value: 20.778
- type: precision_at_1
value: 15.692
- type: precision_at_10
value: 4.277
- type: precision_at_100
value: 0.774
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 9.027000000000001
- type: precision_at_5
value: 6.641
- type: recall_at_1
value: 12.998000000000001
- type: recall_at_10
value: 32.135999999999996
- type: recall_at_100
value: 52.937
- type: recall_at_1000
value: 76.348
- type: recall_at_3
value: 21.292
- type: recall_at_5
value: 25.439
- type: map_at_1
value: 20.219
- type: map_at_10
value: 27.306
- type: map_at_100
value: 28.337
- type: map_at_1000
value: 28.459
- type: map_at_3
value: 25.423000000000002
- type: map_at_5
value: 26.375999999999998
- type: mrr_at_1
value: 23.787
- type: mrr_at_10
value: 30.977
- type: mrr_at_100
value: 31.85
- type: mrr_at_1000
value: 31.939
- type: mrr_at_3
value: 29.073
- type: mrr_at_5
value: 30.095
- type: ndcg_at_1
value: 23.787
- type: ndcg_at_10
value: 31.615
- type: ndcg_at_100
value: 36.641
- type: ndcg_at_1000
value: 39.707
- type: ndcg_at_3
value: 27.994000000000003
- type: ndcg_at_5
value: 29.508000000000003
- type: precision_at_1
value: 23.787
- type: precision_at_10
value: 5.271
- type: precision_at_100
value: 0.865
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 12.748999999999999
- type: precision_at_5
value: 8.806
- type: recall_at_1
value: 20.219
- type: recall_at_10
value: 41.108
- type: recall_at_100
value: 63.596
- type: recall_at_1000
value: 85.54899999999999
- type: recall_at_3
value: 31.129
- type: recall_at_5
value: 34.845
- type: map_at_1
value: 19.949
- type: map_at_10
value: 26.629
- type: map_at_100
value: 28.006999999999998
- type: map_at_1000
value: 28.221
- type: map_at_3
value: 24.099999999999998
- type: map_at_5
value: 25.487
- type: mrr_at_1
value: 24.111
- type: mrr_at_10
value: 30.592000000000002
- type: mrr_at_100
value: 31.448999999999998
- type: mrr_at_1000
value: 31.538
- type: mrr_at_3
value: 28.128999999999998
- type: mrr_at_5
value: 29.503
- type: ndcg_at_1
value: 24.111
- type: ndcg_at_10
value: 31.373
- type: ndcg_at_100
value: 36.897999999999996
- type: ndcg_at_1000
value: 40.288000000000004
- type: ndcg_at_3
value: 26.895000000000003
- type: ndcg_at_5
value: 29.009
- type: precision_at_1
value: 24.111
- type: precision_at_10
value: 6.067
- type: precision_at_100
value: 1.269
- type: precision_at_1000
value: 0.22
- type: precision_at_3
value: 12.385
- type: precision_at_5
value: 9.249
- type: recall_at_1
value: 19.949
- type: recall_at_10
value: 40.394000000000005
- type: recall_at_100
value: 65.812
- type: recall_at_1000
value: 88.247
- type: recall_at_3
value: 28.116000000000003
- type: recall_at_5
value: 33.4
- type: map_at_1
value: 13.905999999999999
- type: map_at_10
value: 20.523
- type: map_at_100
value: 21.547
- type: map_at_1000
value: 21.665
- type: map_at_3
value: 18.182000000000002
- type: map_at_5
value: 19.661
- type: mrr_at_1
value: 14.972
- type: mrr_at_10
value: 22.092
- type: mrr_at_100
value: 23.055999999999997
- type: mrr_at_1000
value: 23.150000000000002
- type: mrr_at_3
value: 19.778000000000002
- type: mrr_at_5
value: 21.229
- type: ndcg_at_1
value: 14.972
- type: ndcg_at_10
value: 24.547
- type: ndcg_at_100
value: 29.948999999999998
- type: ndcg_at_1000
value: 33.084
- type: ndcg_at_3
value: 20.036
- type: ndcg_at_5
value: 22.567
- type: precision_at_1
value: 14.972
- type: precision_at_10
value: 4.067
- type: precision_at_100
value: 0.743
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 8.811
- type: precision_at_5
value: 6.654
- type: recall_at_1
value: 13.905999999999999
- type: recall_at_10
value: 35.493
- type: recall_at_100
value: 60.67399999999999
- type: recall_at_1000
value: 84.371
- type: recall_at_3
value: 23.555
- type: recall_at_5
value: 29.729
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 7.529
- type: map_at_10
value: 12.794
- type: map_at_100
value: 14.315
- type: map_at_1000
value: 14.523
- type: map_at_3
value: 10.367999999999999
- type: map_at_5
value: 11.546
- type: mrr_at_1
value: 16.872999999999998
- type: mrr_at_10
value: 25.709
- type: mrr_at_100
value: 26.907999999999998
- type: mrr_at_1000
value: 26.962000000000003
- type: mrr_at_3
value: 22.486
- type: mrr_at_5
value: 24.245
- type: ndcg_at_1
value: 16.872999999999998
- type: ndcg_at_10
value: 19.005
- type: ndcg_at_100
value: 25.990999999999996
- type: ndcg_at_1000
value: 29.955
- type: ndcg_at_3
value: 14.573
- type: ndcg_at_5
value: 16.118
- type: precision_at_1
value: 16.872999999999998
- type: precision_at_10
value: 6.235
- type: precision_at_100
value: 1.374
- type: precision_at_1000
value: 0.21
- type: precision_at_3
value: 10.793
- type: precision_at_5
value: 8.73
- type: recall_at_1
value: 7.529
- type: recall_at_10
value: 24.007
- type: recall_at_100
value: 48.742000000000004
- type: recall_at_1000
value: 71.35000000000001
- type: recall_at_3
value: 13.467
- type: recall_at_5
value: 17.502000000000002
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.614
- type: map_at_10
value: 11.42
- type: map_at_100
value: 15.873000000000001
- type: map_at_1000
value: 17.021
- type: map_at_3
value: 8.495
- type: map_at_5
value: 9.790000000000001
- type: mrr_at_1
value: 42.0
- type: mrr_at_10
value: 52.477
- type: mrr_at_100
value: 53.095000000000006
- type: mrr_at_1000
value: 53.135
- type: mrr_at_3
value: 49.833
- type: mrr_at_5
value: 51.183
- type: ndcg_at_1
value: 31.374999999999996
- type: ndcg_at_10
value: 25.27
- type: ndcg_at_100
value: 29.709999999999997
- type: ndcg_at_1000
value: 36.975
- type: ndcg_at_3
value: 27.688000000000002
- type: ndcg_at_5
value: 25.987
- type: precision_at_1
value: 42.0
- type: precision_at_10
value: 21.2
- type: precision_at_100
value: 7.053
- type: precision_at_1000
value: 1.512
- type: precision_at_3
value: 32.333
- type: precision_at_5
value: 26.6
- type: recall_at_1
value: 5.614
- type: recall_at_10
value: 16.112000000000002
- type: recall_at_100
value: 36.165000000000006
- type: recall_at_1000
value: 60.362
- type: recall_at_3
value: 9.761000000000001
- type: recall_at_5
value: 12.279
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 40.085
- type: f1
value: 35.53934111316537
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 34.185
- type: map_at_10
value: 44.491
- type: map_at_100
value: 45.204
- type: map_at_1000
value: 45.254
- type: map_at_3
value: 42.006
- type: map_at_5
value: 43.516
- type: mrr_at_1
value: 37.024
- type: mrr_at_10
value: 47.524
- type: mrr_at_100
value: 48.185
- type: mrr_at_1000
value: 48.227
- type: mrr_at_3
value: 45.086999999999996
- type: mrr_at_5
value: 46.575
- type: ndcg_at_1
value: 37.024
- type: ndcg_at_10
value: 50.126000000000005
- type: ndcg_at_100
value: 53.577
- type: ndcg_at_1000
value: 54.906
- type: ndcg_at_3
value: 45.25
- type: ndcg_at_5
value: 47.842
- type: precision_at_1
value: 37.024
- type: precision_at_10
value: 7.132
- type: precision_at_100
value: 0.898
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 18.767
- type: precision_at_5
value: 12.676000000000002
- type: recall_at_1
value: 34.185
- type: recall_at_10
value: 64.703
- type: recall_at_100
value: 80.58
- type: recall_at_1000
value: 90.742
- type: recall_at_3
value: 51.483000000000004
- type: recall_at_5
value: 57.775
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.358
- type: map_at_10
value: 16.391
- type: map_at_100
value: 17.698
- type: map_at_1000
value: 17.912
- type: map_at_3
value: 13.831
- type: map_at_5
value: 15.187000000000001
- type: mrr_at_1
value: 18.673000000000002
- type: mrr_at_10
value: 26.907999999999998
- type: mrr_at_100
value: 27.842
- type: mrr_at_1000
value: 27.933000000000003
- type: mrr_at_3
value: 24.486
- type: mrr_at_5
value: 25.766
- type: ndcg_at_1
value: 18.673000000000002
- type: ndcg_at_10
value: 22.137
- type: ndcg_at_100
value: 28.126
- type: ndcg_at_1000
value: 32.489000000000004
- type: ndcg_at_3
value: 18.723
- type: ndcg_at_5
value: 19.858
- type: precision_at_1
value: 18.673000000000002
- type: precision_at_10
value: 6.389
- type: precision_at_100
value: 1.262
- type: precision_at_1000
value: 0.202
- type: precision_at_3
value: 12.757
- type: precision_at_5
value: 9.753
- type: recall_at_1
value: 9.358
- type: recall_at_10
value: 28.605000000000004
- type: recall_at_100
value: 51.713
- type: recall_at_1000
value: 78.408
- type: recall_at_3
value: 17.674
- type: recall_at_5
value: 21.97
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.997999999999998
- type: map_at_10
value: 32.957
- type: map_at_100
value: 33.972
- type: map_at_1000
value: 34.072
- type: map_at_3
value: 30.44
- type: map_at_5
value: 31.869999999999997
- type: mrr_at_1
value: 45.995999999999995
- type: mrr_at_10
value: 54.473000000000006
- type: mrr_at_100
value: 55.103
- type: mrr_at_1000
value: 55.139
- type: mrr_at_3
value: 52.349999999999994
- type: mrr_at_5
value: 53.61900000000001
- type: ndcg_at_1
value: 45.995999999999995
- type: ndcg_at_10
value: 41.333
- type: ndcg_at_100
value: 45.635999999999996
- type: ndcg_at_1000
value: 47.847
- type: ndcg_at_3
value: 36.825
- type: ndcg_at_5
value: 39.099000000000004
- type: precision_at_1
value: 45.995999999999995
- type: precision_at_10
value: 9.020999999999999
- type: precision_at_100
value: 1.244
- type: precision_at_1000
value: 0.154
- type: precision_at_3
value: 23.34
- type: precision_at_5
value: 15.8
- type: recall_at_1
value: 22.997999999999998
- type: recall_at_10
value: 45.105000000000004
- type: recall_at_100
value: 62.188
- type: recall_at_1000
value: 76.907
- type: recall_at_3
value: 35.010000000000005
- type: recall_at_5
value: 39.5
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 80.0944
- type: ap
value: 74.43301569395831
- type: f1
value: 80.04407647044388
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 10.171
- type: map_at_10
value: 17.558
- type: map_at_100
value: 18.694
- type: map_at_1000
value: 18.787000000000003
- type: map_at_3
value: 14.826
- type: map_at_5
value: 16.249
- type: mrr_at_1
value: 10.473
- type: mrr_at_10
value: 17.967
- type: mrr_at_100
value: 19.089
- type: mrr_at_1000
value: 19.177
- type: mrr_at_3
value: 15.222
- type: mrr_at_5
value: 16.655
- type: ndcg_at_1
value: 10.473
- type: ndcg_at_10
value: 22.148
- type: ndcg_at_100
value: 28.028
- type: ndcg_at_1000
value: 30.659
- type: ndcg_at_3
value: 16.474
- type: ndcg_at_5
value: 19.017
- type: precision_at_1
value: 10.473
- type: precision_at_10
value: 3.7969999999999997
- type: precision_at_100
value: 0.6779999999999999
- type: precision_at_1000
value: 0.09
- type: precision_at_3
value: 7.187
- type: precision_at_5
value: 5.599
- type: recall_at_1
value: 10.171
- type: recall_at_10
value: 36.459
- type: recall_at_100
value: 64.512
- type: recall_at_1000
value: 85.27900000000001
- type: recall_at_3
value: 20.868000000000002
- type: recall_at_5
value: 26.933
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 90.35795713634292
- type: f1
value: 89.72064544336776
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 66.4546283629731
- type: f1
value: 49.487271168215095
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.58238063214527
- type: f1
value: 65.54281371907213
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.47343644922664
- type: f1
value: 72.80522894672785
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.53600917473176
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.04699774280647
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.984352865575797
- type: mrr
value: 32.02736001972659
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.666
- type: map_at_10
value: 10.066
- type: map_at_100
value: 12.794
- type: map_at_1000
value: 14.184
- type: map_at_3
value: 7.622
- type: map_at_5
value: 8.587
- type: mrr_at_1
value: 39.318999999999996
- type: mrr_at_10
value: 47.678
- type: mrr_at_100
value: 48.355
- type: mrr_at_1000
value: 48.400999999999996
- type: mrr_at_3
value: 45.82
- type: mrr_at_5
value: 46.656
- type: ndcg_at_1
value: 37.926
- type: ndcg_at_10
value: 29.049999999999997
- type: ndcg_at_100
value: 26.826
- type: ndcg_at_1000
value: 35.841
- type: ndcg_at_3
value: 33.513
- type: ndcg_at_5
value: 31.227
- type: precision_at_1
value: 39.318999999999996
- type: precision_at_10
value: 21.424000000000003
- type: precision_at_100
value: 7.231999999999999
- type: precision_at_1000
value: 2.012
- type: precision_at_3
value: 30.857
- type: precision_at_5
value: 26.378
- type: recall_at_1
value: 4.666
- type: recall_at_10
value: 13.898
- type: recall_at_100
value: 26.983
- type: recall_at_1000
value: 59.485
- type: recall_at_3
value: 8.953
- type: recall_at_5
value: 10.496
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.26
- type: map_at_10
value: 17.907999999999998
- type: map_at_100
value: 19.245
- type: map_at_1000
value: 19.339000000000002
- type: map_at_3
value: 14.634
- type: map_at_5
value: 16.386
- type: mrr_at_1
value: 10.574
- type: mrr_at_10
value: 19.438
- type: mrr_at_100
value: 20.638
- type: mrr_at_1000
value: 20.715
- type: mrr_at_3
value: 16.276
- type: mrr_at_5
value: 17.971999999999998
- type: ndcg_at_1
value: 10.574
- type: ndcg_at_10
value: 23.451
- type: ndcg_at_100
value: 29.982
- type: ndcg_at_1000
value: 32.449
- type: ndcg_at_3
value: 16.817
- type: ndcg_at_5
value: 19.867
- type: precision_at_1
value: 10.574
- type: precision_at_10
value: 4.609
- type: precision_at_100
value: 0.8330000000000001
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 8.266
- type: precision_at_5
value: 6.6739999999999995
- type: recall_at_1
value: 9.26
- type: recall_at_10
value: 39.224
- type: recall_at_100
value: 69.107
- type: recall_at_1000
value: 87.908
- type: recall_at_3
value: 21.490000000000002
- type: recall_at_5
value: 28.560999999999996
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 65.655
- type: map_at_10
value: 79.199
- type: map_at_100
value: 79.937
- type: map_at_1000
value: 79.964
- type: map_at_3
value: 76.19399999999999
- type: map_at_5
value: 78.08800000000001
- type: mrr_at_1
value: 75.53999999999999
- type: mrr_at_10
value: 82.89
- type: mrr_at_100
value: 83.074
- type: mrr_at_1000
value: 83.077
- type: mrr_at_3
value: 81.577
- type: mrr_at_5
value: 82.452
- type: ndcg_at_1
value: 75.53999999999999
- type: ndcg_at_10
value: 83.62899999999999
- type: ndcg_at_100
value: 85.411
- type: ndcg_at_1000
value: 85.646
- type: ndcg_at_3
value: 80.23700000000001
- type: ndcg_at_5
value: 82.107
- type: precision_at_1
value: 75.53999999999999
- type: precision_at_10
value: 12.695
- type: precision_at_100
value: 1.493
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 34.983
- type: precision_at_5
value: 23.164
- type: recall_at_1
value: 65.655
- type: recall_at_10
value: 92.269
- type: recall_at_100
value: 98.598
- type: recall_at_1000
value: 99.815
- type: recall_at_3
value: 82.616
- type: recall_at_5
value: 87.75800000000001
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 43.67844919460687
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 54.32866004447611
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.238
- type: map_at_10
value: 8.539
- type: map_at_100
value: 10.267
- type: map_at_1000
value: 10.552999999999999
- type: map_at_3
value: 6.165
- type: map_at_5
value: 7.22
- type: mrr_at_1
value: 15.9
- type: mrr_at_10
value: 25.557999999999996
- type: mrr_at_100
value: 26.867
- type: mrr_at_1000
value: 26.939
- type: mrr_at_3
value: 22.633
- type: mrr_at_5
value: 24.233
- type: ndcg_at_1
value: 15.9
- type: ndcg_at_10
value: 14.954
- type: ndcg_at_100
value: 22.486
- type: ndcg_at_1000
value: 27.986
- type: ndcg_at_3
value: 14.069
- type: ndcg_at_5
value: 12.200999999999999
- type: precision_at_1
value: 15.9
- type: precision_at_10
value: 7.9399999999999995
- type: precision_at_100
value: 1.8929999999999998
- type: precision_at_1000
value: 0.32299999999999995
- type: precision_at_3
value: 13.5
- type: precision_at_5
value: 10.9
- type: recall_at_1
value: 3.238
- type: recall_at_10
value: 16.1
- type: recall_at_100
value: 38.427
- type: recall_at_1000
value: 65.498
- type: recall_at_3
value: 8.212
- type: recall_at_5
value: 11.032
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 80.7612029200118
- type: cos_sim_spearman
value: 74.17706899450974
- type: euclidean_pearson
value: 78.6240925347838
- type: euclidean_spearman
value: 74.22104652352341
- type: manhattan_pearson
value: 78.49956480878576
- type: manhattan_spearman
value: 74.0528957569391
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 80.0377294417705
- type: cos_sim_spearman
value: 72.19570903733732
- type: euclidean_pearson
value: 77.060604990743
- type: euclidean_spearman
value: 71.54251658956483
- type: manhattan_pearson
value: 77.28301977645965
- type: manhattan_spearman
value: 71.77449045278667
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 79.69841558517969
- type: cos_sim_spearman
value: 80.54022353649157
- type: euclidean_pearson
value: 80.03651743688496
- type: euclidean_spearman
value: 80.45116824930123
- type: manhattan_pearson
value: 79.89688370680031
- type: manhattan_spearman
value: 80.27208259746283
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 79.92235427443056
- type: cos_sim_spearman
value: 76.20243980748161
- type: euclidean_pearson
value: 79.28031963400572
- type: euclidean_spearman
value: 76.3568261868673
- type: manhattan_pearson
value: 79.24527845959733
- type: manhattan_spearman
value: 76.39886696744185
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 84.2762365324788
- type: cos_sim_spearman
value: 85.19929628214842
- type: euclidean_pearson
value: 84.82568872953075
- type: euclidean_spearman
value: 85.11039387706913
- type: manhattan_pearson
value: 84.72922084197847
- type: manhattan_spearman
value: 85.04448532444505
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 80.23256564746382
- type: cos_sim_spearman
value: 81.92968415429543
- type: euclidean_pearson
value: 81.12612888308936
- type: euclidean_spearman
value: 81.97396557448675
- type: manhattan_pearson
value: 81.15685601512081
- type: manhattan_spearman
value: 82.01929408689
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.35057935029289
- type: cos_sim_spearman
value: 86.60658025867397
- type: euclidean_pearson
value: 86.48666975508912
- type: euclidean_spearman
value: 86.70310223264862
- type: manhattan_pearson
value: 86.23959282751626
- type: manhattan_spearman
value: 86.48318896577922
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.15375299804011
- type: cos_sim_spearman
value: 65.4588500819246
- type: euclidean_pearson
value: 65.60180021985416
- type: euclidean_spearman
value: 65.55596512146833
- type: manhattan_pearson
value: 66.12421335157649
- type: manhattan_spearman
value: 66.05163838991123
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 81.82391915730462
- type: cos_sim_spearman
value: 81.93942545767499
- type: euclidean_pearson
value: 83.16752744889406
- type: euclidean_spearman
value: 82.31380947581034
- type: manhattan_pearson
value: 82.98915741609575
- type: manhattan_spearman
value: 82.16585239338073
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 77.19504204180527
- type: mrr
value: 92.85429983959396
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 49.528
- type: map_at_10
value: 57.62199999999999
- type: map_at_100
value: 58.544
- type: map_at_1000
value: 58.573
- type: map_at_3
value: 54.56999999999999
- type: map_at_5
value: 56.552
- type: mrr_at_1
value: 52.0
- type: mrr_at_10
value: 58.939
- type: mrr_at_100
value: 59.653
- type: mrr_at_1000
value: 59.68
- type: mrr_at_3
value: 56.389
- type: mrr_at_5
value: 57.989000000000004
- type: ndcg_at_1
value: 52.0
- type: ndcg_at_10
value: 61.964
- type: ndcg_at_100
value: 65.871
- type: ndcg_at_1000
value: 66.724
- type: ndcg_at_3
value: 56.621
- type: ndcg_at_5
value: 59.551
- type: precision_at_1
value: 52.0
- type: precision_at_10
value: 8.333
- type: precision_at_100
value: 1.04
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 21.778
- type: precision_at_5
value: 14.933
- type: recall_at_1
value: 49.528
- type: recall_at_10
value: 74.2
- type: recall_at_100
value: 91.5
- type: recall_at_1000
value: 98.333
- type: recall_at_3
value: 60.06700000000001
- type: recall_at_5
value: 67.133
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81287128712871
- type: cos_sim_ap
value: 95.15039468118793
- type: cos_sim_f1
value: 90.48817312531455
- type: cos_sim_precision
value: 91.08409321175279
- type: cos_sim_recall
value: 89.9
- type: dot_accuracy
value: 99.78019801980199
- type: dot_ap
value: 93.60256835857994
- type: dot_f1
value: 88.73096446700508
- type: dot_precision
value: 90.10309278350516
- type: dot_recall
value: 87.4
- type: euclidean_accuracy
value: 99.81188118811882
- type: euclidean_ap
value: 95.15954231276913
- type: euclidean_f1
value: 90.48096192384769
- type: euclidean_precision
value: 90.66265060240963
- type: euclidean_recall
value: 90.3
- type: manhattan_accuracy
value: 99.81188118811882
- type: manhattan_ap
value: 95.17107000565468
- type: manhattan_f1
value: 90.5
- type: manhattan_precision
value: 90.5
- type: manhattan_recall
value: 90.5
- type: max_accuracy
value: 99.81287128712871
- type: max_ap
value: 95.17107000565468
- type: max_f1
value: 90.5
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 51.77488276525734
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.30657214418171
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 47.84571922992432
- type: mrr
value: 48.549107142857146
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.840750357585556
- type: cos_sim_spearman
value: 29.832953864936567
- type: dot_pearson
value: 30.499687946740657
- type: dot_spearman
value: 30.73436062481656
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.16999999999999998
- type: map_at_10
value: 1.014
- type: map_at_100
value: 5.623
- type: map_at_1000
value: 15.190999999999999
- type: map_at_3
value: 0.377
- type: map_at_5
value: 0.577
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 74.45
- type: mrr_at_100
value: 74.846
- type: mrr_at_1000
value: 74.846
- type: mrr_at_3
value: 71.333
- type: mrr_at_5
value: 73.533
- type: ndcg_at_1
value: 64.0
- type: ndcg_at_10
value: 47.52
- type: ndcg_at_100
value: 37.419999999999995
- type: ndcg_at_1000
value: 36.318
- type: ndcg_at_3
value: 51.13999999999999
- type: ndcg_at_5
value: 49.101
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 50.8
- type: precision_at_100
value: 39.160000000000004
- type: precision_at_1000
value: 16.948
- type: precision_at_3
value: 52.0
- type: precision_at_5
value: 51.6
- type: recall_at_1
value: 0.16999999999999998
- type: recall_at_10
value: 1.269
- type: recall_at_100
value: 8.937000000000001
- type: recall_at_1000
value: 35.036
- type: recall_at_3
value: 0.396
- type: recall_at_5
value: 0.6669999999999999
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.672
- type: map_at_10
value: 6.739000000000001
- type: map_at_100
value: 12.006
- type: map_at_1000
value: 13.474
- type: map_at_3
value: 2.617
- type: map_at_5
value: 4.329000000000001
- type: mrr_at_1
value: 20.408
- type: mrr_at_10
value: 30.764000000000003
- type: mrr_at_100
value: 32.457
- type: mrr_at_1000
value: 32.481
- type: mrr_at_3
value: 26.531
- type: mrr_at_5
value: 28.877999999999997
- type: ndcg_at_1
value: 18.367
- type: ndcg_at_10
value: 17.471999999999998
- type: ndcg_at_100
value: 29.341
- type: ndcg_at_1000
value: 41.005
- type: ndcg_at_3
value: 14.64
- type: ndcg_at_5
value: 17.039
- type: precision_at_1
value: 20.408
- type: precision_at_10
value: 17.551
- type: precision_at_100
value: 6.673
- type: precision_at_1000
value: 1.4160000000000001
- type: precision_at_3
value: 14.966
- type: precision_at_5
value: 18.776
- type: recall_at_1
value: 1.672
- type: recall_at_10
value: 12.795000000000002
- type: recall_at_100
value: 41.289
- type: recall_at_1000
value: 76.947
- type: recall_at_3
value: 3.334
- type: recall_at_5
value: 6.864000000000001
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.3424
- type: ap
value: 13.45149708639965
- type: f1
value: 53.278180518373574
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 57.60045274476513
- type: f1
value: 57.9395926195531
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 36.649067825169446
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 83.68599868868093
- type: cos_sim_ap
value: 65.7938550603812
- type: cos_sim_f1
value: 61.81946735800141
- type: cos_sim_precision
value: 55.85604770017035
- type: cos_sim_recall
value: 69.2084432717678
- type: dot_accuracy
value: 82.09453418370389
- type: dot_ap
value: 61.00867337905922
- type: dot_f1
value: 58.56196783349101
- type: dot_precision
value: 53.06472353193313
- type: dot_recall
value: 65.32981530343008
- type: euclidean_accuracy
value: 83.68599868868093
- type: euclidean_ap
value: 66.17065796133883
- type: euclidean_f1
value: 62.440610152538135
- type: euclidean_precision
value: 59.3393536121673
- type: euclidean_recall
value: 65.88390501319262
- type: manhattan_accuracy
value: 83.57870894677237
- type: manhattan_ap
value: 65.89925640001532
- type: manhattan_f1
value: 62.2255119664446
- type: manhattan_precision
value: 58.43373493975904
- type: manhattan_recall
value: 66.54353562005278
- type: max_accuracy
value: 83.68599868868093
- type: max_ap
value: 66.17065796133883
- type: max_f1
value: 62.440610152538135
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 87.68579966623976
- type: cos_sim_ap
value: 83.2666595805096
- type: cos_sim_f1
value: 75.11536297129996
- type: cos_sim_precision
value: 73.24943294065999
- type: cos_sim_recall
value: 77.07884200800738
- type: dot_accuracy
value: 86.76213761788334
- type: dot_ap
value: 80.85199640255004
- type: dot_f1
value: 73.27634898520165
- type: dot_precision
value: 71.70756872282409
- type: dot_recall
value: 74.91530643671081
- type: euclidean_accuracy
value: 87.79640625606395
- type: euclidean_ap
value: 83.52666327503474
- type: euclidean_f1
value: 75.37022886875523
- type: euclidean_precision
value: 71.4522249051397
- type: euclidean_recall
value: 79.74283954419464
- type: manhattan_accuracy
value: 87.80804905499282
- type: manhattan_ap
value: 83.4995899990913
- type: manhattan_f1
value: 75.44320420223242
- type: manhattan_precision
value: 71.68307223069458
- type: manhattan_recall
value: 79.6196489066831
- type: max_accuracy
value: 87.80804905499282
- type: max_ap
value: 83.52666327503474
- type: max_f1
value: 75.44320420223242
---
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
d0rj/e5-large-en-ru | d0rj | sentence-similarity | [
"transformers",
"pytorch",
"safetensors",
"xlm-roberta",
"feature-extraction",
"mteb",
"retrieval",
"retriever",
"pruned",
"e5",
"sentence-transformers",
"sentence-similarity",
"en",
"ru",
"license:mit",
"model-index",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2023-09-18T14:44:07 | 2023-09-21T13:05:05 | 1,439 | 9 | ---
language:
- en
- ru
library_name: transformers
license: mit
metrics:
- accuracy
- f1
- recall
pipeline_tag: sentence-similarity
tags:
- mteb
- retrieval
- retriever
- pruned
- e5
- sentence-transformers
- feature-extraction
- sentence-similarity
model-index:
- name: e5-large-en-ru
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 79.5671641791045
- type: ap
value: 44.011060753169424
- type: f1
value: 73.76504135120175
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 57.69669466706412
- type: mrr
value: 70.61370531592138
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 86.36465960226795
- type: cos_sim_spearman
value: 84.57602350761223
- type: euclidean_pearson
value: 84.31391364490506
- type: euclidean_spearman
value: 84.57602350761223
- type: manhattan_pearson
value: 84.15796224236456
- type: manhattan_spearman
value: 84.3645729064343
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.105698873583098
- type: mrr
value: 32.163780846856206
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.75973907678062
- type: cos_sim_spearman
value: 80.54994608351296
- type: euclidean_pearson
value: 80.58496551316748
- type: euclidean_spearman
value: 80.54993996457814
- type: manhattan_pearson
value: 80.49280884070782
- type: manhattan_spearman
value: 80.41230093993471
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.345503928209
- type: cos_sim_spearman
value: 80.4634619001261
- type: euclidean_pearson
value: 84.2666575030677
- type: euclidean_spearman
value: 80.46347579495351
- type: manhattan_pearson
value: 84.14370038922885
- type: manhattan_spearman
value: 80.36565043629274
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 75.14644787456163
- type: cos_sim_spearman
value: 75.88443166051762
- type: euclidean_pearson
value: 76.19117255044588
- type: euclidean_spearman
value: 75.88443166051762
- type: manhattan_pearson
value: 76.00450128624708
- type: manhattan_spearman
value: 75.69943934692938
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 77.60763524019471
- type: cos_sim_spearman
value: 77.2591077818027
- type: euclidean_pearson
value: 77.14021401348042
- type: euclidean_spearman
value: 77.25911027186999
- type: manhattan_pearson
value: 76.87139081109731
- type: manhattan_spearman
value: 76.98379627773018
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.18321035966198
- type: cos_sim_spearman
value: 89.0469892725742
- type: euclidean_pearson
value: 88.05085809092137
- type: euclidean_spearman
value: 89.04698194601134
- type: manhattan_pearson
value: 88.03620967628684
- type: manhattan_spearman
value: 89.02859425307943
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.39166503459249
- type: cos_sim_spearman
value: 83.71826060604693
- type: euclidean_pearson
value: 82.70145770530107
- type: euclidean_spearman
value: 83.71826045549452
- type: manhattan_pearson
value: 82.56870669205291
- type: manhattan_spearman
value: 83.55353737670136
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.58290721169323
- type: cos_sim_spearman
value: 89.25956993522081
- type: euclidean_pearson
value: 89.4716703635447
- type: euclidean_spearman
value: 89.25956993522081
- type: manhattan_pearson
value: 89.4475864648432
- type: manhattan_spearman
value: 89.14694174575615
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 81.4879065181404
- type: mrr
value: 94.81295937178291
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.73960396039604
- type: cos_sim_ap
value: 92.70840767967965
- type: cos_sim_f1
value: 86.90890990542557
- type: cos_sim_precision
value: 86.5213082259663
- type: cos_sim_recall
value: 87.3
- type: dot_accuracy
value: 99.73960396039604
- type: dot_ap
value: 92.70828452993575
- type: dot_f1
value: 86.90890990542557
- type: dot_precision
value: 86.5213082259663
- type: dot_recall
value: 87.3
- type: euclidean_accuracy
value: 99.73960396039604
- type: euclidean_ap
value: 92.7084093403562
- type: euclidean_f1
value: 86.90890990542557
- type: euclidean_precision
value: 86.5213082259663
- type: euclidean_recall
value: 87.3
- type: manhattan_accuracy
value: 99.74059405940594
- type: manhattan_ap
value: 92.7406819850299
- type: manhattan_f1
value: 87.01234567901234
- type: manhattan_precision
value: 85.95121951219512
- type: manhattan_recall
value: 88.1
- type: max_accuracy
value: 99.74059405940594
- type: max_ap
value: 92.7406819850299
- type: max_f1
value: 87.01234567901234
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 48.566931484512196
- type: mrr
value: 49.23111100500807
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.27287357692079
- type: cos_sim_ap
value: 74.20855854505362
- type: cos_sim_f1
value: 69.09903201787044
- type: cos_sim_precision
value: 65.22961574507966
- type: cos_sim_recall
value: 73.45646437994723
- type: dot_accuracy
value: 86.27287357692079
- type: dot_ap
value: 74.20853189774614
- type: dot_f1
value: 69.09903201787044
- type: dot_precision
value: 65.22961574507966
- type: dot_recall
value: 73.45646437994723
- type: euclidean_accuracy
value: 86.27287357692079
- type: euclidean_ap
value: 74.20857455896677
- type: euclidean_f1
value: 69.09903201787044
- type: euclidean_precision
value: 65.22961574507966
- type: euclidean_recall
value: 73.45646437994723
- type: manhattan_accuracy
value: 86.2192287059665
- type: manhattan_ap
value: 74.0513280969461
- type: manhattan_f1
value: 69.13344473621389
- type: manhattan_precision
value: 63.12118570183086
- type: manhattan_recall
value: 76.41160949868075
- type: max_accuracy
value: 86.27287357692079
- type: max_ap
value: 74.20857455896677
- type: max_f1
value: 69.13344473621389
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.16055419722902
- type: cos_sim_ap
value: 86.03614264194854
- type: cos_sim_f1
value: 78.89855695205357
- type: cos_sim_precision
value: 73.74656938215409
- type: cos_sim_recall
value: 84.82445334154605
- type: dot_accuracy
value: 89.16055419722902
- type: dot_ap
value: 86.03614225282097
- type: dot_f1
value: 78.89855695205357
- type: dot_precision
value: 73.74656938215409
- type: dot_recall
value: 84.82445334154605
- type: euclidean_accuracy
value: 89.16055419722902
- type: euclidean_ap
value: 86.0361548355667
- type: euclidean_f1
value: 78.89855695205357
- type: euclidean_precision
value: 73.74656938215409
- type: euclidean_recall
value: 84.82445334154605
- type: manhattan_accuracy
value: 89.11786393448985
- type: manhattan_ap
value: 86.00799361972808
- type: manhattan_f1
value: 78.84721152788472
- type: manhattan_precision
value: 75.26776338816941
- type: manhattan_recall
value: 82.78410840776101
- type: max_accuracy
value: 89.16055419722902
- type: max_ap
value: 86.0361548355667
- type: max_f1
value: 78.89855695205357
---
# E5-large-en-ru
## Model info
This is a vocabulary-pruned version of [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large).
It keeps only Russian and English tokens.
### Size
| | intfloat/multilingual-e5-large | d0rj/e5-large-en-ru |
| --- | --- | --- |
| Model size (MB) | 2135.82 | 1394.8 |
| Params (count) | 559,890,946 | 365,638,146 |
| Word embedding params (count) | 256,002,048 | 61,749,248 |
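As a quick sanity check (not from the original card, just arithmetic on the rows above): pruning only removes rows from the word-embedding matrix, so the drop in total parameters should equal the drop in embedding parameters.

```python
# Quick arithmetic check on the size table above (values copied from it).
orig_params, pruned_params = 559_890_946, 365_638_146
orig_emb_params, pruned_emb_params = 256_002_048, 61_749_248

# The parameter saving comes entirely from dropped embedding rows.
assert orig_params - pruned_params == orig_emb_params - pruned_emb_params  # 194,252,800

print(f"{(orig_emb_params - pruned_emb_params) / orig_params:.1%} of parameters removed")
# -> 34.7% of parameters removed
```

In other words, roughly a third of the original model's parameters were vocabulary embeddings for languages other than English and Russian.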
### Performance
Performance on the SberQuAD dev benchmark is essentially unchanged.
| Metric on SberQuAD (4122 questions) | intfloat/multilingual-e5-large | d0rj/e5-large-en-ru |
| --- | --- | --- |
| recall@3 | 0.787239204269772 | **0.7882096069868996** |
| map@3 | 0.7230713245997101 | **0.723192624939351** |
| mrr@3 | 0.7241630276564784 | **0.7243651948892132** |
| recall@5 | 0.8277535177098496 | **0.8284813197476953** |
| map@5 | 0.7301603186155587 | **0.7302573588872716** |
| mrr@5 | 0.7334667637069385 | **0.7335718906679607** |
| recall@10 | **0.8716642406598738** | 0.871421639980592 |
| map@10 | **0.7314774917730316** | 0.7313000338687417 |
| mrr@10 | **0.7392223685527911** | 0.7391814537556898 |
## Usage
- Use **dot product** distance for retrieval.
- Use the "query: " and "passage: " prefixes, respectively, for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval.
- Use the "query: " prefix for symmetric tasks such as semantic similarity, bitext mining, and paraphrase retrieval.
- Use the "query: " prefix when using embeddings as features, e.g. for linear probing classification or clustering.
### transformers
#### Direct usage
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import XLMRobertaTokenizer, XLMRobertaModel
# Mean-pool the last hidden states over non-padding tokens
def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
input_texts = [
'query: How does a corporate website differ from a business card website?',
'query: Где был создан первый троллейбус?',
'passage: The first trolleybus was created in Germany by engineer Werner von Siemens, probably influenced by the idea of his brother, Dr. Wilhelm Siemens, who lived in England, expressed on May 18, 1881 at the twenty-second meeting of the Royal Scientific Society. The electrical circuit was carried out by an eight-wheeled cart (Kontaktwagen) rolling along two parallel contact wires. The wires were located quite close to each other, and in strong winds they often overlapped, which led to short circuits. An experimental trolleybus line with a length of 540 m (591 yards), opened by Siemens & Halske in the Berlin suburb of Halensee, operated from April 29 to June 13, 1882.',
'passage: Корпоративный сайт — содержит полную информацию о компании-владельце, услугах/продукции, событиях в жизни компании. Отличается от сайта-визитки и представительского сайта полнотой представленной информации, зачастую содержит различные функциональные инструменты для работы с контентом (поиск и фильтры, календари событий, фотогалереи, корпоративные блоги, форумы). Может быть интегрирован с внутренними информационными системами компании-владельца (КИС, CRM, бухгалтерскими системами). Может содержать закрытые разделы для тех или иных групп пользователей — сотрудников, дилеров, контрагентов и пр.',
]
tokenizer = XLMRobertaTokenizer.from_pretrained('d0rj/e5-large-en-ru', use_cache=False)
model = XLMRobertaModel.from_pretrained('d0rj/e5-large-en-ru', use_cache=False)

# Tokenize, encode and mean-pool into one normalised vector per input text
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)

# Query-passage similarity scores (queries are rows, passages are columns)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
# [[68.59542846679688, 81.75910949707031], [80.36100769042969, 64.77748107910156]]
```
#### Pipeline
```python
from transformers import pipeline

pipe = pipeline('feature-extraction', model='d0rj/e5-large-en-ru')
# input_texts is the same prefixed list as in the example above
embeddings = pipe(input_texts, return_tensors=True)
embeddings[0].size()
# torch.Size([1, 17, 1024])
```
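The pipeline returns token-level embeddings of shape `[1, seq_len, 1024]`, so getting a sentence vector still requires pooling. A minimal sketch of pooling the pipeline output (this step is an addition here, not part of the original usage example; it assumes each text is processed individually, so there is no padding to mask out):

```python
import torch
import torch.nn.functional as F

# Mean-pool each [1, seq_len, 1024] output into a single normalised sentence vector
sentence_embeddings = torch.cat([t.mean(dim=1) for t in embeddings], dim=0)
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print(sentence_embeddings.size())  # torch.Size([len(input_texts), 1024])
```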
### sentence-transformers
```python
from sentence_transformers import SentenceTransformer
sentences = [
'query: Что такое круглые тензоры?',
'passage: Abstract: we introduce a novel method for compressing round tensors based on their inherent radial symmetry. We start by generalising PCA and eigen decomposition on round tensors...',
]
model = SentenceTransformer('d0rj/e5-large-en-ru')
embeddings = model.encode(sentences, convert_to_tensor=True)
embeddings.size()
# torch.Size([2, 1024])
``` | [
"SEMANTIC_SIMILARITY"
] | [
"BIOSSES"
] |
jxm/cde-small-v1 | jxm | feature-extraction | [
"sentence-transformers",
"safetensors",
"feature-extraction",
"mteb",
"transformers",
"custom_code",
"arxiv:2410.02525",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-09-24T03:24:53 | 2025-01-21T15:13:14 | 1,400 | 285 | ---
tags:
- mteb
- transformers
- sentence-transformers
new_version: jxm/cde-small-v2
model-index:
- name: cde-small-v1
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 87.02985074626866
- type: ap
value: 56.706190238632956
- type: ap_weighted
value: 56.706190238632956
- type: f1
value: 81.93161953007674
- type: f1_weighted
value: 87.7650174177188
- type: main_score
value: 87.02985074626866
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification (default)
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 94.664175
- type: ap
value: 91.68668057762052
- type: ap_weighted
value: 91.68668057762052
- type: f1
value: 94.65859470333152
- type: f1_weighted
value: 94.65859470333152
- type: main_score
value: 94.664175
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 55.762
- type: f1
value: 55.06427827477677
- type: f1_weighted
value: 55.06427827477677
- type: main_score
value: 55.762
- task:
type: Retrieval
dataset:
name: MTEB ArguAna (default)
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: main_score
value: 71.99600000000001
- type: map_at_1
value: 49.004
- type: map_at_10
value: 64.741
- type: map_at_100
value: 65.045
- type: map_at_1000
value: 65.048
- type: map_at_20
value: 64.999
- type: map_at_3
value: 61.344
- type: map_at_5
value: 63.595
- type: mrr_at_1
value: 50.71123755334281
- type: mrr_at_10
value: 65.32688703741336
- type: mrr_at_100
value: 65.63793917015693
- type: mrr_at_1000
value: 65.64038101143724
- type: mrr_at_20
value: 65.59178002869953
- type: mrr_at_3
value: 61.960644855381695
- type: mrr_at_5
value: 64.12636320531058
- type: nauc_map_at_1000_diff1
value: 15.961240220366024
- type: nauc_map_at_1000_max
value: -7.44765810583741
- type: nauc_map_at_1000_std
value: -17.07167824225605
- type: nauc_map_at_100_diff1
value: 15.965616911760689
- type: nauc_map_at_100_max
value: -7.440609797442297
- type: nauc_map_at_100_std
value: -17.069175070766125
- type: nauc_map_at_10_diff1
value: 16.0053641689455
- type: nauc_map_at_10_max
value: -7.292003400856069
- type: nauc_map_at_10_std
value: -17.21891231777586
- type: nauc_map_at_1_diff1
value: 16.775859614223965
- type: nauc_map_at_1_max
value: -10.812150486389175
- type: nauc_map_at_1_std
value: -18.447209756110635
- type: nauc_map_at_20_diff1
value: 16.00477985164213
- type: nauc_map_at_20_max
value: -7.344399709169316
- type: nauc_map_at_20_std
value: -17.011815937847548
- type: nauc_map_at_3_diff1
value: 15.730294091913994
- type: nauc_map_at_3_max
value: -7.13902722192326
- type: nauc_map_at_3_std
value: -16.846251134000045
- type: nauc_map_at_5_diff1
value: 15.952653874864062
- type: nauc_map_at_5_max
value: -6.730509527119155
- type: nauc_map_at_5_std
value: -16.586379153220353
- type: nauc_mrr_at_1000_diff1
value: 10.221278338563085
- type: nauc_mrr_at_1000_max
value: -10.513831642963527
- type: nauc_mrr_at_1000_std
value: -16.340880407651863
- type: nauc_mrr_at_100_diff1
value: 10.226217465992063
- type: nauc_mrr_at_100_max
value: -10.506478667638874
- type: nauc_mrr_at_100_std
value: -16.33847358633176
- type: nauc_mrr_at_10_diff1
value: 10.293491655887369
- type: nauc_mrr_at_10_max
value: -10.357229664747909
- type: nauc_mrr_at_10_std
value: -16.496874845739885
- type: nauc_mrr_at_1_diff1
value: 12.049863016253427
- type: nauc_mrr_at_1_max
value: -11.968579522299635
- type: nauc_mrr_at_1_std
value: -16.65245790056632
- type: nauc_mrr_at_20_diff1
value: 10.276109067921565
- type: nauc_mrr_at_20_max
value: -10.404100283652397
- type: nauc_mrr_at_20_std
value: -16.282098762560164
- type: nauc_mrr_at_3_diff1
value: 10.338008940592475
- type: nauc_mrr_at_3_max
value: -10.123508259477648
- type: nauc_mrr_at_3_std
value: -16.218834894850918
- type: nauc_mrr_at_5_diff1
value: 10.114375457049043
- type: nauc_mrr_at_5_max
value: -9.987361588255437
- type: nauc_mrr_at_5_std
value: -15.723897501895118
- type: nauc_ndcg_at_1000_diff1
value: 16.00889445347496
- type: nauc_ndcg_at_1000_max
value: -6.746746500535893
- type: nauc_ndcg_at_1000_std
value: -16.567047531839382
- type: nauc_ndcg_at_100_diff1
value: 16.10719535312808
- type: nauc_ndcg_at_100_max
value: -6.59354665730934
- type: nauc_ndcg_at_100_std
value: -16.513298001700566
- type: nauc_ndcg_at_10_diff1
value: 16.396485814351973
- type: nauc_ndcg_at_10_max
value: -5.7111859345525895
- type: nauc_ndcg_at_10_std
value: -17.13416103510026
- type: nauc_ndcg_at_1_diff1
value: 16.775859614223965
- type: nauc_ndcg_at_1_max
value: -10.812150486389175
- type: nauc_ndcg_at_1_std
value: -18.447209756110635
- type: nauc_ndcg_at_20_diff1
value: 16.414235526534497
- type: nauc_ndcg_at_20_max
value: -5.890463457153039
- type: nauc_ndcg_at_20_std
value: -16.124783371499017
- type: nauc_ndcg_at_3_diff1
value: 15.683431770601713
- type: nauc_ndcg_at_3_max
value: -5.546675513691499
- type: nauc_ndcg_at_3_std
value: -15.973244504586676
- type: nauc_ndcg_at_5_diff1
value: 16.193847874581166
- type: nauc_ndcg_at_5_max
value: -4.471638454091411
- type: nauc_ndcg_at_5_std
value: -15.517824617814629
- type: nauc_precision_at_1000_diff1
value: 3.170440311533737
- type: nauc_precision_at_1000_max
value: 25.521992526080666
- type: nauc_precision_at_1000_std
value: 68.4373013145641
- type: nauc_precision_at_100_diff1
value: 30.283338663457897
- type: nauc_precision_at_100_max
value: 44.33747104624998
- type: nauc_precision_at_100_std
value: 42.28887350925609
- type: nauc_precision_at_10_diff1
value: 23.390956301235633
- type: nauc_precision_at_10_max
value: 15.468288261126773
- type: nauc_precision_at_10_std
value: -18.2942744669977
- type: nauc_precision_at_1_diff1
value: 16.775859614223965
- type: nauc_precision_at_1_max
value: -10.812150486389175
- type: nauc_precision_at_1_std
value: -18.447209756110635
- type: nauc_precision_at_20_diff1
value: 37.14254275219614
- type: nauc_precision_at_20_max
value: 46.984729023754824
- type: nauc_precision_at_20_std
value: 22.763524786900717
- type: nauc_precision_at_3_diff1
value: 15.651406928218881
- type: nauc_precision_at_3_max
value: 0.7775458885343681
- type: nauc_precision_at_3_std
value: -12.438132482295773
- type: nauc_precision_at_5_diff1
value: 18.10074574210355
- type: nauc_precision_at_5_max
value: 9.373350504221532
- type: nauc_precision_at_5_std
value: -9.13125987784625
- type: nauc_recall_at_1000_diff1
value: 3.1704403115262325
- type: nauc_recall_at_1000_max
value: 25.521992526077756
- type: nauc_recall_at_1000_std
value: 68.4373013145603
- type: nauc_recall_at_100_diff1
value: 30.283338663455616
- type: nauc_recall_at_100_max
value: 44.337471046250556
- type: nauc_recall_at_100_std
value: 42.28887350925341
- type: nauc_recall_at_10_diff1
value: 23.390956301235168
- type: nauc_recall_at_10_max
value: 15.468288261126578
- type: nauc_recall_at_10_std
value: -18.294274466997873
- type: nauc_recall_at_1_diff1
value: 16.775859614223965
- type: nauc_recall_at_1_max
value: -10.812150486389175
- type: nauc_recall_at_1_std
value: -18.447209756110635
- type: nauc_recall_at_20_diff1
value: 37.14254275219513
- type: nauc_recall_at_20_max
value: 46.98472902375421
- type: nauc_recall_at_20_std
value: 22.763524786899644
- type: nauc_recall_at_3_diff1
value: 15.65140692821902
- type: nauc_recall_at_3_max
value: 0.7775458885343522
- type: nauc_recall_at_3_std
value: -12.43813248229578
- type: nauc_recall_at_5_diff1
value: 18.10074574210355
- type: nauc_recall_at_5_max
value: 9.373350504221595
- type: nauc_recall_at_5_std
value: -9.131259877846116
- type: ndcg_at_1
value: 49.004
- type: ndcg_at_10
value: 71.99600000000001
- type: ndcg_at_100
value: 73.173
- type: ndcg_at_1000
value: 73.214
- type: ndcg_at_20
value: 72.91
- type: ndcg_at_3
value: 65.21900000000001
- type: ndcg_at_5
value: 69.284
- type: precision_at_1
value: 49.004
- type: precision_at_10
value: 9.452
- type: precision_at_100
value: 0.9939999999999999
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.904
- type: precision_at_3
value: 25.462
- type: precision_at_5
value: 17.255000000000003
- type: recall_at_1
value: 49.004
- type: recall_at_10
value: 94.523
- type: recall_at_100
value: 99.36
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 98.08
- type: recall_at_3
value: 76.387
- type: recall_at_5
value: 86.273
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P (default)
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: main_score
value: 48.629569816593516
- type: v_measure
value: 48.629569816593516
- type: v_measure_std
value: 14.01810149072028
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S (default)
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: main_score
value: 40.52366904677561
- type: v_measure
value: 40.52366904677561
- type: v_measure_std
value: 14.375876773823757
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions (default)
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: main_score
value: 61.27347206107508
- type: map
value: 61.27347206107508
- type: mrr
value: 74.49105219188321
- type: nAUC_map_diff1
value: 13.442645655149457
- type: nAUC_map_max
value: 25.013363268430027
- type: nAUC_map_std
value: 17.60175231611674
- type: nAUC_mrr_diff1
value: 25.217675209249435
- type: nAUC_mrr_max
value: 32.37381560372622
- type: nAUC_mrr_std
value: 22.584922632508412
- task:
type: STS
dataset:
name: MTEB BIOSSES (default)
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cosine_pearson
value: 89.09452267906886
- type: cosine_spearman
value: 86.73450642504955
- type: euclidean_pearson
value: 87.1275130552617
- type: euclidean_spearman
value: 86.93812552248012
- type: main_score
value: 86.73450642504955
- type: manhattan_pearson
value: 86.79403606129864
- type: manhattan_spearman
value: 86.76824213349957
- type: pearson
value: 89.09452267906886
- type: spearman
value: 86.73450642504955
- task:
type: Classification
dataset:
name: MTEB Banking77Classification (default)
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.58116883116884
- type: f1
value: 88.54536316207125
- type: f1_weighted
value: 88.54536316207125
- type: main_score
value: 88.58116883116884
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P (default)
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: main_score
value: 44.89554099528695
- type: v_measure
value: 44.89554099528695
- type: v_measure_std
value: 0.6101675839696261
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S (default)
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: main_score
value: 37.89775676199564
- type: v_measure
value: 37.89775676199564
- type: v_measure_std
value: 0.6980439644171996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval (default)
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: main_score
value: 49.239
- type: map_at_1
value: 31.407
- type: map_at_10
value: 42.788
- type: map_at_100
value: 44.163999999999994
- type: map_at_1000
value: 44.285000000000004
- type: map_at_20
value: 43.531
- type: map_at_3
value: 39.381
- type: map_at_5
value: 41.296
- type: mrr_at_1
value: 38.91273247496424
- type: mrr_at_10
value: 48.82553307446011
- type: mrr_at_100
value: 49.5278584841276
- type: mrr_at_1000
value: 49.56897938168851
- type: mrr_at_20
value: 49.27034318525701
- type: mrr_at_3
value: 46.423462088698145
- type: mrr_at_5
value: 47.83261802575108
- type: nauc_map_at_1000_diff1
value: 51.50772644391144
- type: nauc_map_at_1000_max
value: 39.57698592158747
- type: nauc_map_at_1000_std
value: -5.092734127689174
- type: nauc_map_at_100_diff1
value: 51.51650908644926
- type: nauc_map_at_100_max
value: 39.579607215550325
- type: nauc_map_at_100_std
value: -5.112306014245407
- type: nauc_map_at_10_diff1
value: 51.80732269410239
- type: nauc_map_at_10_max
value: 39.312012392020854
- type: nauc_map_at_10_std
value: -5.844192947783184
- type: nauc_map_at_1_diff1
value: 58.51885994004338
- type: nauc_map_at_1_max
value: 35.306905646597656
- type: nauc_map_at_1_std
value: -6.4627870729629455
- type: nauc_map_at_20_diff1
value: 51.560698537725294
- type: nauc_map_at_20_max
value: 39.40865218451427
- type: nauc_map_at_20_std
value: -5.46140640509653
- type: nauc_map_at_3_diff1
value: 52.845784777873305
- type: nauc_map_at_3_max
value: 38.55976877563459
- type: nauc_map_at_3_std
value: -5.72430771104222
- type: nauc_map_at_5_diff1
value: 52.29343919325049
- type: nauc_map_at_5_max
value: 38.98194700024613
- type: nauc_map_at_5_std
value: -6.062278166282727
- type: nauc_mrr_at_1000_diff1
value: 48.824012243253904
- type: nauc_mrr_at_1000_max
value: 40.36119735345816
- type: nauc_mrr_at_1000_std
value: -4.371172318529068
- type: nauc_mrr_at_100_diff1
value: 48.80142209066577
- type: nauc_mrr_at_100_max
value: 40.35371141231279
- type: nauc_mrr_at_100_std
value: -4.382000140837231
- type: nauc_mrr_at_10_diff1
value: 48.89408963706152
- type: nauc_mrr_at_10_max
value: 40.48043029859513
- type: nauc_mrr_at_10_std
value: -4.5927306729163835
- type: nauc_mrr_at_1_diff1
value: 53.18491414251319
- type: nauc_mrr_at_1_max
value: 38.43746618754316
- type: nauc_mrr_at_1_std
value: -6.2489159406458965
- type: nauc_mrr_at_20_diff1
value: 48.763867640789634
- type: nauc_mrr_at_20_max
value: 40.369114351255135
- type: nauc_mrr_at_20_std
value: -4.400065130027329
- type: nauc_mrr_at_3_diff1
value: 48.87375252127912
- type: nauc_mrr_at_3_max
value: 40.810763259212116
- type: nauc_mrr_at_3_std
value: -3.4938483699692657
- type: nauc_mrr_at_5_diff1
value: 49.186967577714285
- type: nauc_mrr_at_5_max
value: 40.48882253846611
- type: nauc_mrr_at_5_std
value: -4.621076155915746
- type: nauc_ndcg_at_1000_diff1
value: 49.24642669558249
- type: nauc_ndcg_at_1000_max
value: 41.00404222082434
- type: nauc_ndcg_at_1000_std
value: -2.7356065308278392
- type: nauc_ndcg_at_100_diff1
value: 48.92939354546236
- type: nauc_ndcg_at_100_max
value: 40.972699158281586
- type: nauc_ndcg_at_100_std
value: -3.0561983632108776
- type: nauc_ndcg_at_10_diff1
value: 49.60179215238792
- type: nauc_ndcg_at_10_max
value: 40.89678771623847
- type: nauc_ndcg_at_10_std
value: -5.096633756025252
- type: nauc_ndcg_at_1_diff1
value: 53.18491414251319
- type: nauc_ndcg_at_1_max
value: 38.43746618754316
- type: nauc_ndcg_at_1_std
value: -6.2489159406458965
- type: nauc_ndcg_at_20_diff1
value: 48.826483305583984
- type: nauc_ndcg_at_20_max
value: 40.592200374154466
- type: nauc_ndcg_at_20_std
value: -4.185196398682058
- type: nauc_ndcg_at_3_diff1
value: 49.9798291819845
- type: nauc_ndcg_at_3_max
value: 40.50211559049151
- type: nauc_ndcg_at_3_std
value: -3.9606100546649
- type: nauc_ndcg_at_5_diff1
value: 50.222364976292454
- type: nauc_ndcg_at_5_max
value: 40.477461845726694
- type: nauc_ndcg_at_5_std
value: -5.025922873253527
- type: nauc_precision_at_1000_diff1
value: -24.208256297106363
- type: nauc_precision_at_1000_max
value: -10.21103761078881
- type: nauc_precision_at_1000_std
value: -0.06753142735419307
- type: nauc_precision_at_100_diff1
value: -15.392095697703853
- type: nauc_precision_at_100_max
value: 3.3764259600400375
- type: nauc_precision_at_100_std
value: 7.032273000803224
- type: nauc_precision_at_10_diff1
value: 8.050911372676126
- type: nauc_precision_at_10_max
value: 26.426542125643365
- type: nauc_precision_at_10_std
value: 2.3142807003880423
- type: nauc_precision_at_1_diff1
value: 53.18491414251319
- type: nauc_precision_at_1_max
value: 38.43746618754316
- type: nauc_precision_at_1_std
value: -6.2489159406458965
- type: nauc_precision_at_20_diff1
value: -2.4038370945777605
- type: nauc_precision_at_20_max
value: 18.29255413962441
- type: nauc_precision_at_20_std
value: 6.963786700698579
- type: nauc_precision_at_3_diff1
value: 27.590923102137978
- type: nauc_precision_at_3_max
value: 36.809716569640635
- type: nauc_precision_at_3_std
value: -0.4588749991090731
- type: nauc_precision_at_5_diff1
value: 18.31451430104417
- type: nauc_precision_at_5_max
value: 31.76792278657563
- type: nauc_precision_at_5_std
value: -0.23205753470623663
- type: nauc_recall_at_1000_diff1
value: 38.6186488416617
- type: nauc_recall_at_1000_max
value: 58.02448766170835
- type: nauc_recall_at_1000_std
value: 43.005151313404625
- type: nauc_recall_at_100_diff1
value: 36.14901358957452
- type: nauc_recall_at_100_max
value: 42.97412072448754
- type: nauc_recall_at_100_std
value: 8.434723462734665
- type: nauc_recall_at_10_diff1
value: 42.953316965307245
- type: nauc_recall_at_10_max
value: 40.54865147159118
- type: nauc_recall_at_10_std
value: -4.9425741693714125
- type: nauc_recall_at_1_diff1
value: 58.51885994004338
- type: nauc_recall_at_1_max
value: 35.306905646597656
- type: nauc_recall_at_1_std
value: -6.4627870729629455
- type: nauc_recall_at_20_diff1
value: 38.27628659312007
- type: nauc_recall_at_20_max
value: 39.50607176714142
- type: nauc_recall_at_20_std
value: -1.002089290215587
- type: nauc_recall_at_3_diff1
value: 47.263415527062676
- type: nauc_recall_at_3_max
value: 40.82836525135613
- type: nauc_recall_at_3_std
value: -2.2314232915782504
- type: nauc_recall_at_5_diff1
value: 46.13867315478644
- type: nauc_recall_at_5_max
value: 39.93028001594826
- type: nauc_recall_at_5_std
value: -4.809283400175646
- type: ndcg_at_1
value: 38.913
- type: ndcg_at_10
value: 49.239
- type: ndcg_at_100
value: 54.325
- type: ndcg_at_1000
value: 56.226
- type: ndcg_at_20
value: 51.212999999999994
- type: ndcg_at_3
value: 44.559
- type: ndcg_at_5
value: 46.69
- type: precision_at_1
value: 38.913
- type: precision_at_10
value: 9.227
- type: precision_at_100
value: 1.4909999999999999
- type: precision_at_1000
value: 0.197
- type: precision_at_20
value: 5.494000000000001
- type: precision_at_3
value: 21.65
- type: precision_at_5
value: 15.336
- type: recall_at_1
value: 31.407
- type: recall_at_10
value: 61.961999999999996
- type: recall_at_100
value: 82.993
- type: recall_at_1000
value: 94.887
- type: recall_at_20
value: 68.771
- type: recall_at_3
value: 47.77
- type: recall_at_5
value: 53.895
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval (default)
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: main_score
value: 44.391000000000005
- type: map_at_1
value: 29.157
- type: map_at_10
value: 38.723
- type: map_at_100
value: 39.864
- type: map_at_1000
value: 39.995999999999995
- type: map_at_20
value: 39.287
- type: map_at_3
value: 35.751
- type: map_at_5
value: 37.373
- type: mrr_at_1
value: 36.81528662420382
- type: mrr_at_10
value: 44.82939035486806
- type: mrr_at_100
value: 45.437834419775484
- type: mrr_at_1000
value: 45.48695197590834
- type: mrr_at_20
value: 45.15519263295387
- type: mrr_at_3
value: 42.55838641188959
- type: mrr_at_5
value: 43.87685774946922
- type: nauc_map_at_1000_diff1
value: 51.086880931657944
- type: nauc_map_at_1000_max
value: 36.870501109568856
- type: nauc_map_at_1000_std
value: -9.041748740450098
- type: nauc_map_at_100_diff1
value: 51.13349280885669
- type: nauc_map_at_100_max
value: 36.81376788959824
- type: nauc_map_at_100_std
value: -9.168817557968493
- type: nauc_map_at_10_diff1
value: 51.43767101896258
- type: nauc_map_at_10_max
value: 36.13512723388837
- type: nauc_map_at_10_std
value: -10.340353132146591
- type: nauc_map_at_1_diff1
value: 57.97216876426843
- type: nauc_map_at_1_max
value: 32.093932122348804
- type: nauc_map_at_1_std
value: -12.44326469749823
- type: nauc_map_at_20_diff1
value: 51.35742644989209
- type: nauc_map_at_20_max
value: 36.362008583908754
- type: nauc_map_at_20_std
value: -9.925604455959942
- type: nauc_map_at_3_diff1
value: 52.97191265890149
- type: nauc_map_at_3_max
value: 35.216095114265
- type: nauc_map_at_3_std
value: -11.505843284384989
- type: nauc_map_at_5_diff1
value: 52.13435748405322
- type: nauc_map_at_5_max
value: 35.63014323147684
- type: nauc_map_at_5_std
value: -11.15253714131609
- type: nauc_mrr_at_1000_diff1
value: 49.806361508243526
- type: nauc_mrr_at_1000_max
value: 39.60825242174082
- type: nauc_mrr_at_1000_std
value: -4.581320333963986
- type: nauc_mrr_at_100_diff1
value: 49.794023465886575
- type: nauc_mrr_at_100_max
value: 39.606036503563935
- type: nauc_mrr_at_100_std
value: -4.580524433129927
- type: nauc_mrr_at_10_diff1
value: 49.62511317783946
- type: nauc_mrr_at_10_max
value: 39.524849843022054
- type: nauc_mrr_at_10_std
value: -4.784364837521214
- type: nauc_mrr_at_1_diff1
value: 55.03485605539673
- type: nauc_mrr_at_1_max
value: 38.26074360694823
- type: nauc_mrr_at_1_std
value: -6.990940922024673
- type: nauc_mrr_at_20_diff1
value: 49.77823031843402
- type: nauc_mrr_at_20_max
value: 39.62943812120721
- type: nauc_mrr_at_20_std
value: -4.664971744136187
- type: nauc_mrr_at_3_diff1
value: 50.60933103133387
- type: nauc_mrr_at_3_max
value: 39.920174010377444
- type: nauc_mrr_at_3_std
value: -5.404917304425809
- type: nauc_mrr_at_5_diff1
value: 50.137405938227886
- type: nauc_mrr_at_5_max
value: 39.7046033416223
- type: nauc_mrr_at_5_std
value: -4.9683994219777965
- type: nauc_ndcg_at_1000_diff1
value: 48.26320826156127
- type: nauc_ndcg_at_1000_max
value: 39.11158925773445
- type: nauc_ndcg_at_1000_std
value: -3.958164717220878
- type: nauc_ndcg_at_100_diff1
value: 48.29325255469789
- type: nauc_ndcg_at_100_max
value: 39.00224428862792
- type: nauc_ndcg_at_100_std
value: -4.739309326434606
- type: nauc_ndcg_at_10_diff1
value: 48.62405764367444
- type: nauc_ndcg_at_10_max
value: 38.04015783804633
- type: nauc_ndcg_at_10_std
value: -7.379427256377835
- type: nauc_ndcg_at_1_diff1
value: 55.03485605539673
- type: nauc_ndcg_at_1_max
value: 38.26074360694823
- type: nauc_ndcg_at_1_std
value: -6.990940922024673
- type: nauc_ndcg_at_20_diff1
value: 48.793146636748155
- type: nauc_ndcg_at_20_max
value: 38.188247609309734
- type: nauc_ndcg_at_20_std
value: -6.893163590780488
- type: nauc_ndcg_at_3_diff1
value: 49.72527867128085
- type: nauc_ndcg_at_3_max
value: 38.397771643337876
- type: nauc_ndcg_at_3_std
value: -7.396734926261662
- type: nauc_ndcg_at_5_diff1
value: 49.45897046963514
- type: nauc_ndcg_at_5_max
value: 38.00788817919171
- type: nauc_ndcg_at_5_std
value: -7.98773024373368
- type: nauc_precision_at_1000_diff1
value: -15.203088093712378
- type: nauc_precision_at_1000_max
value: 13.932931359528938
- type: nauc_precision_at_1000_std
value: 28.443903216719125
- type: nauc_precision_at_100_diff1
value: -9.833515062825485
- type: nauc_precision_at_100_max
value: 25.501133048619252
- type: nauc_precision_at_100_std
value: 29.28522368814619
- type: nauc_precision_at_10_diff1
value: 11.048052024883837
- type: nauc_precision_at_10_max
value: 35.12225756686281
- type: nauc_precision_at_10_std
value: 13.549314875239492
- type: nauc_precision_at_1_diff1
value: 55.03485605539673
- type: nauc_precision_at_1_max
value: 38.26074360694823
- type: nauc_precision_at_1_std
value: -6.990940922024673
- type: nauc_precision_at_20_diff1
value: 3.6119660166254564
- type: nauc_precision_at_20_max
value: 31.80991909502872
- type: nauc_precision_at_20_std
value: 19.289172474937768
- type: nauc_precision_at_3_diff1
value: 30.93845075141858
- type: nauc_precision_at_3_max
value: 41.2363485550859
- type: nauc_precision_at_3_std
value: 3.304016059128308
- type: nauc_precision_at_5_diff1
value: 22.383511628600537
- type: nauc_precision_at_5_max
value: 38.3094647733712
- type: nauc_precision_at_5_std
value: 7.010497480008379
- type: nauc_recall_at_1000_diff1
value: 31.611750140993035
- type: nauc_recall_at_1000_max
value: 42.982693130692894
- type: nauc_recall_at_1000_std
value: 25.50352029753317
- type: nauc_recall_at_100_diff1
value: 36.466866132011525
- type: nauc_recall_at_100_max
value: 39.8896195569174
- type: nauc_recall_at_100_std
value: 8.056466272308052
- type: nauc_recall_at_10_diff1
value: 40.55869867748143
- type: nauc_recall_at_10_max
value: 35.35219000254458
- type: nauc_recall_at_10_std
value: -6.935500599977123
- type: nauc_recall_at_1_diff1
value: 57.97216876426843
- type: nauc_recall_at_1_max
value: 32.093932122348804
- type: nauc_recall_at_1_std
value: -12.44326469749823
- type: nauc_recall_at_20_diff1
value: 40.699604166249046
- type: nauc_recall_at_20_max
value: 36.441366652406835
- type: nauc_recall_at_20_std
value: -4.519436682877613
- type: nauc_recall_at_3_diff1
value: 47.15019730046201
- type: nauc_recall_at_3_max
value: 35.1649979105234
- type: nauc_recall_at_3_std
value: -10.908395079450377
- type: nauc_recall_at_5_diff1
value: 44.535088248003156
- type: nauc_recall_at_5_max
value: 34.89949777715303
- type: nauc_recall_at_5_std
value: -10.361237744830412
- type: ndcg_at_1
value: 36.815
- type: ndcg_at_10
value: 44.391000000000005
- type: ndcg_at_100
value: 48.515
- type: ndcg_at_1000
value: 50.76199999999999
- type: ndcg_at_20
value: 45.788000000000004
- type: ndcg_at_3
value: 40.178000000000004
- type: ndcg_at_5
value: 42.045
- type: precision_at_1
value: 36.815
- type: precision_at_10
value: 8.408
- type: precision_at_100
value: 1.343
- type: precision_at_1000
value: 0.182
- type: precision_at_20
value: 4.873
- type: precision_at_3
value: 19.299
- type: precision_at_5
value: 13.758000000000001
- type: recall_at_1
value: 29.157
- type: recall_at_10
value: 54.214
- type: recall_at_100
value: 71.929
- type: recall_at_1000
value: 86.533
- type: recall_at_20
value: 59.421
- type: recall_at_3
value: 41.569
- type: recall_at_5
value: 46.791
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval (default)
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: main_score
value: 59.03699999999999
- type: map_at_1
value: 41.476
- type: map_at_10
value: 53.400000000000006
- type: map_at_100
value: 54.452999999999996
- type: map_at_1000
value: 54.504
- type: map_at_20
value: 54.045
- type: map_at_3
value: 50.153999999999996
- type: map_at_5
value: 52.079
- type: mrr_at_1
value: 46.95924764890282
- type: mrr_at_10
value: 56.68495297805642
- type: mrr_at_100
value: 57.34582096937295
- type: mrr_at_1000
value: 57.37100347158495
- type: mrr_at_20
value: 57.10508892444508
- type: mrr_at_3
value: 54.242424242424235
- type: mrr_at_5
value: 55.76593521421108
- type: nauc_map_at_1000_diff1
value: 53.36527106664
- type: nauc_map_at_1000_max
value: 43.486776333687835
- type: nauc_map_at_1000_std
value: -5.509558143849234
- type: nauc_map_at_100_diff1
value: 53.34097797467696
- type: nauc_map_at_100_max
value: 43.476003610937234
- type: nauc_map_at_100_std
value: -5.520166623777559
- type: nauc_map_at_10_diff1
value: 53.432351035276746
- type: nauc_map_at_10_max
value: 42.75788423195968
- type: nauc_map_at_10_std
value: -6.504192409274652
- type: nauc_map_at_1_diff1
value: 57.34963186677463
- type: nauc_map_at_1_max
value: 36.95146202384373
- type: nauc_map_at_1_std
value: -9.460645936916988
- type: nauc_map_at_20_diff1
value: 53.29779847033195
- type: nauc_map_at_20_max
value: 43.22342023309121
- type: nauc_map_at_20_std
value: -5.953002390034157
- type: nauc_map_at_3_diff1
value: 54.09550124289603
- type: nauc_map_at_3_max
value: 41.09664412682725
- type: nauc_map_at_3_std
value: -8.797917588156473
- type: nauc_map_at_5_diff1
value: 53.47735307728038
- type: nauc_map_at_5_max
value: 42.1420557369995
- type: nauc_map_at_5_std
value: -6.982023249979087
- type: nauc_mrr_at_1000_diff1
value: 53.84548396450655
- type: nauc_mrr_at_1000_max
value: 45.70711475929243
- type: nauc_mrr_at_1000_std
value: -3.572519075485509
- type: nauc_mrr_at_100_diff1
value: 53.831585937143345
- type: nauc_mrr_at_100_max
value: 45.71866605712688
- type: nauc_mrr_at_100_std
value: -3.5531077992494087
- type: nauc_mrr_at_10_diff1
value: 53.77550386915942
- type: nauc_mrr_at_10_max
value: 45.61906078824265
- type: nauc_mrr_at_10_std
value: -3.7647971491069567
- type: nauc_mrr_at_1_diff1
value: 57.59578262230993
- type: nauc_mrr_at_1_max
value: 43.132298775083996
- type: nauc_mrr_at_1_std
value: -6.820570895500843
- type: nauc_mrr_at_20_diff1
value: 53.757844034161984
- type: nauc_mrr_at_20_max
value: 45.67787807420582
- type: nauc_mrr_at_20_std
value: -3.6741549159529816
- type: nauc_mrr_at_3_diff1
value: 54.41366916196891
- type: nauc_mrr_at_3_max
value: 45.48753195460355
- type: nauc_mrr_at_3_std
value: -4.536347261239106
- type: nauc_mrr_at_5_diff1
value: 53.81844478829885
- type: nauc_mrr_at_5_max
value: 45.77186226917752
- type: nauc_mrr_at_5_std
value: -3.560088004877736
- type: nauc_ndcg_at_1000_diff1
value: 52.474274223239945
- type: nauc_ndcg_at_1000_max
value: 45.88297620389939
- type: nauc_ndcg_at_1000_std
value: -2.236689460240769
- type: nauc_ndcg_at_100_diff1
value: 51.99537297728399
- type: nauc_ndcg_at_100_max
value: 46.162105938598245
- type: nauc_ndcg_at_100_std
value: -1.636252027390496
- type: nauc_ndcg_at_10_diff1
value: 51.981635840094334
- type: nauc_ndcg_at_10_max
value: 44.72098290105285
- type: nauc_ndcg_at_10_std
value: -4.26133599970984
- type: nauc_ndcg_at_1_diff1
value: 57.43124530432752
- type: nauc_ndcg_at_1_max
value: 42.987773648572045
- type: nauc_ndcg_at_1_std
value: -6.975930064288375
- type: nauc_ndcg_at_20_diff1
value: 51.709989593496665
- type: nauc_ndcg_at_20_max
value: 45.35511346806507
- type: nauc_ndcg_at_20_std
value: -3.441945043133369
- type: nauc_ndcg_at_3_diff1
value: 52.83956836083957
- type: nauc_ndcg_at_3_max
value: 43.14243257908553
- type: nauc_ndcg_at_3_std
value: -6.906786756066083
- type: nauc_ndcg_at_5_diff1
value: 51.92395247597085
- type: nauc_ndcg_at_5_max
value: 44.28584104560978
- type: nauc_ndcg_at_5_std
value: -4.432556679370336
- type: nauc_precision_at_1000_diff1
value: -10.137271271355312
- type: nauc_precision_at_1000_max
value: 21.053415390964915
- type: nauc_precision_at_1000_std
value: 31.437645188936003
- type: nauc_precision_at_100_diff1
value: -5.869005161223761
- type: nauc_precision_at_100_max
value: 28.74652505762229
- type: nauc_precision_at_100_std
value: 33.42249624017563
- type: nauc_precision_at_10_diff1
value: 14.075300860742587
- type: nauc_precision_at_10_max
value: 36.90717719533496
- type: nauc_precision_at_10_std
value: 15.27522825163519
- type: nauc_precision_at_1_diff1
value: 57.43124530432752
- type: nauc_precision_at_1_max
value: 42.987773648572045
- type: nauc_precision_at_1_std
value: -6.975930064288375
- type: nauc_precision_at_20_diff1
value: 4.831146517476065
- type: nauc_precision_at_20_max
value: 34.600390709037775
- type: nauc_precision_at_20_std
value: 21.879191470976977
- type: nauc_precision_at_3_diff1
value: 33.75586535854295
- type: nauc_precision_at_3_max
value: 41.8963728460937
- type: nauc_precision_at_3_std
value: 0.30853391781218725
- type: nauc_precision_at_5_diff1
value: 23.619374234162443
- type: nauc_precision_at_5_max
value: 40.26315749312306
- type: nauc_precision_at_5_std
value: 9.496779653807806
- type: nauc_recall_at_1000_diff1
value: 39.650899433995065
- type: nauc_recall_at_1000_max
value: 65.95997046182639
- type: nauc_recall_at_1000_std
value: 41.52010213404674
- type: nauc_recall_at_100_diff1
value: 37.021652104886904
- type: nauc_recall_at_100_max
value: 57.901229136609636
- type: nauc_recall_at_100_std
value: 27.173492395498428
- type: nauc_recall_at_10_diff1
value: 44.29968361744853
- type: nauc_recall_at_10_max
value: 44.18295286662639
- type: nauc_recall_at_10_std
value: -1.5721790203147754
- type: nauc_recall_at_1_diff1
value: 57.34963186677463
- type: nauc_recall_at_1_max
value: 36.95146202384373
- type: nauc_recall_at_1_std
value: -9.460645936916988
- type: nauc_recall_at_20_diff1
value: 41.603580598985126
- type: nauc_recall_at_20_max
value: 47.702934198286876
- type: nauc_recall_at_20_std
value: 3.019298754051616
- type: nauc_recall_at_3_diff1
value: 49.02194332102533
- type: nauc_recall_at_3_max
value: 41.38275177493884
- type: nauc_recall_at_3_std
value: -8.055685087264179
- type: nauc_recall_at_5_diff1
value: 45.213060998923496
- type: nauc_recall_at_5_max
value: 43.53976038303946
- type: nauc_recall_at_5_std
value: -1.7312187150046634
- type: ndcg_at_1
value: 47.022000000000006
- type: ndcg_at_10
value: 59.03699999999999
- type: ndcg_at_100
value: 63.077000000000005
- type: ndcg_at_1000
value: 64.098
- type: ndcg_at_20
value: 60.84
- type: ndcg_at_3
value: 53.657999999999994
- type: ndcg_at_5
value: 56.501000000000005
- type: precision_at_1
value: 47.022000000000006
- type: precision_at_10
value: 9.342
- type: precision_at_100
value: 1.2309999999999999
- type: precision_at_1000
value: 0.136
- type: precision_at_20
value: 5.232
- type: precision_at_3
value: 23.552999999999997
- type: precision_at_5
value: 16.250999999999998
- type: recall_at_1
value: 41.476
- type: recall_at_10
value: 72.283
- type: recall_at_100
value: 89.545
- type: recall_at_1000
value: 96.798
- type: recall_at_20
value: 78.84100000000001
- type: recall_at_3
value: 58.114
- type: recall_at_5
value: 65.007
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval (default)
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: main_score
value: 37.673
- type: map_at_1
value: 25.324
- type: map_at_10
value: 33.17
- type: map_at_100
value: 34.095
- type: map_at_1000
value: 34.182
- type: map_at_20
value: 33.654
- type: map_at_3
value: 30.879
- type: map_at_5
value: 32.26
- type: mrr_at_1
value: 27.34463276836158
- type: mrr_at_10
value: 35.2258541834813
- type: mrr_at_100
value: 36.00404498547979
- type: mrr_at_1000
value: 36.07566444493976
- type: mrr_at_20
value: 35.63110644891617
- type: mrr_at_3
value: 32.95668549905838
- type: mrr_at_5
value: 34.25612052730697
- type: nauc_map_at_1000_diff1
value: 46.058990680271485
- type: nauc_map_at_1000_max
value: 28.600543996662374
- type: nauc_map_at_1000_std
value: -3.8218348925653505
- type: nauc_map_at_100_diff1
value: 46.04742556273763
- type: nauc_map_at_100_max
value: 28.58845010683153
- type: nauc_map_at_100_std
value: -3.8241454424665746
- type: nauc_map_at_10_diff1
value: 46.318380971509015
- type: nauc_map_at_10_max
value: 28.445154969629815
- type: nauc_map_at_10_std
value: -4.668418336182435
- type: nauc_map_at_1_diff1
value: 50.84712517695217
- type: nauc_map_at_1_max
value: 24.956820608742856
- type: nauc_map_at_1_std
value: -7.408652214171463
- type: nauc_map_at_20_diff1
value: 46.02082882551024
- type: nauc_map_at_20_max
value: 28.71729950175136
- type: nauc_map_at_20_std
value: -3.8899400482521864
- type: nauc_map_at_3_diff1
value: 47.017578094263065
- type: nauc_map_at_3_max
value: 27.57393258045568
- type: nauc_map_at_3_std
value: -5.578535499711579
- type: nauc_map_at_5_diff1
value: 46.64174901816308
- type: nauc_map_at_5_max
value: 28.12934751037357
- type: nauc_map_at_5_std
value: -4.623605944585039
- type: nauc_mrr_at_1000_diff1
value: 44.80745580850706
- type: nauc_mrr_at_1000_max
value: 30.08660965092525
- type: nauc_mrr_at_1000_std
value: -1.8483739575689273
- type: nauc_mrr_at_100_diff1
value: 44.79929065561873
- type: nauc_mrr_at_100_max
value: 30.068319004487208
- type: nauc_mrr_at_100_std
value: -1.8439865469408845
- type: nauc_mrr_at_10_diff1
value: 45.04202172389592
- type: nauc_mrr_at_10_max
value: 30.006082516512294
- type: nauc_mrr_at_10_std
value: -2.4476357227718673
- type: nauc_mrr_at_1_diff1
value: 49.710330210449705
- type: nauc_mrr_at_1_max
value: 27.652926800227444
- type: nauc_mrr_at_1_std
value: -4.963221847243473
- type: nauc_mrr_at_20_diff1
value: 44.74348822631581
- type: nauc_mrr_at_20_max
value: 30.232310892837866
- type: nauc_mrr_at_20_std
value: -1.8627482467585263
- type: nauc_mrr_at_3_diff1
value: 45.63996732955718
- type: nauc_mrr_at_3_max
value: 29.71071543929027
- type: nauc_mrr_at_3_std
value: -2.9488868732728264
- type: nauc_mrr_at_5_diff1
value: 45.31282418942023
- type: nauc_mrr_at_5_max
value: 29.59225270015164
- type: nauc_mrr_at_5_std
value: -2.571596169990907
- type: nauc_ndcg_at_1000_diff1
value: 43.44153526801899
- type: nauc_ndcg_at_1000_max
value: 30.264809827186745
- type: nauc_ndcg_at_1000_std
value: -0.3673459026557417
- type: nauc_ndcg_at_100_diff1
value: 42.9260780049435
- type: nauc_ndcg_at_100_max
value: 29.971290021267254
- type: nauc_ndcg_at_100_std
value: 0.07223943237736839
- type: nauc_ndcg_at_10_diff1
value: 43.89936991271991
- type: nauc_ndcg_at_10_max
value: 29.883246789724915
- type: nauc_ndcg_at_10_std
value: -2.842441401911265
- type: nauc_ndcg_at_1_diff1
value: 50.14865712693543
- type: nauc_ndcg_at_1_max
value: 27.111609058341863
- type: nauc_ndcg_at_1_std
value: -5.5675174385570925
- type: nauc_ndcg_at_20_diff1
value: 42.84709307426253
- type: nauc_ndcg_at_20_max
value: 30.76378099168594
- type: nauc_ndcg_at_20_std
value: -0.42561135386508475
- type: nauc_ndcg_at_3_diff1
value: 45.4326566931524
- type: nauc_ndcg_at_3_max
value: 28.61889737624481
- type: nauc_ndcg_at_3_std
value: -4.348200281698876
- type: nauc_ndcg_at_5_diff1
value: 44.630092727271034
- type: nauc_ndcg_at_5_max
value: 29.04891878562973
- type: nauc_ndcg_at_5_std
value: -2.8900608482934165
- type: nauc_precision_at_1000_diff1
value: 1.563823692486198
- type: nauc_precision_at_1000_max
value: 18.07524759715147
- type: nauc_precision_at_1000_std
value: 10.75651488435518
- type: nauc_precision_at_100_diff1
value: 15.84032553897459
- type: nauc_precision_at_100_max
value: 26.9982332859951
- type: nauc_precision_at_100_std
value: 13.809307316031362
- type: nauc_precision_at_10_diff1
value: 33.44005568824001
- type: nauc_precision_at_10_max
value: 35.31365313654245
- type: nauc_precision_at_10_std
value: 2.1516208493844817
- type: nauc_precision_at_1_diff1
value: 50.14865712693543
- type: nauc_precision_at_1_max
value: 27.111609058341863
- type: nauc_precision_at_1_std
value: -5.5675174385570925
- type: nauc_precision_at_20_diff1
value: 26.453560867406594
- type: nauc_precision_at_20_max
value: 36.754320258234735
- type: nauc_precision_at_20_std
value: 10.960004664156314
- type: nauc_precision_at_3_diff1
value: 39.5339842087826
- type: nauc_precision_at_3_max
value: 32.43079763654043
- type: nauc_precision_at_3_std
value: -1.1149107052174205
- type: nauc_precision_at_5_diff1
value: 36.75997042257077
- type: nauc_precision_at_5_max
value: 32.936394052992256
- type: nauc_precision_at_5_std
value: 2.253739058194602
- type: nauc_recall_at_1000_diff1
value: 26.620883791876672
- type: nauc_recall_at_1000_max
value: 40.036249354126255
- type: nauc_recall_at_1000_std
value: 24.67019914079094
- type: nauc_recall_at_100_diff1
value: 29.06050311303032
- type: nauc_recall_at_100_max
value: 31.719103788027674
- type: nauc_recall_at_100_std
value: 16.517714390661105
- type: nauc_recall_at_10_diff1
value: 36.292924258716106
- type: nauc_recall_at_10_max
value: 32.02173242085442
- type: nauc_recall_at_10_std
value: 1.016713326361783
- type: nauc_recall_at_1_diff1
value: 50.84712517695217
- type: nauc_recall_at_1_max
value: 24.956820608742856
- type: nauc_recall_at_1_std
value: -7.408652214171463
- type: nauc_recall_at_20_diff1
value: 31.875810510992398
- type: nauc_recall_at_20_max
value: 35.1225435012755
- type: nauc_recall_at_20_std
value: 10.08081240374867
- type: nauc_recall_at_3_diff1
value: 41.31843254728666
- type: nauc_recall_at_3_max
value: 29.083015930837323
- type: nauc_recall_at_3_std
value: -2.6812306676938906
- type: nauc_recall_at_5_diff1
value: 38.74912094651174
- type: nauc_recall_at_5_max
value: 29.713413529317663
- type: nauc_recall_at_5_std
value: 0.6429485746621083
- type: ndcg_at_1
value: 27.232
- type: ndcg_at_10
value: 37.673
- type: ndcg_at_100
value: 42.379
- type: ndcg_at_1000
value: 44.664
- type: ndcg_at_20
value: 39.282000000000004
- type: ndcg_at_3
value: 33.178999999999995
- type: ndcg_at_5
value: 35.481
- type: precision_at_1
value: 27.232
- type: precision_at_10
value: 5.593
- type: precision_at_100
value: 0.845
- type: precision_at_1000
value: 0.108
- type: precision_at_20
value: 3.1809999999999996
- type: precision_at_3
value: 13.898
- type: precision_at_5
value: 9.605
- type: recall_at_1
value: 25.324
- type: recall_at_10
value: 49.66
- type: recall_at_100
value: 71.702
- type: recall_at_1000
value: 88.884
- type: recall_at_20
value: 55.63399999999999
- type: recall_at_3
value: 37.557
- type: recall_at_5
value: 43.086
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval (default)
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: main_score
value: 27.683000000000003
- type: map_at_1
value: 15.440000000000001
- type: map_at_10
value: 22.708000000000002
- type: map_at_100
value: 23.891000000000002
- type: map_at_1000
value: 24.009
- type: map_at_20
value: 23.362
- type: map_at_3
value: 20.173
- type: map_at_5
value: 21.512999999999998
- type: mrr_at_1
value: 19.154228855721392
- type: mrr_at_10
value: 27.14907604832978
- type: mrr_at_100
value: 28.134401799106946
- type: mrr_at_1000
value: 28.210652971960727
- type: mrr_at_20
value: 27.743116715423334
- type: mrr_at_3
value: 24.64759535655058
- type: mrr_at_5
value: 26.0530679933665
- type: nauc_map_at_1000_diff1
value: 26.45225395954919
- type: nauc_map_at_1000_max
value: 18.88821201176001
- type: nauc_map_at_1000_std
value: -6.743073428818526
- type: nauc_map_at_100_diff1
value: 26.46163797092885
- type: nauc_map_at_100_max
value: 18.91020517272631
- type: nauc_map_at_100_std
value: -6.715512753190824
- type: nauc_map_at_10_diff1
value: 25.93830061738008
- type: nauc_map_at_10_max
value: 18.230821464212788
- type: nauc_map_at_10_std
value: -7.723714557953293
- type: nauc_map_at_1_diff1
value: 32.6143819833978
- type: nauc_map_at_1_max
value: 18.229434406703447
- type: nauc_map_at_1_std
value: -8.826503266807608
- type: nauc_map_at_20_diff1
value: 26.267375356189532
- type: nauc_map_at_20_max
value: 18.74372577827996
- type: nauc_map_at_20_std
value: -7.1213741256387495
- type: nauc_map_at_3_diff1
value: 26.502658255222222
- type: nauc_map_at_3_max
value: 17.34676548965769
- type: nauc_map_at_3_std
value: -8.661705532483479
- type: nauc_map_at_5_diff1
value: 25.947975266973
- type: nauc_map_at_5_max
value: 18.26579025252041
- type: nauc_map_at_5_std
value: -7.988152286698193
- type: nauc_mrr_at_1000_diff1
value: 27.43240261182634
- type: nauc_mrr_at_1000_max
value: 19.59851548113691
- type: nauc_mrr_at_1000_std
value: -5.8659045748819505
- type: nauc_mrr_at_100_diff1
value: 27.42860371902458
- type: nauc_mrr_at_100_max
value: 19.61291439961396
- type: nauc_mrr_at_100_std
value: -5.840170365425997
- type: nauc_mrr_at_10_diff1
value: 26.996629286135576
- type: nauc_mrr_at_10_max
value: 19.09125992187832
- type: nauc_mrr_at_10_std
value: -6.401949732007706
- type: nauc_mrr_at_1_diff1
value: 33.20355103883785
- type: nauc_mrr_at_1_max
value: 18.84271700427976
- type: nauc_mrr_at_1_std
value: -6.846362536084065
- type: nauc_mrr_at_20_diff1
value: 27.342295700872445
- type: nauc_mrr_at_20_max
value: 19.59730195635629
- type: nauc_mrr_at_20_std
value: -6.045183866074472
- type: nauc_mrr_at_3_diff1
value: 27.921898978571868
- type: nauc_mrr_at_3_max
value: 19.028747822887816
- type: nauc_mrr_at_3_std
value: -6.651966049443023
- type: nauc_mrr_at_5_diff1
value: 27.280695824148392
- type: nauc_mrr_at_5_max
value: 19.430798343725524
- type: nauc_mrr_at_5_std
value: -6.747383339145715
- type: nauc_ndcg_at_1000_diff1
value: 25.38902736172073
- type: nauc_ndcg_at_1000_max
value: 20.45917423943934
- type: nauc_ndcg_at_1000_std
value: -3.2757947022252076
- type: nauc_ndcg_at_100_diff1
value: 25.732803165259238
- type: nauc_ndcg_at_100_max
value: 20.836040539884642
- type: nauc_ndcg_at_100_std
value: -2.9535785746014396
- type: nauc_ndcg_at_10_diff1
value: 23.946041122415746
- type: nauc_ndcg_at_10_max
value: 18.62752297015455
- type: nauc_ndcg_at_10_std
value: -6.405272980276195
- type: nauc_ndcg_at_1_diff1
value: 33.20355103883785
- type: nauc_ndcg_at_1_max
value: 18.84271700427976
- type: nauc_ndcg_at_1_std
value: -6.846362536084065
- type: nauc_ndcg_at_20_diff1
value: 24.77178243398418
- type: nauc_ndcg_at_20_max
value: 20.27057276120682
- type: nauc_ndcg_at_20_std
value: -4.789054638686646
- type: nauc_ndcg_at_3_diff1
value: 25.93797698971861
- type: nauc_ndcg_at_3_max
value: 17.7626073837572
- type: nauc_ndcg_at_3_std
value: -8.049324539903097
- type: nauc_ndcg_at_5_diff1
value: 24.628424554881647
- type: nauc_ndcg_at_5_max
value: 18.989213649165613
- type: nauc_ndcg_at_5_std
value: -7.173452770970873
- type: nauc_precision_at_1000_diff1
value: 5.456508320365408
- type: nauc_precision_at_1000_max
value: 4.8136815217087205
- type: nauc_precision_at_1000_std
value: 4.947456448109757
- type: nauc_precision_at_100_diff1
value: 16.260577000896543
- type: nauc_precision_at_100_max
value: 16.7039900850556
- type: nauc_precision_at_100_std
value: 9.11227641718042
- type: nauc_precision_at_10_diff1
value: 16.365122567702535
- type: nauc_precision_at_10_max
value: 17.065003280187348
- type: nauc_precision_at_10_std
value: -2.229290931287804
- type: nauc_precision_at_1_diff1
value: 33.20355103883785
- type: nauc_precision_at_1_max
value: 18.84271700427976
- type: nauc_precision_at_1_std
value: -6.846362536084065
- type: nauc_precision_at_20_diff1
value: 16.91214381595962
- type: nauc_precision_at_20_max
value: 19.58308083494222
- type: nauc_precision_at_20_std
value: 2.253335365165219
- type: nauc_precision_at_3_diff1
value: 19.85085379824151
- type: nauc_precision_at_3_max
value: 16.27352732420782
- type: nauc_precision_at_3_std
value: -7.201882607059234
- type: nauc_precision_at_5_diff1
value: 17.966240404329092
- type: nauc_precision_at_5_max
value: 18.231425958226044
- type: nauc_precision_at_5_std
value: -4.043751510938105
- type: nauc_recall_at_1000_diff1
value: 13.957143176090353
- type: nauc_recall_at_1000_max
value: 25.052247631159652
- type: nauc_recall_at_1000_std
value: 17.326355613640054
- type: nauc_recall_at_100_diff1
value: 21.440869340994407
- type: nauc_recall_at_100_max
value: 24.311867728047343
- type: nauc_recall_at_100_std
value: 9.336321796584325
- type: nauc_recall_at_10_diff1
value: 16.696814266222432
- type: nauc_recall_at_10_max
value: 17.145710052014486
- type: nauc_recall_at_10_std
value: -4.135339167818864
- type: nauc_recall_at_1_diff1
value: 32.6143819833978
- type: nauc_recall_at_1_max
value: 18.229434406703447
- type: nauc_recall_at_1_std
value: -8.826503266807608
- type: nauc_recall_at_20_diff1
value: 18.34311797149379
- type: nauc_recall_at_20_max
value: 21.832943514273143
- type: nauc_recall_at_20_std
value: 0.8894706565637946
- type: nauc_recall_at_3_diff1
value: 20.992985988081557
- type: nauc_recall_at_3_max
value: 16.255791972442506
- type: nauc_recall_at_3_std
value: -7.097037821828232
- type: nauc_recall_at_5_diff1
value: 18.60326978035633
- type: nauc_recall_at_5_max
value: 18.615371576760275
- type: nauc_recall_at_5_std
value: -6.049891295196573
- type: ndcg_at_1
value: 19.154
- type: ndcg_at_10
value: 27.683000000000003
- type: ndcg_at_100
value: 33.213
- type: ndcg_at_1000
value: 36.141
- type: ndcg_at_20
value: 29.854999999999997
- type: ndcg_at_3
value: 22.987
- type: ndcg_at_5
value: 25.106
- type: precision_at_1
value: 19.154
- type: precision_at_10
value: 5.224
- type: precision_at_100
value: 0.919
- type: precision_at_1000
value: 0.13
- type: precision_at_20
value: 3.215
- type: precision_at_3
value: 11.318
- type: precision_at_5
value: 8.383000000000001
- type: recall_at_1
value: 15.440000000000001
- type: recall_at_10
value: 38.734
- type: recall_at_100
value: 62.576
- type: recall_at_1000
value: 83.541
- type: recall_at_20
value: 46.45
- type: recall_at_3
value: 25.438
- type: recall_at_5
value: 30.891000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval (default)
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: main_score
value: 45.196999999999996
- type: map_at_1
value: 29.438
- type: map_at_10
value: 39.497
- type: map_at_100
value: 40.757
- type: map_at_1000
value: 40.865
- type: map_at_20
value: 40.21
- type: map_at_3
value: 36.649
- type: map_at_5
value: 38.278
- type: mrr_at_1
value: 35.514918190567855
- type: mrr_at_10
value: 44.939158531555066
- type: mrr_at_100
value: 45.71399223764184
- type: mrr_at_1000
value: 45.767047236444185
- type: mrr_at_20
value: 45.40064162616659
- type: mrr_at_3
value: 42.49278152069297
- type: mrr_at_5
value: 43.999037536092395
- type: nauc_map_at_1000_diff1
value: 48.2911083967695
- type: nauc_map_at_1000_max
value: 33.0567223033294
- type: nauc_map_at_1000_std
value: -7.5831018828087435
- type: nauc_map_at_100_diff1
value: 48.266195527072156
- type: nauc_map_at_100_max
value: 33.03915960499412
- type: nauc_map_at_100_std
value: -7.606925986310037
- type: nauc_map_at_10_diff1
value: 48.328320797346294
- type: nauc_map_at_10_max
value: 32.7070148720631
- type: nauc_map_at_10_std
value: -8.512811841258646
- type: nauc_map_at_1_diff1
value: 52.88608162356222
- type: nauc_map_at_1_max
value: 31.24794941358492
- type: nauc_map_at_1_std
value: -11.706848009285954
- type: nauc_map_at_20_diff1
value: 48.2969260156472
- type: nauc_map_at_20_max
value: 32.86081996380274
- type: nauc_map_at_20_std
value: -8.020958942798524
- type: nauc_map_at_3_diff1
value: 48.743817641945114
- type: nauc_map_at_3_max
value: 32.605458230621856
- type: nauc_map_at_3_std
value: -8.638274842287737
- type: nauc_map_at_5_diff1
value: 48.78806923732555
- type: nauc_map_at_5_max
value: 32.61566250570677
- type: nauc_map_at_5_std
value: -8.780064299161241
- type: nauc_mrr_at_1000_diff1
value: 48.402407250061934
- type: nauc_mrr_at_1000_max
value: 32.73963018253408
- type: nauc_mrr_at_1000_std
value: -7.600714897746363
- type: nauc_mrr_at_100_diff1
value: 48.38722402499983
- type: nauc_mrr_at_100_max
value: 32.74291939054888
- type: nauc_mrr_at_100_std
value: -7.584196436282831
- type: nauc_mrr_at_10_diff1
value: 48.324992370558576
- type: nauc_mrr_at_10_max
value: 32.65326566012142
- type: nauc_mrr_at_10_std
value: -7.960957871756174
- type: nauc_mrr_at_1_diff1
value: 52.51790849738347
- type: nauc_mrr_at_1_max
value: 31.979743734335504
- type: nauc_mrr_at_1_std
value: -11.101383949942232
- type: nauc_mrr_at_20_diff1
value: 48.375346158446725
- type: nauc_mrr_at_20_max
value: 32.73895555822591
- type: nauc_mrr_at_20_std
value: -7.642914670396977
- type: nauc_mrr_at_3_diff1
value: 48.83160990949774
- type: nauc_mrr_at_3_max
value: 32.80880922901924
- type: nauc_mrr_at_3_std
value: -7.760362168094019
- type: nauc_mrr_at_5_diff1
value: 48.60255139323125
- type: nauc_mrr_at_5_max
value: 32.72728351371156
- type: nauc_mrr_at_5_std
value: -8.038189749481258
- type: nauc_ndcg_at_1000_diff1
value: 46.67101320125475
- type: nauc_ndcg_at_1000_max
value: 34.0504701772667
- type: nauc_ndcg_at_1000_std
value: -4.032878112637376
- type: nauc_ndcg_at_100_diff1
value: 46.248748827447265
- type: nauc_ndcg_at_100_max
value: 33.74751928599088
- type: nauc_ndcg_at_100_std
value: -3.991862266355337
- type: nauc_ndcg_at_10_diff1
value: 46.46100196084458
- type: nauc_ndcg_at_10_max
value: 32.807685888284794
- type: nauc_ndcg_at_10_std
value: -7.457478747984192
- type: nauc_ndcg_at_1_diff1
value: 52.51790849738347
- type: nauc_ndcg_at_1_max
value: 31.979743734335504
- type: nauc_ndcg_at_1_std
value: -11.101383949942232
- type: nauc_ndcg_at_20_diff1
value: 46.410656199509944
- type: nauc_ndcg_at_20_max
value: 33.1581309808876
- type: nauc_ndcg_at_20_std
value: -5.99183846380811
- type: nauc_ndcg_at_3_diff1
value: 47.26764972559635
- type: nauc_ndcg_at_3_max
value: 33.08614197399897
- type: nauc_ndcg_at_3_std
value: -7.0742507391341345
- type: nauc_ndcg_at_5_diff1
value: 47.35898227835041
- type: nauc_ndcg_at_5_max
value: 32.84468179240444
- type: nauc_ndcg_at_5_std
value: -7.714927192881523
- type: nauc_precision_at_1000_diff1
value: -9.52692395683019
- type: nauc_precision_at_1000_max
value: 7.374303479576268
- type: nauc_precision_at_1000_std
value: 20.79761650113592
- type: nauc_precision_at_100_diff1
value: -0.5511806256392863
- type: nauc_precision_at_100_max
value: 14.260122126630634
- type: nauc_precision_at_100_std
value: 20.84530821188996
- type: nauc_precision_at_10_diff1
value: 19.572115874106533
- type: nauc_precision_at_10_max
value: 24.556082924046027
- type: nauc_precision_at_10_std
value: 5.323857400679805
- type: nauc_precision_at_1_diff1
value: 52.51790849738347
- type: nauc_precision_at_1_max
value: 31.979743734335504
- type: nauc_precision_at_1_std
value: -11.101383949942232
- type: nauc_precision_at_20_diff1
value: 12.356576945971826
- type: nauc_precision_at_20_max
value: 21.121689225096056
- type: nauc_precision_at_20_std
value: 12.177075559439556
- type: nauc_precision_at_3_diff1
value: 33.671667659871865
- type: nauc_precision_at_3_max
value: 30.98143183174062
- type: nauc_precision_at_3_std
value: 0.520604608152502
- type: nauc_precision_at_5_diff1
value: 30.06980809430162
- type: nauc_precision_at_5_max
value: 28.454115294663602
- type: nauc_precision_at_5_std
value: 0.8596400708828538
- type: nauc_recall_at_1000_diff1
value: 24.965587031650884
- type: nauc_recall_at_1000_max
value: 40.72840120992986
- type: nauc_recall_at_1000_std
value: 38.76857796467627
- type: nauc_recall_at_100_diff1
value: 32.790892696170374
- type: nauc_recall_at_100_max
value: 32.970070123139564
- type: nauc_recall_at_100_std
value: 14.657654854897062
- type: nauc_recall_at_10_diff1
value: 38.309181873423476
- type: nauc_recall_at_10_max
value: 30.28707855794435
- type: nauc_recall_at_10_std
value: -5.568997608502203
- type: nauc_recall_at_1_diff1
value: 52.88608162356222
- type: nauc_recall_at_1_max
value: 31.24794941358492
- type: nauc_recall_at_1_std
value: -11.706848009285954
- type: nauc_recall_at_20_diff1
value: 37.44816940285688
- type: nauc_recall_at_20_max
value: 31.24736990052554
- type: nauc_recall_at_20_std
value: -0.17027260910961897
- type: nauc_recall_at_3_diff1
value: 42.921582034772726
- type: nauc_recall_at_3_max
value: 31.861184780950513
- type: nauc_recall_at_3_std
value: -6.209754089638474
- type: nauc_recall_at_5_diff1
value: 41.74803396821156
- type: nauc_recall_at_5_max
value: 31.13023590637421
- type: nauc_recall_at_5_std
value: -6.608370086504567
- type: ndcg_at_1
value: 35.515
- type: ndcg_at_10
value: 45.196999999999996
- type: ndcg_at_100
value: 50.38399999999999
- type: ndcg_at_1000
value: 52.596
- type: ndcg_at_20
value: 47.233000000000004
- type: ndcg_at_3
value: 40.573
- type: ndcg_at_5
value: 42.853
- type: precision_at_1
value: 35.515
- type: precision_at_10
value: 8.017000000000001
- type: precision_at_100
value: 1.237
- type: precision_at_1000
value: 0.159
- type: precision_at_20
value: 4.687
- type: precision_at_3
value: 18.961
- type: precision_at_5
value: 13.34
- type: recall_at_1
value: 29.438
- type: recall_at_10
value: 56.603
- type: recall_at_100
value: 78.281
- type: recall_at_1000
value: 93.172
- type: recall_at_20
value: 63.571
- type: recall_at_3
value: 43.763000000000005
- type: recall_at_5
value: 49.717
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval (default)
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: main_score
value: 41.967999999999996
- type: map_at_1
value: 27.991
- type: map_at_10
value: 36.815
- type: map_at_100
value: 38.14
- type: map_at_1000
value: 38.257999999999996
- type: map_at_20
value: 37.561
- type: map_at_3
value: 34.094
- type: map_at_5
value: 35.557
- type: mrr_at_1
value: 34.817351598173516
- type: mrr_at_10
value: 42.56500507356672
- type: mrr_at_100
value: 43.460463999764066
- type: mrr_at_1000
value: 43.52348583643295
- type: mrr_at_20
value: 43.11992252647868
- type: mrr_at_3
value: 40.20167427701675
- type: mrr_at_5
value: 41.45738203957382
- type: nauc_map_at_1000_diff1
value: 41.67048775212967
- type: nauc_map_at_1000_max
value: 43.99159244124849
- type: nauc_map_at_1000_std
value: 2.573128018829387
- type: nauc_map_at_100_diff1
value: 41.674051168864544
- type: nauc_map_at_100_max
value: 43.98147916359051
- type: nauc_map_at_100_std
value: 2.5254111056725157
- type: nauc_map_at_10_diff1
value: 41.7125704403198
- type: nauc_map_at_10_max
value: 43.474100183989364
- type: nauc_map_at_10_std
value: 1.6477791314522445
- type: nauc_map_at_1_diff1
value: 48.1867206901292
- type: nauc_map_at_1_max
value: 40.525641468978996
- type: nauc_map_at_1_std
value: -0.7568533902855162
- type: nauc_map_at_20_diff1
value: 41.64339598055937
- type: nauc_map_at_20_max
value: 43.62356989148736
- type: nauc_map_at_20_std
value: 2.087731774178381
- type: nauc_map_at_3_diff1
value: 43.473195638597325
- type: nauc_map_at_3_max
value: 42.94377216167118
- type: nauc_map_at_3_std
value: 0.2505945238603998
- type: nauc_map_at_5_diff1
value: 42.39542158097317
- type: nauc_map_at_5_max
value: 43.67892698262521
- type: nauc_map_at_5_std
value: 0.9895905882223653
- type: nauc_mrr_at_1000_diff1
value: 41.09671003865924
- type: nauc_mrr_at_1000_max
value: 46.28436379929593
- type: nauc_mrr_at_1000_std
value: 4.354037919152363
- type: nauc_mrr_at_100_diff1
value: 41.09244756994191
- type: nauc_mrr_at_100_max
value: 46.29034043110901
- type: nauc_mrr_at_100_std
value: 4.351726070204726
- type: nauc_mrr_at_10_diff1
value: 40.977946444819096
- type: nauc_mrr_at_10_max
value: 46.10718374892125
- type: nauc_mrr_at_10_std
value: 4.18336707456262
- type: nauc_mrr_at_1_diff1
value: 45.599332453292675
- type: nauc_mrr_at_1_max
value: 45.84726261326186
- type: nauc_mrr_at_1_std
value: 2.4345971000548854
- type: nauc_mrr_at_20_diff1
value: 40.95961993815576
- type: nauc_mrr_at_20_max
value: 46.18592650660265
- type: nauc_mrr_at_20_std
value: 4.305161755438331
- type: nauc_mrr_at_3_diff1
value: 42.32692907673492
- type: nauc_mrr_at_3_max
value: 46.26011359406279
- type: nauc_mrr_at_3_std
value: 2.948567577936104
- type: nauc_mrr_at_5_diff1
value: 41.34052580040367
- type: nauc_mrr_at_5_max
value: 46.34383226431204
- type: nauc_mrr_at_5_std
value: 3.633823850306508
- type: nauc_ndcg_at_1000_diff1
value: 39.93215369321293
- type: nauc_ndcg_at_1000_max
value: 45.687802170808574
- type: nauc_ndcg_at_1000_std
value: 6.430986118631789
- type: nauc_ndcg_at_100_diff1
value: 39.684859990483915
- type: nauc_ndcg_at_100_max
value: 45.80031091479213
- type: nauc_ndcg_at_100_std
value: 6.36066573145881
- type: nauc_ndcg_at_10_diff1
value: 39.23880630958678
- type: nauc_ndcg_at_10_max
value: 43.80038181935968
- type: nauc_ndcg_at_10_std
value: 3.3533556819103074
- type: nauc_ndcg_at_1_diff1
value: 45.94736367846991
- type: nauc_ndcg_at_1_max
value: 46.105763729560294
- type: nauc_ndcg_at_1_std
value: 2.5515460950343622
- type: nauc_ndcg_at_20_diff1
value: 39.077143576829634
- type: nauc_ndcg_at_20_max
value: 44.175755846357006
- type: nauc_ndcg_at_20_std
value: 4.5499430823825
- type: nauc_ndcg_at_3_diff1
value: 41.55043893779763
- type: nauc_ndcg_at_3_max
value: 44.369396288268
- type: nauc_ndcg_at_3_std
value: 1.8135062317910333
- type: nauc_ndcg_at_5_diff1
value: 40.27727274546977
- type: nauc_ndcg_at_5_max
value: 44.58055714919917
- type: nauc_ndcg_at_5_std
value: 2.3858438655025895
- type: nauc_precision_at_1000_diff1
value: -15.82921590565681
- type: nauc_precision_at_1000_max
value: 5.3200324911551276
- type: nauc_precision_at_1000_std
value: 17.059441605068066
- type: nauc_precision_at_100_diff1
value: -3.477661270951154
- type: nauc_precision_at_100_max
value: 23.102213467508363
- type: nauc_precision_at_100_std
value: 22.61050030511951
- type: nauc_precision_at_10_diff1
value: 13.022774804120216
- type: nauc_precision_at_10_max
value: 38.41004452998074
- type: nauc_precision_at_10_std
value: 15.569153607416283
- type: nauc_precision_at_1_diff1
value: 45.94736367846991
- type: nauc_precision_at_1_max
value: 46.105763729560294
- type: nauc_precision_at_1_std
value: 2.5515460950343622
- type: nauc_precision_at_20_diff1
value: 6.552231339783917
- type: nauc_precision_at_20_max
value: 33.144348451578914
- type: nauc_precision_at_20_std
value: 19.55599724769983
- type: nauc_precision_at_3_diff1
value: 28.52937551899466
- type: nauc_precision_at_3_max
value: 45.2056127705799
- type: nauc_precision_at_3_std
value: 7.5353087497146785
- type: nauc_precision_at_5_diff1
value: 21.680390063172492
- type: nauc_precision_at_5_max
value: 44.075542142279645
- type: nauc_precision_at_5_std
value: 10.933211341141087
- type: nauc_recall_at_1000_diff1
value: 31.550619753305593
- type: nauc_recall_at_1000_max
value: 49.1096811911254
- type: nauc_recall_at_1000_std
value: 39.51532818925666
- type: nauc_recall_at_100_diff1
value: 30.696662503429863
- type: nauc_recall_at_100_max
value: 47.21608565384206
- type: nauc_recall_at_100_std
value: 20.894556840831438
- type: nauc_recall_at_10_diff1
value: 30.61623779072834
- type: nauc_recall_at_10_max
value: 38.964392138468114
- type: nauc_recall_at_10_std
value: 5.00024473264126
- type: nauc_recall_at_1_diff1
value: 48.1867206901292
- type: nauc_recall_at_1_max
value: 40.525641468978996
- type: nauc_recall_at_1_std
value: -0.7568533902855162
- type: nauc_recall_at_20_diff1
value: 29.07251333097125
- type: nauc_recall_at_20_max
value: 39.03312242614524
- type: nauc_recall_at_20_std
value: 8.959922224970903
- type: nauc_recall_at_3_diff1
value: 38.724975690747826
- type: nauc_recall_at_3_max
value: 41.3025635407677
- type: nauc_recall_at_3_std
value: 0.6484284398052167
- type: nauc_recall_at_5_diff1
value: 34.09423664395091
- type: nauc_recall_at_5_max
value: 41.34844327450573
- type: nauc_recall_at_5_std
value: 2.3349428535301424
- type: ndcg_at_1
value: 34.703
- type: ndcg_at_10
value: 41.967999999999996
- type: ndcg_at_100
value: 47.607
- type: ndcg_at_1000
value: 49.984
- type: ndcg_at_20
value: 44.285000000000004
- type: ndcg_at_3
value: 37.582
- type: ndcg_at_5
value: 39.454
- type: precision_at_1
value: 34.703
- type: precision_at_10
value: 7.306
- type: precision_at_100
value: 1.191
- type: precision_at_1000
value: 0.156
- type: precision_at_20
value: 4.406000000000001
- type: precision_at_3
value: 17.541999999999998
- type: precision_at_5
value: 12.26
- type: recall_at_1
value: 27.991
- type: recall_at_10
value: 52.016
- type: recall_at_100
value: 75.807
- type: recall_at_1000
value: 91.84400000000001
- type: recall_at_20
value: 60.171
- type: recall_at_3
value: 39.268
- type: recall_at_5
value: 44.548
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval (default)
type: CQADupstackRetrieval_is_a_combined_dataset
config: default
split: test
revision: CQADupstackRetrieval_is_a_combined_dataset
metrics:
- type: main_score
value: 39.80483333333333
- type: ndcg_at_10
value: 39.80483333333333
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval (default)
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: main_score
value: 34.888999999999996
- type: map_at_1
value: 24.257
- type: map_at_10
value: 30.85
- type: map_at_100
value: 31.653
- type: map_at_1000
value: 31.744
- type: map_at_20
value: 31.235000000000003
- type: map_at_3
value: 28.742
- type: map_at_5
value: 29.743000000000002
- type: mrr_at_1
value: 26.68711656441718
- type: mrr_at_10
value: 33.22828415619827
- type: mrr_at_100
value: 33.9510074708967
- type: mrr_at_1000
value: 34.019092955305204
- type: mrr_at_20
value: 33.600871234124
- type: mrr_at_3
value: 31.160531697341508
- type: mrr_at_5
value: 32.14212678936605
- type: nauc_map_at_1000_diff1
value: 52.717440487225275
- type: nauc_map_at_1000_max
value: 44.60170963845081
- type: nauc_map_at_1000_std
value: -3.1996706483359136
- type: nauc_map_at_100_diff1
value: 52.71189673586013
- type: nauc_map_at_100_max
value: 44.57163638567482
- type: nauc_map_at_100_std
value: -3.2345902627286436
- type: nauc_map_at_10_diff1
value: 53.02449930693637
- type: nauc_map_at_10_max
value: 44.35369795372346
- type: nauc_map_at_10_std
value: -3.8104783477282513
- type: nauc_map_at_1_diff1
value: 61.69412555489549
- type: nauc_map_at_1_max
value: 45.687572761686425
- type: nauc_map_at_1_std
value: -5.706950124921224
- type: nauc_map_at_20_diff1
value: 52.762382597962855
- type: nauc_map_at_20_max
value: 44.42527816578249
- type: nauc_map_at_20_std
value: -3.62442115557958
- type: nauc_map_at_3_diff1
value: 54.218133325934595
- type: nauc_map_at_3_max
value: 43.886110491155
- type: nauc_map_at_3_std
value: -5.373779809729606
- type: nauc_map_at_5_diff1
value: 53.87314356227072
- type: nauc_map_at_5_max
value: 44.19838867906011
- type: nauc_map_at_5_std
value: -4.657996273921579
- type: nauc_mrr_at_1000_diff1
value: 52.608759486406065
- type: nauc_mrr_at_1000_max
value: 46.43225035608919
- type: nauc_mrr_at_1000_std
value: -1.0825740469149292
- type: nauc_mrr_at_100_diff1
value: 52.59290039623913
- type: nauc_mrr_at_100_max
value: 46.43031739568791
- type: nauc_mrr_at_100_std
value: -1.110101172332684
- type: nauc_mrr_at_10_diff1
value: 52.860476269889055
- type: nauc_mrr_at_10_max
value: 46.48418329087753
- type: nauc_mrr_at_10_std
value: -1.3374238019386193
- type: nauc_mrr_at_1_diff1
value: 61.441947428807666
- type: nauc_mrr_at_1_max
value: 48.54756533074311
- type: nauc_mrr_at_1_std
value: -2.3680485432053135
- type: nauc_mrr_at_20_diff1
value: 52.665535367800906
- type: nauc_mrr_at_20_max
value: 46.41185879304558
- type: nauc_mrr_at_20_std
value: -1.3444595758714797
- type: nauc_mrr_at_3_diff1
value: 54.172851649909134
- type: nauc_mrr_at_3_max
value: 46.15833772250591
- type: nauc_mrr_at_3_std
value: -2.6730529379570642
- type: nauc_mrr_at_5_diff1
value: 53.723702014945175
- type: nauc_mrr_at_5_max
value: 46.297316686693016
- type: nauc_mrr_at_5_std
value: -2.159788610857334
- type: nauc_ndcg_at_1000_diff1
value: 48.49475884804671
- type: nauc_ndcg_at_1000_max
value: 45.2504813678727
- type: nauc_ndcg_at_1000_std
value: 1.3660441371017331
- type: nauc_ndcg_at_100_diff1
value: 48.328439839293004
- type: nauc_ndcg_at_100_max
value: 45.1976848279064
- type: nauc_ndcg_at_100_std
value: 0.984414559030773
- type: nauc_ndcg_at_10_diff1
value: 49.57495706841805
- type: nauc_ndcg_at_10_max
value: 44.32422841398523
- type: nauc_ndcg_at_10_std
value: -1.8938863954712948
- type: nauc_ndcg_at_1_diff1
value: 61.441947428807666
- type: nauc_ndcg_at_1_max
value: 48.54756533074311
- type: nauc_ndcg_at_1_std
value: -2.3680485432053135
- type: nauc_ndcg_at_20_diff1
value: 48.698704369155664
- type: nauc_ndcg_at_20_max
value: 44.32085785234671
- type: nauc_ndcg_at_20_std
value: -1.5370200957389617
- type: nauc_ndcg_at_3_diff1
value: 51.87602761155865
- type: nauc_ndcg_at_3_max
value: 43.836423952288946
- type: nauc_ndcg_at_3_std
value: -4.519331726990856
- type: nauc_ndcg_at_5_diff1
value: 51.536849644847216
- type: nauc_ndcg_at_5_max
value: 44.05267508410536
- type: nauc_ndcg_at_5_std
value: -3.7646800644981484
- type: nauc_precision_at_1000_diff1
value: -3.114425136121477
- type: nauc_precision_at_1000_max
value: 21.219654091584214
- type: nauc_precision_at_1000_std
value: 23.620715661080197
- type: nauc_precision_at_100_diff1
value: 13.781387623485253
- type: nauc_precision_at_100_max
value: 37.7816424452238
- type: nauc_precision_at_100_std
value: 24.719409110027726
- type: nauc_precision_at_10_diff1
value: 29.300018648484276
- type: nauc_precision_at_10_max
value: 42.111386830242296
- type: nauc_precision_at_10_std
value: 10.14768426081145
- type: nauc_precision_at_1_diff1
value: 61.441947428807666
- type: nauc_precision_at_1_max
value: 48.54756533074311
- type: nauc_precision_at_1_std
value: -2.3680485432053135
- type: nauc_precision_at_20_diff1
value: 24.056049155242437
- type: nauc_precision_at_20_max
value: 41.1201344685915
- type: nauc_precision_at_20_std
value: 12.97512554259156
- type: nauc_precision_at_3_diff1
value: 40.917570494530224
- type: nauc_precision_at_3_max
value: 42.15043236961856
- type: nauc_precision_at_3_std
value: -0.589880165120388
- type: nauc_precision_at_5_diff1
value: 36.58196834265981
- type: nauc_precision_at_5_max
value: 41.630431483145955
- type: nauc_precision_at_5_std
value: 2.792434474028848
- type: nauc_recall_at_1000_diff1
value: 22.038599119727685
- type: nauc_recall_at_1000_max
value: 40.92494951502034
- type: nauc_recall_at_1000_std
value: 30.098168212129906
- type: nauc_recall_at_100_diff1
value: 30.27278930698841
- type: nauc_recall_at_100_max
value: 43.08655404016066
- type: nauc_recall_at_100_std
value: 16.415020332792015
- type: nauc_recall_at_10_diff1
value: 38.75370707674917
- type: nauc_recall_at_10_max
value: 40.98674256815627
- type: nauc_recall_at_10_std
value: 1.4170954879979862
- type: nauc_recall_at_1_diff1
value: 61.69412555489549
- type: nauc_recall_at_1_max
value: 45.687572761686425
- type: nauc_recall_at_1_std
value: -5.706950124921224
- type: nauc_recall_at_20_diff1
value: 34.95998605858319
- type: nauc_recall_at_20_max
value: 40.10527957275843
- type: nauc_recall_at_20_std
value: 2.1856254846998895
- type: nauc_recall_at_3_diff1
value: 46.10618270844218
- type: nauc_recall_at_3_max
value: 39.94724438255762
- type: nauc_recall_at_3_std
value: -6.261263180948628
- type: nauc_recall_at_5_diff1
value: 45.37034670682598
- type: nauc_recall_at_5_max
value: 40.996211974958655
- type: nauc_recall_at_5_std
value: -3.8795589504838945
- type: ndcg_at_1
value: 26.687
- type: ndcg_at_10
value: 34.888999999999996
- type: ndcg_at_100
value: 38.967
- type: ndcg_at_1000
value: 41.408
- type: ndcg_at_20
value: 36.202
- type: ndcg_at_3
value: 30.763
- type: ndcg_at_5
value: 32.369
- type: precision_at_1
value: 26.687
- type: precision_at_10
value: 5.428999999999999
- type: precision_at_100
value: 0.8099999999999999
- type: precision_at_1000
value: 0.11
- type: precision_at_20
value: 3.0669999999999997
- type: precision_at_3
value: 12.883
- type: precision_at_5
value: 8.895999999999999
- type: recall_at_1
value: 24.257
- type: recall_at_10
value: 45.013999999999996
- type: recall_at_100
value: 63.55800000000001
- type: recall_at_1000
value: 81.649
- type: recall_at_20
value: 49.786
- type: recall_at_3
value: 33.623
- type: recall_at_5
value: 37.489
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval (default)
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: main_score
value: 27.174
- type: map_at_1
value: 16.683
- type: map_at_10
value: 22.965
- type: map_at_100
value: 23.954
- type: map_at_1000
value: 24.078
- type: map_at_20
value: 23.49
- type: map_at_3
value: 20.918999999999997
- type: map_at_5
value: 22.027
- type: mrr_at_1
value: 19.92429456297316
- type: mrr_at_10
value: 26.551319656102862
- type: mrr_at_100
value: 27.428968210944316
- type: mrr_at_1000
value: 27.510501144435317
- type: mrr_at_20
value: 27.051813881383698
- type: mrr_at_3
value: 24.483826565726083
- type: mrr_at_5
value: 25.624569855471435
- type: nauc_map_at_1000_diff1
value: 39.70294552750383
- type: nauc_map_at_1000_max
value: 31.317466455201227
- type: nauc_map_at_1000_std
value: -1.762559086629105
- type: nauc_map_at_100_diff1
value: 39.71390899838813
- type: nauc_map_at_100_max
value: 31.29204970199068
- type: nauc_map_at_100_std
value: -1.791535537876596
- type: nauc_map_at_10_diff1
value: 40.01482969019678
- type: nauc_map_at_10_max
value: 31.23314156393745
- type: nauc_map_at_10_std
value: -2.3274535397042513
- type: nauc_map_at_1_diff1
value: 46.72895932959986
- type: nauc_map_at_1_max
value: 29.819875651168548
- type: nauc_map_at_1_std
value: -3.6639434506444912
- type: nauc_map_at_20_diff1
value: 39.79895580803141
- type: nauc_map_at_20_max
value: 31.18209733793537
- type: nauc_map_at_20_std
value: -2.052399285243834
- type: nauc_map_at_3_diff1
value: 41.98314483627424
- type: nauc_map_at_3_max
value: 31.410399587944422
- type: nauc_map_at_3_std
value: -3.1256987241100957
- type: nauc_map_at_5_diff1
value: 40.68955549018378
- type: nauc_map_at_5_max
value: 31.529138053527888
- type: nauc_map_at_5_std
value: -2.5106031609548727
- type: nauc_mrr_at_1000_diff1
value: 38.843425454050774
- type: nauc_mrr_at_1000_max
value: 32.080747972542476
- type: nauc_mrr_at_1000_std
value: -1.8813140227198037
- type: nauc_mrr_at_100_diff1
value: 38.844774433232246
- type: nauc_mrr_at_100_max
value: 32.07767547525176
- type: nauc_mrr_at_100_std
value: -1.8853968240347412
- type: nauc_mrr_at_10_diff1
value: 38.9943638829038
- type: nauc_mrr_at_10_max
value: 32.113199636613224
- type: nauc_mrr_at_10_std
value: -2.2808765253620997
- type: nauc_mrr_at_1_diff1
value: 45.204551111582504
- type: nauc_mrr_at_1_max
value: 31.33271495263982
- type: nauc_mrr_at_1_std
value: -4.310808417520686
- type: nauc_mrr_at_20_diff1
value: 38.809653957002475
- type: nauc_mrr_at_20_max
value: 32.00087958077687
- type: nauc_mrr_at_20_std
value: -2.077240815930647
- type: nauc_mrr_at_3_diff1
value: 40.640559615359884
- type: nauc_mrr_at_3_max
value: 32.499874311042085
- type: nauc_mrr_at_3_std
value: -3.0250204118059623
- type: nauc_mrr_at_5_diff1
value: 39.730384199123904
- type: nauc_mrr_at_5_max
value: 32.54797498951286
- type: nauc_mrr_at_5_std
value: -2.483752446190051
- type: nauc_ndcg_at_1000_diff1
value: 35.67309434839137
- type: nauc_ndcg_at_1000_max
value: 31.968665383689366
- type: nauc_ndcg_at_1000_std
value: 1.8902841143765996
- type: nauc_ndcg_at_100_diff1
value: 35.532320541105456
- type: nauc_ndcg_at_100_max
value: 31.39262363611392
- type: nauc_ndcg_at_100_std
value: 1.3738974219360591
- type: nauc_ndcg_at_10_diff1
value: 36.89304493982828
- type: nauc_ndcg_at_10_max
value: 31.413699188823262
- type: nauc_ndcg_at_10_std
value: -1.4406496834360265
- type: nauc_ndcg_at_1_diff1
value: 45.204551111582504
- type: nauc_ndcg_at_1_max
value: 31.33271495263982
- type: nauc_ndcg_at_1_std
value: -4.310808417520686
- type: nauc_ndcg_at_20_diff1
value: 36.10603668893203
- type: nauc_ndcg_at_20_max
value: 31.08596071268814
- type: nauc_ndcg_at_20_std
value: -0.5716127582631676
- type: nauc_ndcg_at_3_diff1
value: 40.3406275054372
- type: nauc_ndcg_at_3_max
value: 32.30746163378498
- type: nauc_ndcg_at_3_std
value: -2.9826906381184086
- type: nauc_ndcg_at_5_diff1
value: 38.435436080533805
- type: nauc_ndcg_at_5_max
value: 32.28159769507487
- type: nauc_ndcg_at_5_std
value: -1.896502637808091
- type: nauc_precision_at_1000_diff1
value: -1.3272380913114576
- type: nauc_precision_at_1000_max
value: 16.97452439042005
- type: nauc_precision_at_1000_std
value: 6.727514561355023
- type: nauc_precision_at_100_diff1
value: 9.050886288633748
- type: nauc_precision_at_100_max
value: 22.793531578995857
- type: nauc_precision_at_100_std
value: 9.041251836945914
- type: nauc_precision_at_10_diff1
value: 23.58024783123664
- type: nauc_precision_at_10_max
value: 30.911229044947746
- type: nauc_precision_at_10_std
value: 0.49206924465533297
- type: nauc_precision_at_1_diff1
value: 45.204551111582504
- type: nauc_precision_at_1_max
value: 31.33271495263982
- type: nauc_precision_at_1_std
value: -4.310808417520686
- type: nauc_precision_at_20_diff1
value: 18.72722750869453
- type: nauc_precision_at_20_max
value: 28.168309388621456
- type: nauc_precision_at_20_std
value: 3.5580796098534906
- type: nauc_precision_at_3_diff1
value: 34.21934456307853
- type: nauc_precision_at_3_max
value: 34.50963041596628
- type: nauc_precision_at_3_std
value: -2.1474684485851876
- type: nauc_precision_at_5_diff1
value: 29.967346999613596
- type: nauc_precision_at_5_max
value: 33.958476515854954
- type: nauc_precision_at_5_std
value: -0.45778793347456004
- type: nauc_recall_at_1000_diff1
value: 12.06453658572338
- type: nauc_recall_at_1000_max
value: 30.788667195142633
- type: nauc_recall_at_1000_std
value: 27.271269189751713
- type: nauc_recall_at_100_diff1
value: 19.6231994553196
- type: nauc_recall_at_100_max
value: 27.00238503628109
- type: nauc_recall_at_100_std
value: 13.294514312384601
- type: nauc_recall_at_10_diff1
value: 27.755272572613222
- type: nauc_recall_at_10_max
value: 28.332855891388125
- type: nauc_recall_at_10_std
value: 0.8241434995618968
- type: nauc_recall_at_1_diff1
value: 46.72895932959986
- type: nauc_recall_at_1_max
value: 29.819875651168548
- type: nauc_recall_at_1_std
value: -3.6639434506444912
- type: nauc_recall_at_20_diff1
value: 24.731671276025146
- type: nauc_recall_at_20_max
value: 26.949426211227795
- type: nauc_recall_at_20_std
value: 3.412457763382852
- type: nauc_recall_at_3_diff1
value: 36.38111388907899
- type: nauc_recall_at_3_max
value: 31.47754397495634
- type: nauc_recall_at_3_std
value: -2.1874715383733956
- type: nauc_recall_at_5_diff1
value: 31.68529930399809
- type: nauc_recall_at_5_max
value: 31.090941464639744
- type: nauc_recall_at_5_std
value: -0.1674878655815559
- type: ndcg_at_1
value: 19.924
- type: ndcg_at_10
value: 27.174
- type: ndcg_at_100
value: 32.065
- type: ndcg_at_1000
value: 35.106
- type: ndcg_at_20
value: 28.939999999999998
- type: ndcg_at_3
value: 23.372999999999998
- type: ndcg_at_5
value: 25.096
- type: precision_at_1
value: 19.924
- type: precision_at_10
value: 4.855
- type: precision_at_100
value: 0.857
- type: precision_at_1000
value: 0.129
- type: precision_at_20
value: 2.94
- type: precision_at_3
value: 10.897
- type: precision_at_5
value: 7.7909999999999995
- type: recall_at_1
value: 16.683
- type: recall_at_10
value: 36.276
- type: recall_at_100
value: 58.437
- type: recall_at_1000
value: 80.35900000000001
- type: recall_at_20
value: 42.79
- type: recall_at_3
value: 25.663999999999998
- type: recall_at_5
value: 30.213
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval (default)
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: main_score
value: 38.34
- type: map_at_1
value: 25.924999999999997
- type: map_at_10
value: 33.53
- type: map_at_100
value: 34.635
- type: map_at_1000
value: 34.739
- type: map_at_20
value: 34.117999999999995
- type: map_at_3
value: 30.94
- type: map_at_5
value: 32.411
- type: mrr_at_1
value: 30.223880597014922
- type: mrr_at_10
value: 37.598873193556024
- type: mrr_at_100
value: 38.48001202116003
- type: mrr_at_1000
value: 38.53998687212744
- type: mrr_at_20
value: 38.0922428291824
- type: mrr_at_3
value: 35.26119402985074
- type: mrr_at_5
value: 36.627798507462686
- type: nauc_map_at_1000_diff1
value: 48.99658121611321
- type: nauc_map_at_1000_max
value: 43.36514689969973
- type: nauc_map_at_1000_std
value: 1.2743138438292323
- type: nauc_map_at_100_diff1
value: 49.00383839256485
- type: nauc_map_at_100_max
value: 43.34421843813268
- type: nauc_map_at_100_std
value: 1.2381577394429648
- type: nauc_map_at_10_diff1
value: 48.976968357570804
- type: nauc_map_at_10_max
value: 43.21656545934543
- type: nauc_map_at_10_std
value: 0.8806229946576106
- type: nauc_map_at_1_diff1
value: 54.79429701172901
- type: nauc_map_at_1_max
value: 44.94497297225627
- type: nauc_map_at_1_std
value: 0.3424876477921997
- type: nauc_map_at_20_diff1
value: 49.05500453067965
- type: nauc_map_at_20_max
value: 43.313867184227114
- type: nauc_map_at_20_std
value: 1.0599077751868857
- type: nauc_map_at_3_diff1
value: 50.202191345168735
- type: nauc_map_at_3_max
value: 43.16428713411531
- type: nauc_map_at_3_std
value: 0.33035782399351366
- type: nauc_map_at_5_diff1
value: 49.43896179760421
- type: nauc_map_at_5_max
value: 43.36309937252455
- type: nauc_map_at_5_std
value: 0.6152011411226946
- type: nauc_mrr_at_1000_diff1
value: 48.359023685110486
- type: nauc_mrr_at_1000_max
value: 42.5315010808791
- type: nauc_mrr_at_1000_std
value: 0.5920431228924952
- type: nauc_mrr_at_100_diff1
value: 48.33949213883611
- type: nauc_mrr_at_100_max
value: 42.501697399914725
- type: nauc_mrr_at_100_std
value: 0.5683233598385363
- type: nauc_mrr_at_10_diff1
value: 48.17405374349975
- type: nauc_mrr_at_10_max
value: 42.36829702421452
- type: nauc_mrr_at_10_std
value: 0.3918636512799242
- type: nauc_mrr_at_1_diff1
value: 54.41613067936997
- type: nauc_mrr_at_1_max
value: 44.91551488557509
- type: nauc_mrr_at_1_std
value: -0.7697411188700982
- type: nauc_mrr_at_20_diff1
value: 48.29085774083497
- type: nauc_mrr_at_20_max
value: 42.46692350994534
- type: nauc_mrr_at_20_std
value: 0.49667689004854476
- type: nauc_mrr_at_3_diff1
value: 49.32403876113614
- type: nauc_mrr_at_3_max
value: 42.420974899262816
- type: nauc_mrr_at_3_std
value: -0.17054785857862576
- type: nauc_mrr_at_5_diff1
value: 48.5386866012484
- type: nauc_mrr_at_5_max
value: 42.49752447209939
- type: nauc_mrr_at_5_std
value: -0.030068724695007015
- type: nauc_ndcg_at_1000_diff1
value: 46.482903430093685
- type: nauc_ndcg_at_1000_max
value: 43.18727440958746
- type: nauc_ndcg_at_1000_std
value: 3.8397045352936874
- type: nauc_ndcg_at_100_diff1
value: 46.272241119098105
- type: nauc_ndcg_at_100_max
value: 42.44044067518221
- type: nauc_ndcg_at_100_std
value: 3.0744093549329374
- type: nauc_ndcg_at_10_diff1
value: 46.35820553525149
- type: nauc_ndcg_at_10_max
value: 42.05754989284268
- type: nauc_ndcg_at_10_std
value: 1.6140781134179982
- type: nauc_ndcg_at_1_diff1
value: 54.41613067936997
- type: nauc_ndcg_at_1_max
value: 44.91551488557509
- type: nauc_ndcg_at_1_std
value: -0.7697411188700982
- type: nauc_ndcg_at_20_diff1
value: 46.56173859192192
- type: nauc_ndcg_at_20_max
value: 42.39990803441754
- type: nauc_ndcg_at_20_std
value: 2.2301958940613518
- type: nauc_ndcg_at_3_diff1
value: 48.45451921294981
- type: nauc_ndcg_at_3_max
value: 42.1519683087422
- type: nauc_ndcg_at_3_std
value: 0.43355376702150983
- type: nauc_ndcg_at_5_diff1
value: 47.329516258529
- type: nauc_ndcg_at_5_max
value: 42.39325493165628
- type: nauc_ndcg_at_5_std
value: 0.8719863795035224
- type: nauc_precision_at_1000_diff1
value: -10.427395700183098
- type: nauc_precision_at_1000_max
value: 1.3695831886594074
- type: nauc_precision_at_1000_std
value: 5.396211335976429
- type: nauc_precision_at_100_diff1
value: 4.170216285720574
- type: nauc_precision_at_100_max
value: 14.393676436386233
- type: nauc_precision_at_100_std
value: 7.356250144868687
- type: nauc_precision_at_10_diff1
value: 25.406793843503
- type: nauc_precision_at_10_max
value: 30.469137431378485
- type: nauc_precision_at_10_std
value: 4.262031333274362
- type: nauc_precision_at_1_diff1
value: 54.41613067936997
- type: nauc_precision_at_1_max
value: 44.91551488557509
- type: nauc_precision_at_1_std
value: -0.7697411188700982
- type: nauc_precision_at_20_diff1
value: 20.989784339763254
- type: nauc_precision_at_20_max
value: 27.616892902118735
- type: nauc_precision_at_20_std
value: 5.021785061675381
- type: nauc_precision_at_3_diff1
value: 39.66665542900266
- type: nauc_precision_at_3_max
value: 37.76686222170862
- type: nauc_precision_at_3_std
value: 1.04925540752191
- type: nauc_precision_at_5_diff1
value: 32.88141076318413
- type: nauc_precision_at_5_max
value: 35.90401974619475
- type: nauc_precision_at_5_std
value: 2.2695242286100408
- type: nauc_recall_at_1000_diff1
value: 30.248973513875526
- type: nauc_recall_at_1000_max
value: 48.439331789791325
- type: nauc_recall_at_1000_std
value: 38.857189673518135
- type: nauc_recall_at_100_diff1
value: 33.090255913758874
- type: nauc_recall_at_100_max
value: 35.45818452208663
- type: nauc_recall_at_100_std
value: 12.58439358264515
- type: nauc_recall_at_10_diff1
value: 37.462082402733785
- type: nauc_recall_at_10_max
value: 36.99065942533105
- type: nauc_recall_at_10_std
value: 3.948587023033947
- type: nauc_recall_at_1_diff1
value: 54.79429701172901
- type: nauc_recall_at_1_max
value: 44.94497297225627
- type: nauc_recall_at_1_std
value: 0.3424876477921997
- type: nauc_recall_at_20_diff1
value: 37.34159405112872
- type: nauc_recall_at_20_max
value: 37.50873448555206
- type: nauc_recall_at_20_std
value: 6.669489660177887
- type: nauc_recall_at_3_diff1
value: 43.751405924588184
- type: nauc_recall_at_3_max
value: 38.5280847003097
- type: nauc_recall_at_3_std
value: 0.8234291612745726
- type: nauc_recall_at_5_diff1
value: 40.75537181461394
- type: nauc_recall_at_5_max
value: 38.64761171801593
- type: nauc_recall_at_5_std
value: 1.9783778065563666
- type: ndcg_at_1
value: 30.224
- type: ndcg_at_10
value: 38.34
- type: ndcg_at_100
value: 43.564
- type: ndcg_at_1000
value: 45.888
- type: ndcg_at_20
value: 40.285
- type: ndcg_at_3
value: 33.613
- type: ndcg_at_5
value: 35.868
- type: precision_at_1
value: 30.224
- type: precision_at_10
value: 6.343
- type: precision_at_100
value: 1.0030000000000001
- type: precision_at_1000
value: 0.131
- type: precision_at_20
value: 3.689
- type: precision_at_3
value: 14.832
- type: precision_at_5
value: 10.504
- type: recall_at_1
value: 25.924999999999997
- type: recall_at_10
value: 49.01
- type: recall_at_100
value: 71.935
- type: recall_at_1000
value: 88.191
- type: recall_at_20
value: 56.076
- type: recall_at_3
value: 36.344
- type: recall_at_5
value: 41.942
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval (default)
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: main_score
value: 39.007
- type: map_at_1
value: 25.195
- type: map_at_10
value: 33.29
- type: map_at_100
value: 34.919
- type: map_at_1000
value: 35.132999999999996
- type: map_at_20
value: 34.184
- type: map_at_3
value: 30.501
- type: map_at_5
value: 31.917
- type: mrr_at_1
value: 30.237154150197625
- type: mrr_at_10
value: 37.97901373988331
- type: mrr_at_100
value: 38.89357624578056
- type: mrr_at_1000
value: 38.96172508462875
- type: mrr_at_20
value: 38.489908488593
- type: mrr_at_3
value: 35.44137022397892
- type: mrr_at_5
value: 36.755599472990774
- type: nauc_map_at_1000_diff1
value: 54.52234288345771
- type: nauc_map_at_1000_max
value: 37.02933259777875
- type: nauc_map_at_1000_std
value: -1.8802414735497839
- type: nauc_map_at_100_diff1
value: 54.592085424308564
- type: nauc_map_at_100_max
value: 37.13861558972853
- type: nauc_map_at_100_std
value: -1.8864900602925623
- type: nauc_map_at_10_diff1
value: 55.32701084932018
- type: nauc_map_at_10_max
value: 36.97158176818064
- type: nauc_map_at_10_std
value: -3.364570079568588
- type: nauc_map_at_1_diff1
value: 62.56234442022803
- type: nauc_map_at_1_max
value: 37.725553737446866
- type: nauc_map_at_1_std
value: -5.9573495367577705
- type: nauc_map_at_20_diff1
value: 54.92567471295049
- type: nauc_map_at_20_max
value: 36.980006282091985
- type: nauc_map_at_20_std
value: -2.7416738048891243
- type: nauc_map_at_3_diff1
value: 57.6202035201006
- type: nauc_map_at_3_max
value: 36.85083307496426
- type: nauc_map_at_3_std
value: -4.929088209082444
- type: nauc_map_at_5_diff1
value: 56.43034014992742
- type: nauc_map_at_5_max
value: 36.65006798835753
- type: nauc_map_at_5_std
value: -4.776147213332607
- type: nauc_mrr_at_1000_diff1
value: 51.91684536214369
- type: nauc_mrr_at_1000_max
value: 35.50047477073224
- type: nauc_mrr_at_1000_std
value: -0.9638166168094422
- type: nauc_mrr_at_100_diff1
value: 51.89735751581897
- type: nauc_mrr_at_100_max
value: 35.48371938892366
- type: nauc_mrr_at_100_std
value: -0.9444977007097576
- type: nauc_mrr_at_10_diff1
value: 51.82990105533963
- type: nauc_mrr_at_10_max
value: 35.41678096580625
- type: nauc_mrr_at_10_std
value: -1.2998439543197369
- type: nauc_mrr_at_1_diff1
value: 57.36601705972182
- type: nauc_mrr_at_1_max
value: 36.90602990003092
- type: nauc_mrr_at_1_std
value: -3.4080880251307044
- type: nauc_mrr_at_20_diff1
value: 51.8613947241447
- type: nauc_mrr_at_20_max
value: 35.42345819928662
- type: nauc_mrr_at_20_std
value: -1.093870308993923
- type: nauc_mrr_at_3_diff1
value: 53.01993009463089
- type: nauc_mrr_at_3_max
value: 35.822666497908806
- type: nauc_mrr_at_3_std
value: -2.1165600076512474
- type: nauc_mrr_at_5_diff1
value: 52.34611304656942
- type: nauc_mrr_at_5_max
value: 35.49696929205688
- type: nauc_mrr_at_5_std
value: -2.0955274926266982
- type: nauc_ndcg_at_1000_diff1
value: 51.41120348218975
- type: nauc_ndcg_at_1000_max
value: 36.685342768279675
- type: nauc_ndcg_at_1000_std
value: 1.7205313748343651
- type: nauc_ndcg_at_100_diff1
value: 50.93701708514895
- type: nauc_ndcg_at_100_max
value: 36.162627377243275
- type: nauc_ndcg_at_100_std
value: 1.7640807675244328
- type: nauc_ndcg_at_10_diff1
value: 50.63098923593871
- type: nauc_ndcg_at_10_max
value: 35.34361464083639
- type: nauc_ndcg_at_10_std
value: -0.9402862458857915
- type: nauc_ndcg_at_1_diff1
value: 57.36601705972182
- type: nauc_ndcg_at_1_max
value: 36.90602990003092
- type: nauc_ndcg_at_1_std
value: -3.4080880251307044
- type: nauc_ndcg_at_20_diff1
value: 50.73961693837964
- type: nauc_ndcg_at_20_max
value: 35.01998564289338
- type: nauc_ndcg_at_20_std
value: -0.5241446967120867
- type: nauc_ndcg_at_3_diff1
value: 53.23302956511971
- type: nauc_ndcg_at_3_max
value: 35.708980757056295
- type: nauc_ndcg_at_3_std
value: -3.017125347557592
- type: nauc_ndcg_at_5_diff1
value: 52.335636773583396
- type: nauc_ndcg_at_5_max
value: 35.34227057005852
- type: nauc_ndcg_at_5_std
value: -2.9708664518544508
- type: nauc_precision_at_1000_diff1
value: -18.554677236277232
- type: nauc_precision_at_1000_max
value: -15.659740900843067
- type: nauc_precision_at_1000_std
value: 8.228155770924415
- type: nauc_precision_at_100_diff1
value: -12.195998995692928
- type: nauc_precision_at_100_max
value: -0.5888781565639164
- type: nauc_precision_at_100_std
value: 19.312752223375448
- type: nauc_precision_at_10_diff1
value: 12.921470127228105
- type: nauc_precision_at_10_max
value: 21.317929458256238
- type: nauc_precision_at_10_std
value: 13.148202187911012
- type: nauc_precision_at_1_diff1
value: 57.36601705972182
- type: nauc_precision_at_1_max
value: 36.90602990003092
- type: nauc_precision_at_1_std
value: -3.4080880251307044
- type: nauc_precision_at_20_diff1
value: 2.4696353004069906
- type: nauc_precision_at_20_max
value: 14.284343093524058
- type: nauc_precision_at_20_std
value: 17.480976091077217
- type: nauc_precision_at_3_diff1
value: 35.82856720298558
- type: nauc_precision_at_3_max
value: 29.613454822718143
- type: nauc_precision_at_3_std
value: 0.38030095211645343
- type: nauc_precision_at_5_diff1
value: 27.632641276435354
- type: nauc_precision_at_5_max
value: 27.238425775328967
- type: nauc_precision_at_5_std
value: 3.152744091929671
- type: nauc_recall_at_1000_diff1
value: 33.28570370310322
- type: nauc_recall_at_1000_max
value: 44.315453433115785
- type: nauc_recall_at_1000_std
value: 43.371884128363
- type: nauc_recall_at_100_diff1
value: 35.77059425104567
- type: nauc_recall_at_100_max
value: 31.48054575812204
- type: nauc_recall_at_100_std
value: 17.639416832754303
- type: nauc_recall_at_10_diff1
value: 40.179789202687914
- type: nauc_recall_at_10_max
value: 30.466946546206923
- type: nauc_recall_at_10_std
value: 0.8385433327977754
- type: nauc_recall_at_1_diff1
value: 62.56234442022803
- type: nauc_recall_at_1_max
value: 37.725553737446866
- type: nauc_recall_at_1_std
value: -5.9573495367577705
- type: nauc_recall_at_20_diff1
value: 38.70371818511684
- type: nauc_recall_at_20_max
value: 28.305350175132567
- type: nauc_recall_at_20_std
value: 3.8854966962347746
- type: nauc_recall_at_3_diff1
value: 51.22347884414916
- type: nauc_recall_at_3_max
value: 33.21612425601433
- type: nauc_recall_at_3_std
value: -4.48370860005988
- type: nauc_recall_at_5_diff1
value: 46.848014408337676
- type: nauc_recall_at_5_max
value: 31.254476917525555
- type: nauc_recall_at_5_std
value: -4.903427133365656
- type: ndcg_at_1
value: 30.237000000000002
- type: ndcg_at_10
value: 39.007
- type: ndcg_at_100
value: 44.585
- type: ndcg_at_1000
value: 47.464
- type: ndcg_at_20
value: 41.278999999999996
- type: ndcg_at_3
value: 34.472
- type: ndcg_at_5
value: 36.315
- type: precision_at_1
value: 30.237000000000002
- type: precision_at_10
value: 7.51
- type: precision_at_100
value: 1.478
- type: precision_at_1000
value: 0.234
- type: precision_at_20
value: 4.7829999999999995
- type: precision_at_3
value: 16.14
- type: precision_at_5
value: 11.462
- type: recall_at_1
value: 25.195
- type: recall_at_10
value: 49.507
- type: recall_at_100
value: 74.083
- type: recall_at_1000
value: 92.899
- type: recall_at_20
value: 58.291000000000004
- type: recall_at_3
value: 36.167
- type: recall_at_5
value: 41.749
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval (default)
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: main_score
value: 33.06
- type: map_at_1
value: 22.683
- type: map_at_10
value: 29.115000000000002
- type: map_at_100
value: 30.035
- type: map_at_1000
value: 30.141000000000002
- type: map_at_20
value: 29.585
- type: map_at_3
value: 27.436
- type: map_at_5
value: 28.186
- type: mrr_at_1
value: 24.953789279112755
- type: mrr_at_10
value: 31.512190828272157
- type: mrr_at_100
value: 32.30661079835987
- type: mrr_at_1000
value: 32.388485948646846
- type: mrr_at_20
value: 31.898454977555428
- type: mrr_at_3
value: 29.852125693160815
- type: mrr_at_5
value: 30.64695009242144
- type: nauc_map_at_1000_diff1
value: 41.37097481409692
- type: nauc_map_at_1000_max
value: 21.819472065390062
- type: nauc_map_at_1000_std
value: -5.511851233031371
- type: nauc_map_at_100_diff1
value: 41.38580981484577
- type: nauc_map_at_100_max
value: 21.796410887298222
- type: nauc_map_at_100_std
value: -5.56736379242138
- type: nauc_map_at_10_diff1
value: 41.63629903410976
- type: nauc_map_at_10_max
value: 21.90371149884218
- type: nauc_map_at_10_std
value: -6.152274677121426
- type: nauc_map_at_1_diff1
value: 45.84841941041374
- type: nauc_map_at_1_max
value: 20.461574274794568
- type: nauc_map_at_1_std
value: -7.769870515581234
- type: nauc_map_at_20_diff1
value: 41.616159838791376
- type: nauc_map_at_20_max
value: 21.879572436615728
- type: nauc_map_at_20_std
value: -6.001760143925003
- type: nauc_map_at_3_diff1
value: 42.690213994915474
- type: nauc_map_at_3_max
value: 21.35340820982141
- type: nauc_map_at_3_std
value: -6.118720026868332
- type: nauc_map_at_5_diff1
value: 42.107817663484575
- type: nauc_map_at_5_max
value: 22.02508826703247
- type: nauc_map_at_5_std
value: -5.655849953120985
- type: nauc_mrr_at_1000_diff1
value: 39.66954612386224
- type: nauc_mrr_at_1000_max
value: 22.150137067327954
- type: nauc_mrr_at_1000_std
value: -4.798006812425386
- type: nauc_mrr_at_100_diff1
value: 39.66409024535208
- type: nauc_mrr_at_100_max
value: 22.121525365416538
- type: nauc_mrr_at_100_std
value: -4.806603240713894
- type: nauc_mrr_at_10_diff1
value: 39.87117352487735
- type: nauc_mrr_at_10_max
value: 22.298568726426076
- type: nauc_mrr_at_10_std
value: -5.1451772190015195
- type: nauc_mrr_at_1_diff1
value: 43.86075692062394
- type: nauc_mrr_at_1_max
value: 20.51270620979276
- type: nauc_mrr_at_1_std
value: -7.589704558075294
- type: nauc_mrr_at_20_diff1
value: 39.820424398881215
- type: nauc_mrr_at_20_max
value: 22.173944895852095
- type: nauc_mrr_at_20_std
value: -5.0727540461865335
- type: nauc_mrr_at_3_diff1
value: 40.73278435693193
- type: nauc_mrr_at_3_max
value: 21.930995553135812
- type: nauc_mrr_at_3_std
value: -5.980722775097277
- type: nauc_mrr_at_5_diff1
value: 39.89679395564144
- type: nauc_mrr_at_5_max
value: 22.02821777103734
- type: nauc_mrr_at_5_std
value: -5.072135508421082
- type: nauc_ndcg_at_1000_diff1
value: 37.957587605367785
- type: nauc_ndcg_at_1000_max
value: 22.362257192820255
- type: nauc_ndcg_at_1000_std
value: -1.7757428668228084
- type: nauc_ndcg_at_100_diff1
value: 37.908544407246104
- type: nauc_ndcg_at_100_max
value: 21.536623476432354
- type: nauc_ndcg_at_100_std
value: -2.678355870833651
- type: nauc_ndcg_at_10_diff1
value: 39.36845261271005
- type: nauc_ndcg_at_10_max
value: 22.3150793248212
- type: nauc_ndcg_at_10_std
value: -5.646375413170874
- type: nauc_ndcg_at_1_diff1
value: 43.86075692062394
- type: nauc_ndcg_at_1_max
value: 20.51270620979276
- type: nauc_ndcg_at_1_std
value: -7.589704558075294
- type: nauc_ndcg_at_20_diff1
value: 39.30711049883703
- type: nauc_ndcg_at_20_max
value: 21.935544953883415
- type: nauc_ndcg_at_20_std
value: -5.20402304183158
- type: nauc_ndcg_at_3_diff1
value: 41.113286498750305
- type: nauc_ndcg_at_3_max
value: 21.635397999914282
- type: nauc_ndcg_at_3_std
value: -5.72866713630757
- type: nauc_ndcg_at_5_diff1
value: 40.06783309225114
- type: nauc_ndcg_at_5_max
value: 22.416356942701672
- type: nauc_ndcg_at_5_std
value: -4.886519038213331
- type: nauc_precision_at_1000_diff1
value: -17.52292838463402
- type: nauc_precision_at_1000_max
value: -5.389818321213827
- type: nauc_precision_at_1000_std
value: 26.772552854570375
- type: nauc_precision_at_100_diff1
value: 3.543169641476175
- type: nauc_precision_at_100_max
value: 9.574510694378198
- type: nauc_precision_at_100_std
value: 17.92832693421059
- type: nauc_precision_at_10_diff1
value: 24.894375565187694
- type: nauc_precision_at_10_max
value: 22.273016884986628
- type: nauc_precision_at_10_std
value: -0.32355612520474136
- type: nauc_precision_at_1_diff1
value: 43.86075692062394
- type: nauc_precision_at_1_max
value: 20.51270620979276
- type: nauc_precision_at_1_std
value: -7.589704558075294
- type: nauc_precision_at_20_diff1
value: 21.29826064932648
- type: nauc_precision_at_20_max
value: 19.79498027543001
- type: nauc_precision_at_20_std
value: 2.804941576632282
- type: nauc_precision_at_3_diff1
value: 33.72177316592598
- type: nauc_precision_at_3_max
value: 22.691241202228518
- type: nauc_precision_at_3_std
value: -2.7085967541341853
- type: nauc_precision_at_5_diff1
value: 30.51704379057159
- type: nauc_precision_at_5_max
value: 24.287775910544436
- type: nauc_precision_at_5_std
value: 0.6318618555538418
- type: nauc_recall_at_1000_diff1
value: 16.14163529457628
- type: nauc_recall_at_1000_max
value: 30.255937330833625
- type: nauc_recall_at_1000_std
value: 34.82149396857235
- type: nauc_recall_at_100_diff1
value: 24.81738199141423
- type: nauc_recall_at_100_max
value: 17.622405730191517
- type: nauc_recall_at_100_std
value: 9.943278532212068
- type: nauc_recall_at_10_diff1
value: 34.03447281460739
- type: nauc_recall_at_10_max
value: 22.077681180504047
- type: nauc_recall_at_10_std
value: -5.772153803762581
- type: nauc_recall_at_1_diff1
value: 45.84841941041374
- type: nauc_recall_at_1_max
value: 20.461574274794568
- type: nauc_recall_at_1_std
value: -7.769870515581234
- type: nauc_recall_at_20_diff1
value: 33.91749085377916
- type: nauc_recall_at_20_max
value: 20.226869969726543
- type: nauc_recall_at_20_std
value: -4.369285076602888
- type: nauc_recall_at_3_diff1
value: 38.25575445199975
- type: nauc_recall_at_3_max
value: 21.402983769895837
- type: nauc_recall_at_3_std
value: -5.96278802416301
- type: nauc_recall_at_5_diff1
value: 36.17314539524256
- type: nauc_recall_at_5_max
value: 23.115551795773314
- type: nauc_recall_at_5_std
value: -3.8407187471333697
- type: ndcg_at_1
value: 24.954
- type: ndcg_at_10
value: 33.06
- type: ndcg_at_100
value: 37.751000000000005
- type: ndcg_at_1000
value: 40.477000000000004
- type: ndcg_at_20
value: 34.587
- type: ndcg_at_3
value: 29.666999999999998
- type: ndcg_at_5
value: 30.929000000000002
- type: precision_at_1
value: 24.954
- type: precision_at_10
value: 4.972
- type: precision_at_100
value: 0.799
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_20
value: 2.874
- type: precision_at_3
value: 12.446
- type: precision_at_5
value: 8.244
- type: recall_at_1
value: 22.683
- type: recall_at_10
value: 42.775
- type: recall_at_100
value: 65.05300000000001
- type: recall_at_1000
value: 85.251
- type: recall_at_20
value: 48.512
- type: recall_at_3
value: 33.423
- type: recall_at_5
value: 36.571
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER (default)
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: main_score
value: 25.713
- type: map_at_1
value: 10.995000000000001
- type: map_at_10
value: 18.183
- type: map_at_100
value: 19.758
- type: map_at_1000
value: 19.93
- type: map_at_20
value: 19.023
- type: map_at_3
value: 15.126999999999999
- type: map_at_5
value: 16.521
- type: mrr_at_1
value: 23.908794788273617
- type: mrr_at_10
value: 34.419626699756996
- type: mrr_at_100
value: 35.42205880765744
- type: mrr_at_1000
value: 35.465636585855435
- type: mrr_at_20
value: 35.04560320193987
- type: mrr_at_3
value: 31.31378935939197
- type: mrr_at_5
value: 32.98154180238871
- type: nauc_map_at_1000_diff1
value: 30.808649871031978
- type: nauc_map_at_1000_max
value: 38.44733700268257
- type: nauc_map_at_1000_std
value: 24.83849154952647
- type: nauc_map_at_100_diff1
value: 30.817681439188565
- type: nauc_map_at_100_max
value: 38.38165009049118
- type: nauc_map_at_100_std
value: 24.75945437667734
- type: nauc_map_at_10_diff1
value: 31.016072728955457
- type: nauc_map_at_10_max
value: 37.78482154934025
- type: nauc_map_at_10_std
value: 22.73087477402899
- type: nauc_map_at_1_diff1
value: 38.13786017193742
- type: nauc_map_at_1_max
value: 34.897924276187446
- type: nauc_map_at_1_std
value: 15.197914019142733
- type: nauc_map_at_20_diff1
value: 30.93811389613207
- type: nauc_map_at_20_max
value: 38.018621558175084
- type: nauc_map_at_20_std
value: 23.87402074626538
- type: nauc_map_at_3_diff1
value: 32.694558487234204
- type: nauc_map_at_3_max
value: 37.452175644150344
- type: nauc_map_at_3_std
value: 20.06796990357737
- type: nauc_map_at_5_diff1
value: 31.654957870346784
- type: nauc_map_at_5_max
value: 37.04115114192235
- type: nauc_map_at_5_std
value: 21.129693545324375
- type: nauc_mrr_at_1000_diff1
value: 29.802772421913403
- type: nauc_mrr_at_1000_max
value: 38.000278050301176
- type: nauc_mrr_at_1000_std
value: 23.48992856904152
- type: nauc_mrr_at_100_diff1
value: 29.788014379597026
- type: nauc_mrr_at_100_max
value: 38.0070275486147
- type: nauc_mrr_at_100_std
value: 23.522736661530086
- type: nauc_mrr_at_10_diff1
value: 29.5812602078958
- type: nauc_mrr_at_10_max
value: 37.73314132006107
- type: nauc_mrr_at_10_std
value: 23.34339817425411
- type: nauc_mrr_at_1_diff1
value: 36.24696165314146
- type: nauc_mrr_at_1_max
value: 36.63498565688475
- type: nauc_mrr_at_1_std
value: 16.627906626261446
- type: nauc_mrr_at_20_diff1
value: 29.765297131181562
- type: nauc_mrr_at_20_max
value: 37.8739248069123
- type: nauc_mrr_at_20_std
value: 23.44526626055555
- type: nauc_mrr_at_3_diff1
value: 30.428492046004795
- type: nauc_mrr_at_3_max
value: 37.917848006886125
- type: nauc_mrr_at_3_std
value: 21.90161780585706
- type: nauc_mrr_at_5_diff1
value: 29.93977431566972
- type: nauc_mrr_at_5_max
value: 37.69690203746751
- type: nauc_mrr_at_5_std
value: 22.75274068799061
- type: nauc_ndcg_at_1000_diff1
value: 27.523183792167266
- type: nauc_ndcg_at_1000_max
value: 40.93757048012577
- type: nauc_ndcg_at_1000_std
value: 32.30396817658341
- type: nauc_ndcg_at_100_diff1
value: 27.454763301587064
- type: nauc_ndcg_at_100_max
value: 40.45039618287942
- type: nauc_ndcg_at_100_std
value: 31.795801743619663
- type: nauc_ndcg_at_10_diff1
value: 28.012456489936806
- type: nauc_ndcg_at_10_max
value: 38.045278212869825
- type: nauc_ndcg_at_10_std
value: 25.963041085823978
- type: nauc_ndcg_at_1_diff1
value: 35.99513984271449
- type: nauc_ndcg_at_1_max
value: 36.62771507516844
- type: nauc_ndcg_at_1_std
value: 16.726124822038052
- type: nauc_ndcg_at_20_diff1
value: 28.012111240688963
- type: nauc_ndcg_at_20_max
value: 38.667107321330555
- type: nauc_ndcg_at_20_std
value: 28.198245721076976
- type: nauc_ndcg_at_3_diff1
value: 30.33073102826854
- type: nauc_ndcg_at_3_max
value: 37.995789997615354
- type: nauc_ndcg_at_3_std
value: 22.304331918813876
- type: nauc_ndcg_at_5_diff1
value: 29.141028641237632
- type: nauc_ndcg_at_5_max
value: 37.2113360591228
- type: nauc_ndcg_at_5_std
value: 23.53066714165745
- type: nauc_precision_at_1000_diff1
value: -1.0646702024743917
- type: nauc_precision_at_1000_max
value: 19.304218995700534
- type: nauc_precision_at_1000_std
value: 31.73840122818843
- type: nauc_precision_at_100_diff1
value: 5.427804568412734
- type: nauc_precision_at_100_max
value: 27.90881278884377
- type: nauc_precision_at_100_std
value: 38.45326235114876
- type: nauc_precision_at_10_diff1
value: 14.252021242340863
- type: nauc_precision_at_10_max
value: 32.047078663067914
- type: nauc_precision_at_10_std
value: 30.621835328899426
- type: nauc_precision_at_1_diff1
value: 35.99513984271449
- type: nauc_precision_at_1_max
value: 36.62771507516844
- type: nauc_precision_at_1_std
value: 16.726124822038052
- type: nauc_precision_at_20_diff1
value: 12.017354269524972
- type: nauc_precision_at_20_max
value: 29.906152963561322
- type: nauc_precision_at_20_std
value: 33.764105037332264
- type: nauc_precision_at_3_diff1
value: 23.486354895398577
- type: nauc_precision_at_3_max
value: 38.45096435794749
- type: nauc_precision_at_3_std
value: 26.636452479567645
- type: nauc_precision_at_5_diff1
value: 19.574760607896973
- type: nauc_precision_at_5_max
value: 34.51474571826715
- type: nauc_precision_at_5_std
value: 28.514859235740904
- type: nauc_recall_at_1000_diff1
value: 12.801905007251246
- type: nauc_recall_at_1000_max
value: 37.49463996225108
- type: nauc_recall_at_1000_std
value: 45.46087045204742
- type: nauc_recall_at_100_diff1
value: 15.082886168560034
- type: nauc_recall_at_100_max
value: 35.720813725614
- type: nauc_recall_at_100_std
value: 39.876934524809215
- type: nauc_recall_at_10_diff1
value: 20.08086437796489
- type: nauc_recall_at_10_max
value: 33.418507169063815
- type: nauc_recall_at_10_std
value: 27.309080075299562
- type: nauc_recall_at_1_diff1
value: 38.13786017193742
- type: nauc_recall_at_1_max
value: 34.897924276187446
- type: nauc_recall_at_1_std
value: 15.197914019142733
- type: nauc_recall_at_20_diff1
value: 18.984980462200134
- type: nauc_recall_at_20_max
value: 32.95474022914299
- type: nauc_recall_at_20_std
value: 30.77553423574554
- type: nauc_recall_at_3_diff1
value: 26.670776366276865
- type: nauc_recall_at_3_max
value: 37.07230392845629
- type: nauc_recall_at_3_std
value: 23.385309818709757
- type: nauc_recall_at_5_diff1
value: 23.45569235165577
- type: nauc_recall_at_5_max
value: 34.014688386664524
- type: nauc_recall_at_5_std
value: 24.50194439244803
- type: ndcg_at_1
value: 23.974
- type: ndcg_at_10
value: 25.713
- type: ndcg_at_100
value: 32.349
- type: ndcg_at_1000
value: 35.615
- type: ndcg_at_20
value: 28.28
- type: ndcg_at_3
value: 20.761
- type: ndcg_at_5
value: 22.225
- type: precision_at_1
value: 23.974
- type: precision_at_10
value: 8.052
- type: precision_at_100
value: 1.5110000000000001
- type: precision_at_1000
value: 0.211
- type: precision_at_20
value: 5.106999999999999
- type: precision_at_3
value: 15.157000000000002
- type: precision_at_5
value: 11.557
- type: recall_at_1
value: 10.995000000000001
- type: recall_at_10
value: 31.05
- type: recall_at_100
value: 54.233
- type: recall_at_1000
value: 72.75500000000001
- type: recall_at_20
value: 38.442
- type: recall_at_3
value: 18.839
- type: recall_at_5
value: 23.26
- task:
type: Retrieval
dataset:
name: MTEB DBPedia (default)
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: main_score
value: 40.091
- type: map_at_1
value: 8.112
- type: map_at_10
value: 18.911
- type: map_at_100
value: 27.29
- type: map_at_1000
value: 28.749000000000002
- type: map_at_20
value: 22.187
- type: map_at_3
value: 13.177
- type: map_at_5
value: 15.723999999999998
- type: mrr_at_1
value: 64.75
- type: mrr_at_10
value: 73.0328373015873
- type: mrr_at_100
value: 73.3904467983012
- type: mrr_at_1000
value: 73.40582528487944
- type: mrr_at_20
value: 73.25613317925624
- type: mrr_at_3
value: 71.58333333333333
- type: mrr_at_5
value: 72.52083333333333
- type: nauc_map_at_1000_diff1
value: 30.326073419291667
- type: nauc_map_at_1000_max
value: 41.2485655499243
- type: nauc_map_at_1000_std
value: 34.68797882732488
- type: nauc_map_at_100_diff1
value: 30.250567651424635
- type: nauc_map_at_100_max
value: 39.591743243203275
- type: nauc_map_at_100_std
value: 32.14962028433263
- type: nauc_map_at_10_diff1
value: 28.30330426974147
- type: nauc_map_at_10_max
value: 24.685858800003153
- type: nauc_map_at_10_std
value: 6.991461788881313
- type: nauc_map_at_1_diff1
value: 37.84825245885128
- type: nauc_map_at_1_max
value: 10.784383140794167
- type: nauc_map_at_1_std
value: -12.413788028731759
- type: nauc_map_at_20_diff1
value: 30.56644002866712
- type: nauc_map_at_20_max
value: 32.09850095008104
- type: nauc_map_at_20_std
value: 17.68312732143373
- type: nauc_map_at_3_diff1
value: 26.94636553986902
- type: nauc_map_at_3_max
value: 13.716258156642672
- type: nauc_map_at_3_std
value: -7.919396887763491
- type: nauc_map_at_5_diff1
value: 26.703766272524305
- type: nauc_map_at_5_max
value: 18.493432579075815
- type: nauc_map_at_5_std
value: -1.7953102028408285
- type: nauc_mrr_at_1000_diff1
value: 56.5585700690547
- type: nauc_mrr_at_1000_max
value: 68.59723304665478
- type: nauc_mrr_at_1000_std
value: 41.65741817361127
- type: nauc_mrr_at_100_diff1
value: 56.56488475063903
- type: nauc_mrr_at_100_max
value: 68.59436880973041
- type: nauc_mrr_at_100_std
value: 41.64008885243909
- type: nauc_mrr_at_10_diff1
value: 56.57992847970396
- type: nauc_mrr_at_10_max
value: 68.54809322422658
- type: nauc_mrr_at_10_std
value: 41.637196787701605
- type: nauc_mrr_at_1_diff1
value: 59.49013430944212
- type: nauc_mrr_at_1_max
value: 67.51266363522255
- type: nauc_mrr_at_1_std
value: 39.159077933489094
- type: nauc_mrr_at_20_diff1
value: 56.322141799066195
- type: nauc_mrr_at_20_max
value: 68.41241085079113
- type: nauc_mrr_at_20_std
value: 41.74023776153815
- type: nauc_mrr_at_3_diff1
value: 56.43465566121455
- type: nauc_mrr_at_3_max
value: 69.32027688455301
- type: nauc_mrr_at_3_std
value: 42.35441414676036
- type: nauc_mrr_at_5_diff1
value: 56.185426652218126
- type: nauc_mrr_at_5_max
value: 68.68507625781251
- type: nauc_mrr_at_5_std
value: 42.227673261247816
- type: nauc_ndcg_at_1000_diff1
value: 38.452991805224926
- type: nauc_ndcg_at_1000_max
value: 55.49295294630129
- type: nauc_ndcg_at_1000_std
value: 47.669258273236046
- type: nauc_ndcg_at_100_diff1
value: 37.94112950003329
- type: nauc_ndcg_at_100_max
value: 50.68816850295493
- type: nauc_ndcg_at_100_std
value: 40.72315230606931
- type: nauc_ndcg_at_10_diff1
value: 38.47467764455152
- type: nauc_ndcg_at_10_max
value: 49.25673297040027
- type: nauc_ndcg_at_10_std
value: 36.76815739343767
- type: nauc_ndcg_at_1_diff1
value: 54.434593584664995
- type: nauc_ndcg_at_1_max
value: 57.61369658753043
- type: nauc_ndcg_at_1_std
value: 33.10284117958805
- type: nauc_ndcg_at_20_diff1
value: 38.3053661549299
- type: nauc_ndcg_at_20_max
value: 49.26702623701029
- type: nauc_ndcg_at_20_std
value: 36.78366426340987
- type: nauc_ndcg_at_3_diff1
value: 38.34783510078573
- type: nauc_ndcg_at_3_max
value: 51.181351973892085
- type: nauc_ndcg_at_3_std
value: 35.13771937716931
- type: nauc_ndcg_at_5_diff1
value: 38.73137682217783
- type: nauc_ndcg_at_5_max
value: 51.289826741923875
- type: nauc_ndcg_at_5_std
value: 36.76670998246709
- type: nauc_precision_at_1000_diff1
value: -8.37698697546597
- type: nauc_precision_at_1000_max
value: 4.649648259545355
- type: nauc_precision_at_1000_std
value: 15.100762512885371
- type: nauc_precision_at_100_diff1
value: 4.538510496829277
- type: nauc_precision_at_100_max
value: 33.573044920932965
- type: nauc_precision_at_100_std
value: 50.15177354474223
- type: nauc_precision_at_10_diff1
value: 16.03217990213501
- type: nauc_precision_at_10_max
value: 45.22978979054545
- type: nauc_precision_at_10_std
value: 53.103286665555295
- type: nauc_precision_at_1_diff1
value: 59.49013430944212
- type: nauc_precision_at_1_max
value: 67.51266363522255
- type: nauc_precision_at_1_std
value: 39.159077933489094
- type: nauc_precision_at_20_diff1
value: 13.705605238285958
- type: nauc_precision_at_20_max
value: 44.08365262009368
- type: nauc_precision_at_20_std
value: 56.050420219607155
- type: nauc_precision_at_3_diff1
value: 21.409861522316014
- type: nauc_precision_at_3_max
value: 48.93702948445578
- type: nauc_precision_at_3_std
value: 42.8419067771303
- type: nauc_precision_at_5_diff1
value: 20.1310639195609
- type: nauc_precision_at_5_max
value: 49.59134352761235
- type: nauc_precision_at_5_std
value: 48.98546957350543
- type: nauc_recall_at_1000_diff1
value: 27.181172941984112
- type: nauc_recall_at_1000_max
value: 49.20832060504127
- type: nauc_recall_at_1000_std
value: 50.58754027710416
- type: nauc_recall_at_100_diff1
value: 25.831239736658713
- type: nauc_recall_at_100_max
value: 37.92978899965714
- type: nauc_recall_at_100_std
value: 32.84155059838547
- type: nauc_recall_at_10_diff1
value: 21.03971256731199
- type: nauc_recall_at_10_max
value: 16.34542184400448
- type: nauc_recall_at_10_std
value: 1.624004078039708
- type: nauc_recall_at_1_diff1
value: 37.84825245885128
- type: nauc_recall_at_1_max
value: 10.784383140794167
- type: nauc_recall_at_1_std
value: -12.413788028731759
- type: nauc_recall_at_20_diff1
value: 23.612410438391652
- type: nauc_recall_at_20_max
value: 24.731496668584725
- type: nauc_recall_at_20_std
value: 11.94162779763853
- type: nauc_recall_at_3_diff1
value: 21.124250217970754
- type: nauc_recall_at_3_max
value: 9.581953839031879
- type: nauc_recall_at_3_std
value: -9.955224094610848
- type: nauc_recall_at_5_diff1
value: 20.272821143755714
- type: nauc_recall_at_5_max
value: 12.80122421686649
- type: nauc_recall_at_5_std
value: -4.822509659730001
- type: ndcg_at_1
value: 52.87500000000001
- type: ndcg_at_10
value: 40.091
- type: ndcg_at_100
value: 45.007999999999996
- type: ndcg_at_1000
value: 51.522
- type: ndcg_at_20
value: 39.953
- type: ndcg_at_3
value: 44.627
- type: ndcg_at_5
value: 41.748000000000005
- type: precision_at_1
value: 64.75
- type: precision_at_10
value: 32.324999999999996
- type: precision_at_100
value: 10.583
- type: precision_at_1000
value: 1.992
- type: precision_at_20
value: 25.15
- type: precision_at_3
value: 48.5
- type: precision_at_5
value: 40.8
- type: recall_at_1
value: 8.112
- type: recall_at_10
value: 24.769
- type: recall_at_100
value: 51.92400000000001
- type: recall_at_1000
value: 72.60799999999999
- type: recall_at_20
value: 32.085
- type: recall_at_3
value: 14.707999999999998
- type: recall_at_5
value: 18.881
- task:
type: Classification
dataset:
name: MTEB EmotionClassification (default)
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 74.88499999999999
- type: f1
value: 69.55769956653745
- type: f1_weighted
value: 75.98938892167276
- type: main_score
value: 74.88499999999999
- task:
type: Retrieval
dataset:
name: MTEB FEVER (default)
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: main_score
value: 86.088
- type: map_at_1
value: 74.21
- type: map_at_10
value: 82.238
- type: map_at_100
value: 82.467
- type: map_at_1000
value: 82.48
- type: map_at_20
value: 82.38
- type: map_at_3
value: 81.178
- type: map_at_5
value: 81.882
- type: mrr_at_1
value: 80.04800480048004
- type: mrr_at_10
value: 87.28162697222103
- type: mrr_at_100
value: 87.36425501689853
- type: mrr_at_1000
value: 87.36494888408146
- type: mrr_at_20
value: 87.33488767030532
- type: mrr_at_3
value: 86.5011501150115
- type: mrr_at_5
value: 87.04345434543454
- type: nauc_map_at_1000_diff1
value: 46.86807158039652
- type: nauc_map_at_1000_max
value: 17.537735239936584
- type: nauc_map_at_1000_std
value: -6.180991548000637
- type: nauc_map_at_100_diff1
value: 46.840981153123515
- type: nauc_map_at_100_max
value: 17.51241604543591
- type: nauc_map_at_100_std
value: -6.19572402233368
- type: nauc_map_at_10_diff1
value: 46.63164937877156
- type: nauc_map_at_10_max
value: 17.396231277218714
- type: nauc_map_at_10_std
value: -6.328960389468633
- type: nauc_map_at_1_diff1
value: 51.91442444295392
- type: nauc_map_at_1_max
value: 14.772868336313651
- type: nauc_map_at_1_std
value: -7.924628073687737
- type: nauc_map_at_20_diff1
value: 46.78996154399
- type: nauc_map_at_20_max
value: 17.52594082408568
- type: nauc_map_at_20_std
value: -6.2535816636418255
- type: nauc_map_at_3_diff1
value: 46.86720061616425
- type: nauc_map_at_3_max
value: 17.17282268255638
- type: nauc_map_at_3_std
value: -7.100454400283953
- type: nauc_map_at_5_diff1
value: 46.743320728340485
- type: nauc_map_at_5_max
value: 17.22026822962506
- type: nauc_map_at_5_std
value: -6.593983297795947
- type: nauc_mrr_at_1000_diff1
value: 64.22963921921831
- type: nauc_mrr_at_1000_max
value: 22.50147928007347
- type: nauc_mrr_at_1000_std
value: -10.753338651031981
- type: nauc_mrr_at_100_diff1
value: 64.22599646741416
- type: nauc_mrr_at_100_max
value: 22.49976292804203
- type: nauc_mrr_at_100_std
value: -10.753324625089736
- type: nauc_mrr_at_10_diff1
value: 64.24857003564016
- type: nauc_mrr_at_10_max
value: 22.721448283312323
- type: nauc_mrr_at_10_std
value: -10.698659951469375
- type: nauc_mrr_at_1_diff1
value: 65.80017393845672
- type: nauc_mrr_at_1_max
value: 19.56658619771462
- type: nauc_mrr_at_1_std
value: -10.691529848056236
- type: nauc_mrr_at_20_diff1
value: 64.22606211105564
- type: nauc_mrr_at_20_max
value: 22.60630203277465
- type: nauc_mrr_at_20_std
value: -10.698352035527936
- type: nauc_mrr_at_3_diff1
value: 64.03189495070804
- type: nauc_mrr_at_3_max
value: 23.197599099302078
- type: nauc_mrr_at_3_std
value: -10.941260656610341
- type: nauc_mrr_at_5_diff1
value: 64.21946450636831
- type: nauc_mrr_at_5_max
value: 22.869883457504613
- type: nauc_mrr_at_5_std
value: -10.773375222905306
- type: nauc_ndcg_at_1000_diff1
value: 48.18634946007256
- type: nauc_ndcg_at_1000_max
value: 19.635685645181443
- type: nauc_ndcg_at_1000_std
value: -5.008615485203909
- type: nauc_ndcg_at_100_diff1
value: 47.460702424024646
- type: nauc_ndcg_at_100_max
value: 19.197829510466093
- type: nauc_ndcg_at_100_std
value: -5.141098235552701
- type: nauc_ndcg_at_10_diff1
value: 46.75967320832195
- type: nauc_ndcg_at_10_max
value: 19.162998560532944
- type: nauc_ndcg_at_10_std
value: -5.680454888720109
- type: nauc_ndcg_at_1_diff1
value: 65.80017393845672
- type: nauc_ndcg_at_1_max
value: 19.56658619771462
- type: nauc_ndcg_at_1_std
value: -10.691529848056236
- type: nauc_ndcg_at_20_diff1
value: 47.15063801450417
- type: nauc_ndcg_at_20_max
value: 19.387976860064036
- type: nauc_ndcg_at_20_std
value: -5.434429887556901
- type: nauc_ndcg_at_3_diff1
value: 48.48013879703285
- type: nauc_ndcg_at_3_max
value: 19.563845683013074
- type: nauc_ndcg_at_3_std
value: -7.306366856511263
- type: nauc_ndcg_at_5_diff1
value: 47.4477936851643
- type: nauc_ndcg_at_5_max
value: 19.12745930840238
- type: nauc_ndcg_at_5_std
value: -6.338914655492511
- type: nauc_precision_at_1000_diff1
value: -4.975768805829236
- type: nauc_precision_at_1000_max
value: 10.078421203817527
- type: nauc_precision_at_1000_std
value: 10.15753365579419
- type: nauc_precision_at_100_diff1
value: -7.411336519288538
- type: nauc_precision_at_100_max
value: 11.116507499213043
- type: nauc_precision_at_100_std
value: 11.608241877542543
- type: nauc_precision_at_10_diff1
value: 2.6403449208341274
- type: nauc_precision_at_10_max
value: 20.668398953238633
- type: nauc_precision_at_10_std
value: 7.433281722501917
- type: nauc_precision_at_1_diff1
value: 65.80017393845672
- type: nauc_precision_at_1_max
value: 19.56658619771462
- type: nauc_precision_at_1_std
value: -10.691529848056236
- type: nauc_precision_at_20_diff1
value: -1.286553967637511
- type: nauc_precision_at_20_max
value: 17.30405603464926
- type: nauc_precision_at_20_std
value: 9.234773655809756
- type: nauc_precision_at_3_diff1
value: 31.364166410646675
- type: nauc_precision_at_3_max
value: 26.397101881343527
- type: nauc_precision_at_3_std
value: -5.0543954546843946
- type: nauc_precision_at_5_diff1
value: 17.1466778085294
- type: nauc_precision_at_5_max
value: 23.18905254179433
- type: nauc_precision_at_5_std
value: 1.6051724821489612
- type: nauc_recall_at_1000_diff1
value: -3.9377049069087935
- type: nauc_recall_at_1000_max
value: 27.168346654704095
- type: nauc_recall_at_1000_std
value: 38.58463265497753
- type: nauc_recall_at_100_diff1
value: -1.886570080947599
- type: nauc_recall_at_100_max
value: 16.12930964320666
- type: nauc_recall_at_100_std
value: 21.616391259129152
- type: nauc_recall_at_10_diff1
value: 15.941506685002588
- type: nauc_recall_at_10_max
value: 19.141995524332728
- type: nauc_recall_at_10_std
value: 5.860480767168416
- type: nauc_recall_at_1_diff1
value: 51.91442444295392
- type: nauc_recall_at_1_max
value: 14.772868336313651
- type: nauc_recall_at_1_std
value: -7.924628073687737
- type: nauc_recall_at_20_diff1
value: 11.583722825668058
- type: nauc_recall_at_20_max
value: 19.867221612869876
- type: nauc_recall_at_20_std
value: 10.141960757453084
- type: nauc_recall_at_3_diff1
value: 32.30936424972365
- type: nauc_recall_at_3_max
value: 20.11705236473992
- type: nauc_recall_at_3_std
value: -3.525144821962635
- type: nauc_recall_at_5_diff1
value: 25.68392975410304
- type: nauc_recall_at_5_max
value: 19.221295609032595
- type: nauc_recall_at_5_std
value: 0.576160647152633
- type: ndcg_at_1
value: 80.048
- type: ndcg_at_10
value: 86.088
- type: ndcg_at_100
value: 86.911
- type: ndcg_at_1000
value: 87.125
- type: ndcg_at_20
value: 86.468
- type: ndcg_at_3
value: 84.375
- type: ndcg_at_5
value: 85.384
- type: precision_at_1
value: 80.048
- type: precision_at_10
value: 10.236
- type: precision_at_100
value: 1.085
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_20
value: 5.2330000000000005
- type: precision_at_3
value: 32.078
- type: precision_at_5
value: 19.895
- type: recall_at_1
value: 74.21
- type: recall_at_10
value: 93.077
- type: recall_at_100
value: 96.348
- type: recall_at_1000
value: 97.65700000000001
- type: recall_at_20
value: 94.36099999999999
- type: recall_at_3
value: 88.337
- type: recall_at_5
value: 90.948
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018 (default)
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: main_score
value: 45.405
- type: map_at_1
value: 22.325
- type: map_at_10
value: 36.975
- type: map_at_100
value: 38.846000000000004
- type: map_at_1000
value: 39.012
- type: map_at_20
value: 37.958999999999996
- type: map_at_3
value: 32.208
- type: map_at_5
value: 34.928
- type: mrr_at_1
value: 44.29012345679013
- type: mrr_at_10
value: 54.02030668234372
- type: mrr_at_100
value: 54.72897336245347
- type: mrr_at_1000
value: 54.76320283944561
- type: mrr_at_20
value: 54.50419077165938
- type: mrr_at_3
value: 51.41460905349795
- type: mrr_at_5
value: 53.11213991769548
- type: nauc_map_at_1000_diff1
value: 42.33950505310022
- type: nauc_map_at_1000_max
value: 32.814158723141745
- type: nauc_map_at_1000_std
value: -4.5297230544932825
- type: nauc_map_at_100_diff1
value: 42.316327406548695
- type: nauc_map_at_100_max
value: 32.706900013479725
- type: nauc_map_at_100_std
value: -4.564571222935577
- type: nauc_map_at_10_diff1
value: 42.17734361420548
- type: nauc_map_at_10_max
value: 31.527366385827854
- type: nauc_map_at_10_std
value: -5.559289874353945
- type: nauc_map_at_1_diff1
value: 47.33003471166015
- type: nauc_map_at_1_max
value: 21.535228737020457
- type: nauc_map_at_1_std
value: -11.649016586524858
- type: nauc_map_at_20_diff1
value: 42.11015618170868
- type: nauc_map_at_20_max
value: 32.18582282622051
- type: nauc_map_at_20_std
value: -5.042968429993695
- type: nauc_map_at_3_diff1
value: 43.26686524198236
- type: nauc_map_at_3_max
value: 28.849395895564083
- type: nauc_map_at_3_std
value: -6.976952334117308
- type: nauc_map_at_5_diff1
value: 42.95893517901293
- type: nauc_map_at_5_max
value: 30.871999781837612
- type: nauc_map_at_5_std
value: -6.149645006139908
- type: nauc_mrr_at_1000_diff1
value: 51.23708914241626
- type: nauc_mrr_at_1000_max
value: 40.298960389709
- type: nauc_mrr_at_1000_std
value: -5.188577391773796
- type: nauc_mrr_at_100_diff1
value: 51.24001351681103
- type: nauc_mrr_at_100_max
value: 40.318755039260886
- type: nauc_mrr_at_100_std
value: -5.164744512057911
- type: nauc_mrr_at_10_diff1
value: 51.116323465364566
- type: nauc_mrr_at_10_max
value: 40.18322650792177
- type: nauc_mrr_at_10_std
value: -5.42707335446156
- type: nauc_mrr_at_1_diff1
value: 54.623685354463625
- type: nauc_mrr_at_1_max
value: 38.52800456113852
- type: nauc_mrr_at_1_std
value: -8.561342078884513
- type: nauc_mrr_at_20_diff1
value: 51.082878864924076
- type: nauc_mrr_at_20_max
value: 40.25224355621811
- type: nauc_mrr_at_20_std
value: -5.1386035874860925
- type: nauc_mrr_at_3_diff1
value: 51.28771495504919
- type: nauc_mrr_at_3_max
value: 40.167661702884644
- type: nauc_mrr_at_3_std
value: -6.672938174195537
- type: nauc_mrr_at_5_diff1
value: 51.386811950131026
- type: nauc_mrr_at_5_max
value: 40.29452825209631
- type: nauc_mrr_at_5_std
value: -6.134184637482388
- type: nauc_ndcg_at_1000_diff1
value: 44.46948002237412
- type: nauc_ndcg_at_1000_max
value: 37.882877667376576
- type: nauc_ndcg_at_1000_std
value: -0.2441149985965938
- type: nauc_ndcg_at_100_diff1
value: 43.96014037390138
- type: nauc_ndcg_at_100_max
value: 36.96423036666587
- type: nauc_ndcg_at_100_std
value: 0.21228554480998071
- type: nauc_ndcg_at_10_diff1
value: 42.889923047150226
- type: nauc_ndcg_at_10_max
value: 33.95406097914127
- type: nauc_ndcg_at_10_std
value: -3.3077129078149796
- type: nauc_ndcg_at_1_diff1
value: 54.623685354463625
- type: nauc_ndcg_at_1_max
value: 38.52800456113852
- type: nauc_ndcg_at_1_std
value: -8.561342078884513
- type: nauc_ndcg_at_20_diff1
value: 42.806846626799626
- type: nauc_ndcg_at_20_max
value: 35.01566424207401
- type: nauc_ndcg_at_20_std
value: -2.01466646308545
- type: nauc_ndcg_at_3_diff1
value: 43.29070711758635
- type: nauc_ndcg_at_3_max
value: 35.81474510295669
- type: nauc_ndcg_at_3_std
value: -4.937712863159993
- type: nauc_ndcg_at_5_diff1
value: 43.533204764747346
- type: nauc_ndcg_at_5_max
value: 34.67200578229001
- type: nauc_ndcg_at_5_std
value: -4.220153646752217
- type: nauc_precision_at_1000_diff1
value: -0.24162611684046686
- type: nauc_precision_at_1000_max
value: 26.610031730319122
- type: nauc_precision_at_1000_std
value: 12.85473387814076
- type: nauc_precision_at_100_diff1
value: 6.593767812518609
- type: nauc_precision_at_100_max
value: 32.89478475065496
- type: nauc_precision_at_100_std
value: 16.66995461135905
- type: nauc_precision_at_10_diff1
value: 17.48446148168886
- type: nauc_precision_at_10_max
value: 36.54732448382068
- type: nauc_precision_at_10_std
value: 6.7478320020402
- type: nauc_precision_at_1_diff1
value: 54.623685354463625
- type: nauc_precision_at_1_max
value: 38.52800456113852
- type: nauc_precision_at_1_std
value: -8.561342078884513
- type: nauc_precision_at_20_diff1
value: 13.039974734569537
- type: nauc_precision_at_20_max
value: 36.49695572253983
- type: nauc_precision_at_20_std
value: 10.476938728091008
- type: nauc_precision_at_3_diff1
value: 30.19928557150241
- type: nauc_precision_at_3_max
value: 38.897101267116554
- type: nauc_precision_at_3_std
value: 1.121533090916794
- type: nauc_precision_at_5_diff1
value: 25.33029636435617
- type: nauc_precision_at_5_max
value: 39.59677600835699
- type: nauc_precision_at_5_std
value: 3.4416095155763244
- type: nauc_recall_at_1000_diff1
value: 34.823080033440434
- type: nauc_recall_at_1000_max
value: 43.87066795154745
- type: nauc_recall_at_1000_std
value: 42.23182031662749
- type: nauc_recall_at_100_diff1
value: 30.70809572521992
- type: nauc_recall_at_100_max
value: 31.598064007837852
- type: nauc_recall_at_100_std
value: 20.758185821213164
- type: nauc_recall_at_10_diff1
value: 30.674660204386957
- type: nauc_recall_at_10_max
value: 25.13675931430177
- type: nauc_recall_at_10_std
value: 1.1493152709013974
- type: nauc_recall_at_1_diff1
value: 47.33003471166015
- type: nauc_recall_at_1_max
value: 21.535228737020457
- type: nauc_recall_at_1_std
value: -11.649016586524858
- type: nauc_recall_at_20_diff1
value: 28.60023313868174
- type: nauc_recall_at_20_max
value: 26.576577612640655
- type: nauc_recall_at_20_std
value: 6.331498880910594
- type: nauc_recall_at_3_diff1
value: 36.61359637854836
- type: nauc_recall_at_3_max
value: 26.205709444189345
- type: nauc_recall_at_3_std
value: -4.41772315378875
- type: nauc_recall_at_5_diff1
value: 34.721622588958894
- type: nauc_recall_at_5_max
value: 26.870375540274104
- type: nauc_recall_at_5_std
value: -1.2959303042762926
- type: ndcg_at_1
value: 44.29
- type: ndcg_at_10
value: 45.405
- type: ndcg_at_100
value: 52.027
- type: ndcg_at_1000
value: 54.688
- type: ndcg_at_20
value: 47.967999999999996
- type: ndcg_at_3
value: 41.496
- type: ndcg_at_5
value: 42.902
- type: precision_at_1
value: 44.29
- type: precision_at_10
value: 12.469
- type: precision_at_100
value: 1.9349999999999998
- type: precision_at_1000
value: 0.243
- type: precision_at_20
value: 7.323
- type: precision_at_3
value: 27.622999999999998
- type: precision_at_5
value: 20.34
- type: recall_at_1
value: 22.325
- type: recall_at_10
value: 52.788999999999994
- type: recall_at_100
value: 77.274
- type: recall_at_1000
value: 92.94
- type: recall_at_20
value: 60.714
- type: recall_at_3
value: 37.502
- type: recall_at_5
value: 44.808
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA (default)
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: main_score
value: 66.661
- type: map_at_1
value: 41.418
- type: map_at_10
value: 57.086999999999996
- type: map_at_100
value: 57.888
- type: map_at_1000
value: 57.955
- type: map_at_20
value: 57.544
- type: map_at_3
value: 54.112
- type: map_at_5
value: 55.942
- type: mrr_at_1
value: 82.79540850776502
- type: mrr_at_10
value: 87.24545298650632
- type: mrr_at_100
value: 87.3943716521154
- type: mrr_at_1000
value: 87.40052014901985
- type: mrr_at_20
value: 87.3376988773675
- type: mrr_at_3
value: 86.54287643484132
- type: mrr_at_5
value: 87.0162052667117
- type: nauc_map_at_1000_diff1
value: 13.347058320450778
- type: nauc_map_at_1000_max
value: 19.172918193696585
- type: nauc_map_at_1000_std
value: 1.6085652199402172
- type: nauc_map_at_100_diff1
value: 13.309459563369677
- type: nauc_map_at_100_max
value: 19.142490361521045
- type: nauc_map_at_100_std
value: 1.5997757026480046
- type: nauc_map_at_10_diff1
value: 13.821467981397284
- type: nauc_map_at_10_max
value: 19.47388049912085
- type: nauc_map_at_10_std
value: 0.7945082440633815
- type: nauc_map_at_1_diff1
value: 80.17822133984255
- type: nauc_map_at_1_max
value: 56.93232002015388
- type: nauc_map_at_1_std
value: -9.565010407038201
- type: nauc_map_at_20_diff1
value: 13.447193497393146
- type: nauc_map_at_20_max
value: 19.208078541028097
- type: nauc_map_at_20_std
value: 1.2699537557176803
- type: nauc_map_at_3_diff1
value: 16.854345839107967
- type: nauc_map_at_3_max
value: 21.648192526975727
- type: nauc_map_at_3_std
value: -0.6137487567045511
- type: nauc_map_at_5_diff1
value: 14.543663008536509
- type: nauc_map_at_5_max
value: 20.155541895741532
- type: nauc_map_at_5_std
value: 0.25148082760110224
- type: nauc_mrr_at_1000_diff1
value: 79.11825919796162
- type: nauc_mrr_at_1000_max
value: 60.10563640048739
- type: nauc_mrr_at_1000_std
value: -6.726621618014327
- type: nauc_mrr_at_100_diff1
value: 79.11854278578646
- type: nauc_mrr_at_100_max
value: 60.11377258817985
- type: nauc_mrr_at_100_std
value: -6.704065951576038
- type: nauc_mrr_at_10_diff1
value: 79.07961808239499
- type: nauc_mrr_at_10_max
value: 60.2138079214177
- type: nauc_mrr_at_10_std
value: -6.74779578820509
- type: nauc_mrr_at_1_diff1
value: 80.25371155548501
- type: nauc_mrr_at_1_max
value: 57.01027352172217
- type: nauc_mrr_at_1_std
value: -9.682353752598317
- type: nauc_mrr_at_20_diff1
value: 79.08786670986484
- type: nauc_mrr_at_20_max
value: 60.139471646688925
- type: nauc_mrr_at_20_std
value: -6.720404576075471
- type: nauc_mrr_at_3_diff1
value: 78.93741620023842
- type: nauc_mrr_at_3_max
value: 60.31902114928829
- type: nauc_mrr_at_3_std
value: -7.066082480981481
- type: nauc_mrr_at_5_diff1
value: 79.06255305350973
- type: nauc_mrr_at_5_max
value: 60.344631571197546
- type: nauc_mrr_at_5_std
value: -6.788165280997917
- type: nauc_ndcg_at_1000_diff1
value: 17.006951693217548
- type: nauc_ndcg_at_1000_max
value: 21.854859924097646
- type: nauc_ndcg_at_1000_std
value: 4.70138835806943
- type: nauc_ndcg_at_100_diff1
value: 16.195007796313384
- type: nauc_ndcg_at_100_max
value: 21.264332841663858
- type: nauc_ndcg_at_100_std
value: 4.620999926841355
- type: nauc_ndcg_at_10_diff1
value: 18.327522629298294
- type: nauc_ndcg_at_10_max
value: 22.686509071566917
- type: nauc_ndcg_at_10_std
value: 1.5527071297942836
- type: nauc_ndcg_at_1_diff1
value: 80.17822133984255
- type: nauc_ndcg_at_1_max
value: 56.93232002015388
- type: nauc_ndcg_at_1_std
value: -9.565010407038201
- type: nauc_ndcg_at_20_diff1
value: 17.11074173500959
- type: nauc_ndcg_at_20_max
value: 21.81160814631424
- type: nauc_ndcg_at_20_std
value: 2.858829825220597
- type: nauc_ndcg_at_3_diff1
value: 23.797089205140068
- type: nauc_ndcg_at_3_max
value: 26.659269305908296
- type: nauc_ndcg_at_3_std
value: -0.7545654502076451
- type: nauc_ndcg_at_5_diff1
value: 20.067483031938934
- type: nauc_ndcg_at_5_max
value: 24.23026610511652
- type: nauc_ndcg_at_5_std
value: 0.5097749208107711
- type: nauc_precision_at_1000_diff1
value: -21.807728330326697
- type: nauc_precision_at_1000_max
value: -2.9835997103120344
- type: nauc_precision_at_1000_std
value: 25.81739799194849
- type: nauc_precision_at_100_diff1
value: -16.05478872817429
- type: nauc_precision_at_100_max
value: 0.2665969008515287
- type: nauc_precision_at_100_std
value: 19.352798394287323
- type: nauc_precision_at_10_diff1
value: -3.3507602135961037
- type: nauc_precision_at_10_max
value: 8.867034772304718
- type: nauc_precision_at_10_std
value: 6.545361194526079
- type: nauc_precision_at_1_diff1
value: 80.17822133984255
- type: nauc_precision_at_1_max
value: 56.93232002015388
- type: nauc_precision_at_1_std
value: -9.565010407038201
- type: nauc_precision_at_20_diff1
value: -7.902542409127802
- type: nauc_precision_at_20_max
value: 5.62428878283396
- type: nauc_precision_at_20_std
value: 10.592045512127914
- type: nauc_precision_at_3_diff1
value: 8.132713424441485
- type: nauc_precision_at_3_max
value: 17.99416677485544
- type: nauc_precision_at_3_std
value: 1.9785114664304215
- type: nauc_precision_at_5_diff1
value: 1.38596734740728
- type: nauc_precision_at_5_max
value: 13.214138500817723
- type: nauc_precision_at_5_std
value: 4.15378198762281
- type: nauc_recall_at_1000_diff1
value: -21.807728330326455
- type: nauc_recall_at_1000_max
value: -2.9835997103117293
- type: nauc_recall_at_1000_std
value: 25.8173979919487
- type: nauc_recall_at_100_diff1
value: -16.054788728174266
- type: nauc_recall_at_100_max
value: 0.26659690085157123
- type: nauc_recall_at_100_std
value: 19.35279839428729
- type: nauc_recall_at_10_diff1
value: -3.350760213596107
- type: nauc_recall_at_10_max
value: 8.86703477230471
- type: nauc_recall_at_10_std
value: 6.5453611945261505
- type: nauc_recall_at_1_diff1
value: 80.17822133984255
- type: nauc_recall_at_1_max
value: 56.93232002015388
- type: nauc_recall_at_1_std
value: -9.565010407038201
- type: nauc_recall_at_20_diff1
value: -7.902542409127704
- type: nauc_recall_at_20_max
value: 5.6242887828340375
- type: nauc_recall_at_20_std
value: 10.592045512127953
- type: nauc_recall_at_3_diff1
value: 8.132713424441446
- type: nauc_recall_at_3_max
value: 17.99416677485538
- type: nauc_recall_at_3_std
value: 1.9785114664303751
- type: nauc_recall_at_5_diff1
value: 1.3859673474071779
- type: nauc_recall_at_5_max
value: 13.214138500817668
- type: nauc_recall_at_5_std
value: 4.153781987622754
- type: ndcg_at_1
value: 82.836
- type: ndcg_at_10
value: 66.661
- type: ndcg_at_100
value: 69.42399999999999
- type: ndcg_at_1000
value: 70.722
- type: ndcg_at_20
value: 67.777
- type: ndcg_at_3
value: 62.517
- type: ndcg_at_5
value: 64.79700000000001
- type: precision_at_1
value: 82.836
- type: precision_at_10
value: 13.350000000000001
- type: precision_at_100
value: 1.552
- type: precision_at_1000
value: 0.172
- type: precision_at_20
value: 7.034
- type: precision_at_3
value: 38.375
- type: precision_at_5
value: 24.829
- type: recall_at_1
value: 41.418
- type: recall_at_10
value: 66.752
- type: recall_at_100
value: 77.576
- type: recall_at_1000
value: 86.199
- type: recall_at_20
value: 70.338
- type: recall_at_3
value: 57.562000000000005
- type: recall_at_5
value: 62.073
- task:
type: Classification
dataset:
name: MTEB ImdbClassification (default)
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 93.58840000000001
- type: ap
value: 90.234834378287
- type: ap_weighted
value: 90.234834378287
- type: f1
value: 93.58346966422063
- type: f1_weighted
value: 93.58346966422063
- type: main_score
value: 93.58840000000001
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO (default)
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: main_score
value: 41.48
- type: map_at_1
value: 22.078999999999997
- type: map_at_10
value: 34.416000000000004
- type: map_at_100
value: 35.541
- type: map_at_1000
value: 35.592
- type: map_at_20
value: 35.106
- type: map_at_3
value: 30.470000000000002
- type: map_at_5
value: 32.774
- type: mrr_at_1
value: 22.693409742120345
- type: mrr_at_10
value: 35.02055760221949
- type: mrr_at_100
value: 36.07282466487795
- type: mrr_at_1000
value: 36.11725121701468
- type: mrr_at_20
value: 35.667140877547986
- type: mrr_at_3
value: 31.122254059216814
- type: mrr_at_5
value: 33.40592168099331
- type: nauc_map_at_1000_diff1
value: 33.00333472064972
- type: nauc_map_at_1000_max
value: 5.156444947074947
- type: nauc_map_at_1000_std
value: -23.103939979826375
- type: nauc_map_at_100_diff1
value: 32.99943906977456
- type: nauc_map_at_100_max
value: 5.156792638157342
- type: nauc_map_at_100_std
value: -23.09927789432014
- type: nauc_map_at_10_diff1
value: 32.93427060211673
- type: nauc_map_at_10_max
value: 5.009847068055439
- type: nauc_map_at_10_std
value: -23.69229778425936
- type: nauc_map_at_1_diff1
value: 35.879541770806426
- type: nauc_map_at_1_max
value: 4.037000551161811
- type: nauc_map_at_1_std
value: -21.066913542507095
- type: nauc_map_at_20_diff1
value: 32.94459306136245
- type: nauc_map_at_20_max
value: 5.08450123260384
- type: nauc_map_at_20_std
value: -23.367858842401674
- type: nauc_map_at_3_diff1
value: 33.186734646971495
- type: nauc_map_at_3_max
value: 4.52958372002426
- type: nauc_map_at_3_std
value: -23.407182657661863
- type: nauc_map_at_5_diff1
value: 33.09447602825229
- type: nauc_map_at_5_max
value: 4.8295482352066275
- type: nauc_map_at_5_std
value: -23.977226416616457
- type: nauc_mrr_at_1000_diff1
value: 32.90248885790994
- type: nauc_mrr_at_1000_max
value: 5.345915497836417
- type: nauc_mrr_at_1000_std
value: -22.775176728644926
- type: nauc_mrr_at_100_diff1
value: 32.89830733234614
- type: nauc_mrr_at_100_max
value: 5.354794932204688
- type: nauc_mrr_at_100_std
value: -22.76281634843283
- type: nauc_mrr_at_10_diff1
value: 32.85362740239939
- type: nauc_mrr_at_10_max
value: 5.22277263020967
- type: nauc_mrr_at_10_std
value: -23.29890783663585
- type: nauc_mrr_at_1_diff1
value: 35.8004961400585
- type: nauc_mrr_at_1_max
value: 4.07480515690297
- type: nauc_mrr_at_1_std
value: -21.157419860722133
- type: nauc_mrr_at_20_diff1
value: 32.831058277421675
- type: nauc_mrr_at_20_max
value: 5.30231502729234
- type: nauc_mrr_at_20_std
value: -22.995188734787643
- type: nauc_mrr_at_3_diff1
value: 33.06512398614513
- type: nauc_mrr_at_3_max
value: 4.6832127086497675
- type: nauc_mrr_at_3_std
value: -23.185466086324016
- type: nauc_mrr_at_5_diff1
value: 32.95656016095678
- type: nauc_mrr_at_5_max
value: 5.0055516099566475
- type: nauc_mrr_at_5_std
value: -23.648076417104612
- type: nauc_ndcg_at_1000_diff1
value: 32.23911068627994
- type: nauc_ndcg_at_1000_max
value: 6.340890121521923
- type: nauc_ndcg_at_1000_std
value: -21.64542687396577
- type: nauc_ndcg_at_100_diff1
value: 32.11878167303473
- type: nauc_ndcg_at_100_max
value: 6.597128552520879
- type: nauc_ndcg_at_100_std
value: -21.03041945862791
- type: nauc_ndcg_at_10_diff1
value: 31.78511231016483
- type: nauc_ndcg_at_10_max
value: 5.784417481640047
- type: nauc_ndcg_at_10_std
value: -24.161027978905647
- type: nauc_ndcg_at_1_diff1
value: 35.74394132968329
- type: nauc_ndcg_at_1_max
value: 4.0476454646619215
- type: nauc_ndcg_at_1_std
value: -21.16866068260486
- type: nauc_ndcg_at_20_diff1
value: 31.722628551526604
- type: nauc_ndcg_at_20_max
value: 6.085473579598258
- type: nauc_ndcg_at_20_std
value: -23.01301453978275
- type: nauc_ndcg_at_3_diff1
value: 32.38743175334077
- type: nauc_ndcg_at_3_max
value: 4.708074286110014
- type: nauc_ndcg_at_3_std
value: -24.005841131351065
- type: nauc_ndcg_at_5_diff1
value: 32.19107640366649
- type: nauc_ndcg_at_5_max
value: 5.248392125691872
- type: nauc_ndcg_at_5_std
value: -24.9544454485758
- type: nauc_precision_at_1000_diff1
value: -2.0283123762593203
- type: nauc_precision_at_1000_max
value: 14.569550330630554
- type: nauc_precision_at_1000_std
value: 18.01811212416059
- type: nauc_precision_at_100_diff1
value: 14.463485381374719
- type: nauc_precision_at_100_max
value: 16.06415646423591
- type: nauc_precision_at_100_std
value: 8.987627462107199
- type: nauc_precision_at_10_diff1
value: 25.530846925228666
- type: nauc_precision_at_10_max
value: 8.075830710803086
- type: nauc_precision_at_10_std
value: -24.00010341583341
- type: nauc_precision_at_1_diff1
value: 35.74394132968329
- type: nauc_precision_at_1_max
value: 4.0476454646619215
- type: nauc_precision_at_1_std
value: -21.16866068260486
- type: nauc_precision_at_20_diff1
value: 22.490315165998652
- type: nauc_precision_at_20_max
value: 9.695438542678712
- type: nauc_precision_at_20_std
value: -16.779150840743586
- type: nauc_precision_at_3_diff1
value: 29.653053865297718
- type: nauc_precision_at_3_max
value: 4.956580341717329
- type: nauc_precision_at_3_std
value: -25.716768027801912
- type: nauc_precision_at_5_diff1
value: 28.466584677280675
- type: nauc_precision_at_5_max
value: 6.035813186905091
- type: nauc_precision_at_5_std
value: -27.40096435134959
- type: nauc_recall_at_1000_diff1
value: 16.188777617075157
- type: nauc_recall_at_1000_max
value: 45.1160674872711
- type: nauc_recall_at_1000_std
value: 50.8993030763505
- type: nauc_recall_at_100_diff1
value: 26.462748511423666
- type: nauc_recall_at_100_max
value: 20.17057177381908
- type: nauc_recall_at_100_std
value: 6.567222385661084
- type: nauc_recall_at_10_diff1
value: 27.694042744869897
- type: nauc_recall_at_10_max
value: 8.193922397003126
- type: nauc_recall_at_10_std
value: -25.428481461107726
- type: nauc_recall_at_1_diff1
value: 35.879541770806426
- type: nauc_recall_at_1_max
value: 4.037000551161811
- type: nauc_recall_at_1_std
value: -21.066913542507095
- type: nauc_recall_at_20_diff1
value: 26.412542837917503
- type: nauc_recall_at_20_max
value: 10.119778040160208
- type: nauc_recall_at_20_std
value: -20.353583276762542
- type: nauc_recall_at_3_diff1
value: 30.1723792933633
- type: nauc_recall_at_3_max
value: 4.991021506511908
- type: nauc_recall_at_3_std
value: -25.61028187578253
- type: nauc_recall_at_5_diff1
value: 29.546460816157307
- type: nauc_recall_at_5_max
value: 6.257065735729789
- type: nauc_recall_at_5_std
value: -27.757268209659046
- type: ndcg_at_1
value: 22.708000000000002
- type: ndcg_at_10
value: 41.48
- type: ndcg_at_100
value: 46.894999999999996
- type: ndcg_at_1000
value: 48.14
- type: ndcg_at_20
value: 43.918
- type: ndcg_at_3
value: 33.423
- type: ndcg_at_5
value: 37.553
- type: precision_at_1
value: 22.708000000000002
- type: precision_at_10
value: 6.6049999999999995
- type: precision_at_100
value: 0.9329999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 3.811
- type: precision_at_3
value: 14.283999999999999
- type: precision_at_5
value: 10.685
- type: recall_at_1
value: 22.078999999999997
- type: recall_at_10
value: 63.269
- type: recall_at_100
value: 88.318
- type: recall_at_1000
value: 97.80799999999999
- type: recall_at_20
value: 72.741
- type: recall_at_3
value: 41.347
- type: recall_at_5
value: 51.271
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.0373917008664
- type: f1
value: 95.77672920037678
- type: f1_weighted
value: 96.06299804062722
- type: main_score
value: 96.0373917008664
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 89.1655266757866
- type: f1
value: 71.6595596649587
- type: f1_weighted
value: 90.44597470884298
- type: main_score
value: 89.1655266757866
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 76.60390047074647
- type: f1
value: 74.0382414657559
- type: f1_weighted
value: 76.53055023019932
- type: main_score
value: 76.60390047074647
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 78.93073301950236
- type: f1
value: 78.58195068346751
- type: f1_weighted
value: 78.86975899493798
- type: main_score
value: 78.93073301950236
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P (default)
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: main_score
value: 37.66500681777215
- type: v_measure
value: 37.66500681777215
- type: v_measure_std
value: 1.4953449515069268
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S (default)
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: main_score
value: 35.51021437644991
- type: v_measure
value: 35.51021437644991
- type: v_measure_std
value: 1.3321174913629759
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking (default)
type: mteb/mind_small
config: default
split: test
revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7
metrics:
- type: main_score
value: 30.10020452046386
- type: map
value: 30.10020452046386
- type: mrr
value: 31.096861019258043
- type: nAUC_map_diff1
value: 12.853085612418742
- type: nAUC_map_max
value: -20.97077158351351
- type: nAUC_map_std
value: -2.459841546804226
- type: nAUC_mrr_diff1
value: 12.08750595893558
- type: nAUC_mrr_max
value: -15.502813020230475
- type: nAUC_mrr_std
value: -0.8069966088331175
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus (default)
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: main_score
value: 34.725
- type: map_at_1
value: 5.901
- type: map_at_10
value: 12.992999999999999
- type: map_at_100
value: 16.402
- type: map_at_1000
value: 17.896
- type: map_at_20
value: 14.411
- type: map_at_3
value: 9.3
- type: map_at_5
value: 10.906
- type: mrr_at_1
value: 46.13003095975232
- type: mrr_at_10
value: 54.67123691581895
- type: mrr_at_100
value: 55.13154466663215
- type: mrr_at_1000
value: 55.18028030923489
- type: mrr_at_20
value: 54.89203403371564
- type: mrr_at_3
value: 52.47678018575851
- type: mrr_at_5
value: 54.10216718266254
- type: nauc_map_at_1000_diff1
value: 26.097980547292376
- type: nauc_map_at_1000_max
value: 31.716612190607847
- type: nauc_map_at_1000_std
value: 10.484226609845875
- type: nauc_map_at_100_diff1
value: 26.903184213500687
- type: nauc_map_at_100_max
value: 30.254077338590847
- type: nauc_map_at_100_std
value: 5.721213154053636
- type: nauc_map_at_10_diff1
value: 30.41995975934737
- type: nauc_map_at_10_max
value: 23.720851152044826
- type: nauc_map_at_10_std
value: -6.968119243629756
- type: nauc_map_at_1_diff1
value: 45.91087927776542
- type: nauc_map_at_1_max
value: 11.368756627277754
- type: nauc_map_at_1_std
value: -21.987291617576854
- type: nauc_map_at_20_diff1
value: 28.907069629931854
- type: nauc_map_at_20_max
value: 26.70846407056094
- type: nauc_map_at_20_std
value: -1.9126005785897775
- type: nauc_map_at_3_diff1
value: 38.73155355719495
- type: nauc_map_at_3_max
value: 17.769925571726496
- type: nauc_map_at_3_std
value: -15.240426410962574
- type: nauc_map_at_5_diff1
value: 34.6278617589197
- type: nauc_map_at_5_max
value: 20.54601986245645
- type: nauc_map_at_5_std
value: -11.566817873968779
- type: nauc_mrr_at_1000_diff1
value: 36.64991509982144
- type: nauc_mrr_at_1000_max
value: 49.697173212531744
- type: nauc_mrr_at_1000_std
value: 26.86511696261478
- type: nauc_mrr_at_100_diff1
value: 36.68743394598715
- type: nauc_mrr_at_100_max
value: 49.744202083676264
- type: nauc_mrr_at_100_std
value: 26.90232555840209
- type: nauc_mrr_at_10_diff1
value: 36.47029954847764
- type: nauc_mrr_at_10_max
value: 49.439023284006
- type: nauc_mrr_at_10_std
value: 26.690706480930444
- type: nauc_mrr_at_1_diff1
value: 36.59190142546215
- type: nauc_mrr_at_1_max
value: 41.74235868276634
- type: nauc_mrr_at_1_std
value: 18.414274177675807
- type: nauc_mrr_at_20_diff1
value: 36.681072119690086
- type: nauc_mrr_at_20_max
value: 49.800936007548934
- type: nauc_mrr_at_20_std
value: 26.961504252981683
- type: nauc_mrr_at_3_diff1
value: 36.63303178691115
- type: nauc_mrr_at_3_max
value: 48.628730526802904
- type: nauc_mrr_at_3_std
value: 25.157181938589225
- type: nauc_mrr_at_5_diff1
value: 36.41948638139246
- type: nauc_mrr_at_5_max
value: 49.180007480727134
- type: nauc_mrr_at_5_std
value: 26.145567865350543
- type: nauc_ndcg_at_1000_diff1
value: 26.257313381009283
- type: nauc_ndcg_at_1000_max
value: 46.45094846583072
- type: nauc_ndcg_at_1000_std
value: 30.74855470405661
- type: nauc_ndcg_at_100_diff1
value: 25.337713280261774
- type: nauc_ndcg_at_100_max
value: 42.51314175786316
- type: nauc_ndcg_at_100_std
value: 25.717600091835052
- type: nauc_ndcg_at_10_diff1
value: 27.28963504973803
- type: nauc_ndcg_at_10_max
value: 45.07020624629025
- type: nauc_ndcg_at_10_std
value: 29.017215904691902
- type: nauc_ndcg_at_1_diff1
value: 39.69547779212674
- type: nauc_ndcg_at_1_max
value: 39.944550572400225
- type: nauc_ndcg_at_1_std
value: 17.27308663512775
- type: nauc_ndcg_at_20_diff1
value: 26.88029364873597
- type: nauc_ndcg_at_20_max
value: 43.89319625918324
- type: nauc_ndcg_at_20_std
value: 29.182590252122804
- type: nauc_ndcg_at_3_diff1
value: 32.49288862835273
- type: nauc_ndcg_at_3_max
value: 45.57318753977976
- type: nauc_ndcg_at_3_std
value: 23.953534500127557
- type: nauc_ndcg_at_5_diff1
value: 29.578845399866545
- type: nauc_ndcg_at_5_max
value: 46.601862971633544
- type: nauc_ndcg_at_5_std
value: 27.55565792973463
- type: nauc_precision_at_1000_diff1
value: -4.397392180783799
- type: nauc_precision_at_1000_max
value: 17.406927055459345
- type: nauc_precision_at_1000_std
value: 47.8835834302276
- type: nauc_precision_at_100_diff1
value: -3.582470870457778
- type: nauc_precision_at_100_max
value: 30.6298826448415
- type: nauc_precision_at_100_std
value: 55.54858727751579
- type: nauc_precision_at_10_diff1
value: 6.591245947478634
- type: nauc_precision_at_10_max
value: 44.36069671353394
- type: nauc_precision_at_10_std
value: 45.85949796089425
- type: nauc_precision_at_1_diff1
value: 39.90620183792372
- type: nauc_precision_at_1_max
value: 41.93832955553217
- type: nauc_precision_at_1_std
value: 17.78208215842155
- type: nauc_precision_at_20_diff1
value: 3.1763559888676305
- type: nauc_precision_at_20_max
value: 40.19013491290661
- type: nauc_precision_at_20_std
value: 50.30896997510246
- type: nauc_precision_at_3_diff1
value: 21.346541990363338
- type: nauc_precision_at_3_max
value: 46.358486907663234
- type: nauc_precision_at_3_std
value: 30.30796100013066
- type: nauc_precision_at_5_diff1
value: 13.764960158282511
- type: nauc_precision_at_5_max
value: 47.38189520644064
- type: nauc_precision_at_5_std
value: 38.83370975791448
- type: nauc_recall_at_1000_diff1
value: 3.111013627981912
- type: nauc_recall_at_1000_max
value: 17.453303474327654
- type: nauc_recall_at_1000_std
value: 16.831446977812252
- type: nauc_recall_at_100_diff1
value: 16.59425078697382
- type: nauc_recall_at_100_max
value: 25.400896109980174
- type: nauc_recall_at_100_std
value: 10.794971059479254
- type: nauc_recall_at_10_diff1
value: 23.63271460212068
- type: nauc_recall_at_10_max
value: 20.991264958049598
- type: nauc_recall_at_10_std
value: -6.022250169253036
- type: nauc_recall_at_1_diff1
value: 45.91087927776542
- type: nauc_recall_at_1_max
value: 11.368756627277754
- type: nauc_recall_at_1_std
value: -21.987291617576854
- type: nauc_recall_at_20_diff1
value: 22.615984500854555
- type: nauc_recall_at_20_max
value: 23.637250829352997
- type: nauc_recall_at_20_std
value: 0.41128528477486354
- type: nauc_recall_at_3_diff1
value: 37.308271400820985
- type: nauc_recall_at_3_max
value: 18.63584930406467
- type: nauc_recall_at_3_std
value: -13.472251033244428
- type: nauc_recall_at_5_diff1
value: 31.142005435540852
- type: nauc_recall_at_5_max
value: 20.5834454794761
- type: nauc_recall_at_5_std
value: -9.81034234508067
- type: ndcg_at_1
value: 42.879
- type: ndcg_at_10
value: 34.725
- type: ndcg_at_100
value: 31.798
- type: ndcg_at_1000
value: 40.486
- type: ndcg_at_20
value: 32.535
- type: ndcg_at_3
value: 38.97
- type: ndcg_at_5
value: 37.602000000000004
- type: precision_at_1
value: 44.891999999999996
- type: precision_at_10
value: 26.192
- type: precision_at_100
value: 8.241
- type: precision_at_1000
value: 2.085
- type: precision_at_20
value: 19.52
- type: precision_at_3
value: 36.842000000000006
- type: precision_at_5
value: 33.312999999999995
- type: recall_at_1
value: 5.901
- type: recall_at_10
value: 17.171
- type: recall_at_100
value: 31.709
- type: recall_at_1000
value: 63.589
- type: recall_at_20
value: 20.782999999999998
- type: recall_at_3
value: 10.194
- type: recall_at_5
value: 12.934999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ (default)
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: main_score
value: 59.951
- type: map_at_1
value: 36.718
- type: map_at_10
value: 52.518
- type: map_at_100
value: 53.373000000000005
- type: map_at_1000
value: 53.400000000000006
- type: map_at_20
value: 53.11
- type: map_at_3
value: 48.606
- type: map_at_5
value: 50.922999999999995
- type: mrr_at_1
value: 41.22247972190035
- type: mrr_at_10
value: 55.10211471610661
- type: mrr_at_100
value: 55.690424468447944
- type: mrr_at_1000
value: 55.709587669000626
- type: mrr_at_20
value: 55.51307514935747
- type: mrr_at_3
value: 52.10023174971031
- type: mrr_at_5
value: 53.85139049826188
- type: nauc_map_at_1000_diff1
value: 36.084432495766244
- type: nauc_map_at_1000_max
value: 32.106683448614696
- type: nauc_map_at_1000_std
value: 0.28114600458421135
- type: nauc_map_at_100_diff1
value: 36.076754155834685
- type: nauc_map_at_100_max
value: 32.124501222653386
- type: nauc_map_at_100_std
value: 0.3074172933687319
- type: nauc_map_at_10_diff1
value: 35.95846264899338
- type: nauc_map_at_10_max
value: 32.268962480678645
- type: nauc_map_at_10_std
value: -0.10550275250265802
- type: nauc_map_at_1_diff1
value: 39.29370524773578
- type: nauc_map_at_1_max
value: 25.991296131217062
- type: nauc_map_at_1_std
value: -2.5540466996583753
- type: nauc_map_at_20_diff1
value: 35.98377971994357
- type: nauc_map_at_20_max
value: 32.15683504409824
- type: nauc_map_at_20_std
value: 0.19145693127134786
- type: nauc_map_at_3_diff1
value: 36.0944254890347
- type: nauc_map_at_3_max
value: 30.2128510665515
- type: nauc_map_at_3_std
value: -1.9611081461308983
- type: nauc_map_at_5_diff1
value: 36.00156289591984
- type: nauc_map_at_5_max
value: 31.56149465902775
- type: nauc_map_at_5_std
value: -0.8373235686244762
- type: nauc_mrr_at_1000_diff1
value: 36.09152753153953
- type: nauc_mrr_at_1000_max
value: 32.43454228496553
- type: nauc_mrr_at_1000_std
value: 1.8517892571605596
- type: nauc_mrr_at_100_diff1
value: 36.09112009133751
- type: nauc_mrr_at_100_max
value: 32.44951869408173
- type: nauc_mrr_at_100_std
value: 1.8714844618486277
- type: nauc_mrr_at_10_diff1
value: 35.930421137614914
- type: nauc_mrr_at_10_max
value: 32.65451978743636
- type: nauc_mrr_at_10_std
value: 1.7723190829619009
- type: nauc_mrr_at_1_diff1
value: 39.396024242346954
- type: nauc_mrr_at_1_max
value: 28.132740347350953
- type: nauc_mrr_at_1_std
value: -0.5935576215439111
- type: nauc_mrr_at_20_diff1
value: 35.99903536497898
- type: nauc_mrr_at_20_max
value: 32.50256539352071
- type: nauc_mrr_at_20_std
value: 1.8829977887370852
- type: nauc_mrr_at_3_diff1
value: 35.91812477028109
- type: nauc_mrr_at_3_max
value: 31.595134192404796
- type: nauc_mrr_at_3_std
value: 0.6749658339604261
- type: nauc_mrr_at_5_diff1
value: 35.90541524153257
- type: nauc_mrr_at_5_max
value: 32.375076970871106
- type: nauc_mrr_at_5_std
value: 1.4530009988326982
- type: nauc_ndcg_at_1000_diff1
value: 35.52189976546703
- type: nauc_ndcg_at_1000_max
value: 33.97534043055662
- type: nauc_ndcg_at_1000_std
value: 2.7358127566748025
- type: nauc_ndcg_at_100_diff1
value: 35.32967760887528
- type: nauc_ndcg_at_100_max
value: 34.51536712950666
- type: nauc_ndcg_at_100_std
value: 3.561484184520643
- type: nauc_ndcg_at_10_diff1
value: 34.63981443982384
- type: nauc_ndcg_at_10_max
value: 35.2466755214177
- type: nauc_ndcg_at_10_std
value: 2.163469830591493
- type: nauc_ndcg_at_1_diff1
value: 39.47234805254548
- type: nauc_ndcg_at_1_max
value: 27.949377920983448
- type: nauc_ndcg_at_1_std
value: -0.7016496183295023
- type: nauc_ndcg_at_20_diff1
value: 34.77193782885647
- type: nauc_ndcg_at_20_max
value: 34.79563187118757
- type: nauc_ndcg_at_20_std
value: 3.0333339734937326
- type: nauc_ndcg_at_3_diff1
value: 34.84410905343334
- type: nauc_ndcg_at_3_max
value: 31.53857235413653
- type: nauc_ndcg_at_3_std
value: -1.2121011083371147
- type: nauc_ndcg_at_5_diff1
value: 34.70655373953545
- type: nauc_ndcg_at_5_max
value: 33.692790095442994
- type: nauc_ndcg_at_5_std
value: 0.6612260001056149
- type: nauc_precision_at_1000_diff1
value: -6.531497758654776
- type: nauc_precision_at_1000_max
value: 6.592383443768815
- type: nauc_precision_at_1000_std
value: 15.266065986503547
- type: nauc_precision_at_100_diff1
value: -2.0738709139302003
- type: nauc_precision_at_100_max
value: 15.324594432362842
- type: nauc_precision_at_100_std
value: 20.825895623533857
- type: nauc_precision_at_10_diff1
value: 9.98637582589397
- type: nauc_precision_at_10_max
value: 30.50457748285925
- type: nauc_precision_at_10_std
value: 13.73313229149034
- type: nauc_precision_at_1_diff1
value: 39.47234805254548
- type: nauc_precision_at_1_max
value: 27.949377920983448
- type: nauc_precision_at_1_std
value: -0.7016496183295023
- type: nauc_precision_at_20_diff1
value: 4.338247023429635
- type: nauc_precision_at_20_max
value: 23.76589815146598
- type: nauc_precision_at_20_std
value: 17.322633618978386
- type: nauc_precision_at_3_diff1
value: 23.17326950999716
- type: nauc_precision_at_3_max
value: 31.075717350827293
- type: nauc_precision_at_3_std
value: 2.762436540576557
- type: nauc_precision_at_5_diff1
value: 17.362008096246633
- type: nauc_precision_at_5_max
value: 32.08805696305664
- type: nauc_precision_at_5_std
value: 8.12524167169048
- type: nauc_recall_at_1000_diff1
value: 34.18415215294108
- type: nauc_recall_at_1000_max
value: 79.77930971993527
- type: nauc_recall_at_1000_std
value: 70.27189175741741
- type: nauc_recall_at_100_diff1
value: 28.249629521143465
- type: nauc_recall_at_100_max
value: 62.21529072406605
- type: nauc_recall_at_100_std
value: 46.23141649265807
- type: nauc_recall_at_10_diff1
value: 27.302420328273612
- type: nauc_recall_at_10_max
value: 47.57999826869166
- type: nauc_recall_at_10_std
value: 9.807109630878386
- type: nauc_recall_at_1_diff1
value: 39.29370524773578
- type: nauc_recall_at_1_max
value: 25.991296131217062
- type: nauc_recall_at_1_std
value: -2.5540466996583753
- type: nauc_recall_at_20_diff1
value: 26.264363964930997
- type: nauc_recall_at_20_max
value: 49.762297304442136
- type: nauc_recall_at_20_std
value: 18.650695925686502
- type: nauc_recall_at_3_diff1
value: 29.95231482486556
- type: nauc_recall_at_3_max
value: 33.054441143791394
- type: nauc_recall_at_3_std
value: -1.4133288694811754
- type: nauc_recall_at_5_diff1
value: 28.978660648633802
- type: nauc_recall_at_5_max
value: 38.844300548161186
- type: nauc_recall_at_5_std
value: 3.19644809086287
- type: ndcg_at_1
value: 41.193999999999996
- type: ndcg_at_10
value: 59.951
- type: ndcg_at_100
value: 63.343
- type: ndcg_at_1000
value: 63.941
- type: ndcg_at_20
value: 61.781
- type: ndcg_at_3
value: 52.756
- type: ndcg_at_5
value: 56.486999999999995
- type: precision_at_1
value: 41.193999999999996
- type: precision_at_10
value: 9.528
- type: precision_at_100
value: 1.145
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 5.206
- type: precision_at_3
value: 23.696
- type: precision_at_5
value: 16.419
- type: recall_at_1
value: 36.718
- type: recall_at_10
value: 79.84
- type: recall_at_100
value: 94.228
- type: recall_at_1000
value: 98.648
- type: recall_at_20
value: 86.542
- type: recall_at_3
value: 61.31999999999999
- type: recall_at_5
value: 69.836
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval (default)
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: main_score
value: 89.838
- type: map_at_1
value: 72.44500000000001
- type: map_at_10
value: 86.332
- type: map_at_100
value: 86.936
- type: map_at_1000
value: 86.95
- type: map_at_20
value: 86.72999999999999
- type: map_at_3
value: 83.417
- type: map_at_5
value: 85.292
- type: mrr_at_1
value: 83.5
- type: mrr_at_10
value: 89.20519444444444
- type: mrr_at_100
value: 89.2819086258491
- type: mrr_at_1000
value: 89.28214505128291
- type: mrr_at_20
value: 89.26673258007042
- type: mrr_at_3
value: 88.36
- type: mrr_at_5
value: 88.95100000000001
- type: nauc_map_at_1000_diff1
value: 76.90740671940051
- type: nauc_map_at_1000_max
value: 36.46444946338708
- type: nauc_map_at_1000_std
value: -56.60380240532508
- type: nauc_map_at_100_diff1
value: 76.91112078761572
- type: nauc_map_at_100_max
value: 36.45304363618243
- type: nauc_map_at_100_std
value: -56.67988410741111
- type: nauc_map_at_10_diff1
value: 77.09598611046616
- type: nauc_map_at_10_max
value: 35.96689922341558
- type: nauc_map_at_10_std
value: -58.68604909203303
- type: nauc_map_at_1_diff1
value: 80.37641963929528
- type: nauc_map_at_1_max
value: 27.046973659136057
- type: nauc_map_at_1_std
value: -49.41187376826384
- type: nauc_map_at_20_diff1
value: 76.9541622063172
- type: nauc_map_at_20_max
value: 36.29817666157097
- type: nauc_map_at_20_std
value: -57.58995860118392
- type: nauc_map_at_3_diff1
value: 77.79036430390953
- type: nauc_map_at_3_max
value: 33.23673927645347
- type: nauc_map_at_3_std
value: -60.10156884287652
- type: nauc_map_at_5_diff1
value: 77.33636903512307
- type: nauc_map_at_5_max
value: 35.003919992106006
- type: nauc_map_at_5_std
value: -59.97787405958172
- type: nauc_mrr_at_1000_diff1
value: 77.73000572331905
- type: nauc_mrr_at_1000_max
value: 38.561364157585324
- type: nauc_mrr_at_1000_std
value: -53.44976098044828
- type: nauc_mrr_at_100_diff1
value: 77.72981689727108
- type: nauc_mrr_at_100_max
value: 38.561425387623785
- type: nauc_mrr_at_100_std
value: -53.45033750871979
- type: nauc_mrr_at_10_diff1
value: 77.71709626439586
- type: nauc_mrr_at_10_max
value: 38.624900686387214
- type: nauc_mrr_at_10_std
value: -53.58765986161691
- type: nauc_mrr_at_1_diff1
value: 78.37565253706408
- type: nauc_mrr_at_1_max
value: 38.23888076842768
- type: nauc_mrr_at_1_std
value: -50.20603764579538
- type: nauc_mrr_at_20_diff1
value: 77.7306939391157
- type: nauc_mrr_at_20_max
value: 38.59165749191751
- type: nauc_mrr_at_20_std
value: -53.48812024214872
- type: nauc_mrr_at_3_diff1
value: 77.54353349806524
- type: nauc_mrr_at_3_max
value: 38.713759549229785
- type: nauc_mrr_at_3_std
value: -53.94582165002703
- type: nauc_mrr_at_5_diff1
value: 77.70283049254654
- type: nauc_mrr_at_5_max
value: 38.716317005111215
- type: nauc_mrr_at_5_std
value: -53.92085356926888
- type: nauc_ndcg_at_1000_diff1
value: 76.89855290894926
- type: nauc_ndcg_at_1000_max
value: 37.772216233524325
- type: nauc_ndcg_at_1000_std
value: -54.86144177114646
- type: nauc_ndcg_at_100_diff1
value: 76.90257905740786
- type: nauc_ndcg_at_100_max
value: 37.739876618823274
- type: nauc_ndcg_at_100_std
value: -55.18253534518033
- type: nauc_ndcg_at_10_diff1
value: 76.82906119719216
- type: nauc_ndcg_at_10_max
value: 37.09739956129085
- type: nauc_ndcg_at_10_std
value: -58.49646829288816
- type: nauc_ndcg_at_1_diff1
value: 78.37565253706408
- type: nauc_ndcg_at_1_max
value: 38.335351847985045
- type: nauc_ndcg_at_1_std
value: -50.212302001610745
- type: nauc_ndcg_at_20_diff1
value: 76.86843611975287
- type: nauc_ndcg_at_20_max
value: 37.38859864360577
- type: nauc_ndcg_at_20_std
value: -57.243383699901386
- type: nauc_ndcg_at_3_diff1
value: 76.43700144403104
- type: nauc_ndcg_at_3_max
value: 35.849266604568456
- type: nauc_ndcg_at_3_std
value: -58.26941196366757
- type: nauc_ndcg_at_5_diff1
value: 76.65368894551763
- type: nauc_ndcg_at_5_max
value: 36.67820873138469
- type: nauc_ndcg_at_5_std
value: -59.167875261562884
- type: nauc_precision_at_1000_diff1
value: -44.61035236776975
- type: nauc_precision_at_1000_max
value: -6.9906519553038535
- type: nauc_precision_at_1000_std
value: 45.26673634956755
- type: nauc_precision_at_100_diff1
value: -44.471568524106466
- type: nauc_precision_at_100_max
value: -6.513827405878257
- type: nauc_precision_at_100_std
value: 43.61461800235919
- type: nauc_precision_at_10_diff1
value: -40.63269213674181
- type: nauc_precision_at_10_max
value: -2.176686756124717
- type: nauc_precision_at_10_std
value: 29.834023361852225
- type: nauc_precision_at_1_diff1
value: 78.37565253706408
- type: nauc_precision_at_1_max
value: 38.335351847985045
- type: nauc_precision_at_1_std
value: -50.212302001610745
- type: nauc_precision_at_20_diff1
value: -43.166138321174
- type: nauc_precision_at_20_max
value: -4.551647757465525
- type: nauc_precision_at_20_std
value: 36.236925649882664
- type: nauc_precision_at_3_diff1
value: -22.241887562444298
- type: nauc_precision_at_3_max
value: 6.147594412705473
- type: nauc_precision_at_3_std
value: 6.206594648276548
- type: nauc_precision_at_5_diff1
value: -33.948204035499955
- type: nauc_precision_at_5_max
value: 1.551952866668139
- type: nauc_precision_at_5_std
value: 19.086692514199573
- type: nauc_recall_at_1000_diff1
value: 56.00550359595701
- type: nauc_recall_at_1000_max
value: 0.25076313433895114
- type: nauc_recall_at_1000_std
value: -19.767447908090993
- type: nauc_recall_at_100_diff1
value: 71.09157100014333
- type: nauc_recall_at_100_max
value: 36.803937541332566
- type: nauc_recall_at_100_std
value: -68.4065523296009
- type: nauc_recall_at_10_diff1
value: 72.74150240606814
- type: nauc_recall_at_10_max
value: 34.20323841659202
- type: nauc_recall_at_10_std
value: -81.23057156799683
- type: nauc_recall_at_1_diff1
value: 80.37641963929528
- type: nauc_recall_at_1_max
value: 27.046973659136057
- type: nauc_recall_at_1_std
value: -49.41187376826384
- type: nauc_recall_at_20_diff1
value: 72.23679243300582
- type: nauc_recall_at_20_max
value: 35.472624896485584
- type: nauc_recall_at_20_std
value: -83.96453691324263
- type: nauc_recall_at_3_diff1
value: 74.4436126143353
- type: nauc_recall_at_3_max
value: 30.220293116530584
- type: nauc_recall_at_3_std
value: -68.23230306181532
- type: nauc_recall_at_5_diff1
value: 72.89682914794618
- type: nauc_recall_at_5_max
value: 32.220311115253786
- type: nauc_recall_at_5_std
value: -74.53623789048245
- type: ndcg_at_1
value: 83.5
- type: ndcg_at_10
value: 89.838
- type: ndcg_at_100
value: 90.879
- type: ndcg_at_1000
value: 90.955
- type: ndcg_at_20
value: 90.422
- type: ndcg_at_3
value: 87.21799999999999
- type: ndcg_at_5
value: 88.727
- type: precision_at_1
value: 83.5
- type: precision_at_10
value: 13.571
- type: precision_at_100
value: 1.5350000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.175
- type: precision_at_3
value: 38.12
- type: precision_at_5
value: 25.041999999999998
- type: recall_at_1
value: 72.44500000000001
- type: recall_at_10
value: 96.298
- type: recall_at_100
value: 99.696
- type: recall_at_1000
value: 99.98599999999999
- type: recall_at_20
value: 98.15700000000001
- type: recall_at_3
value: 88.633
- type: recall_at_5
value: 92.985
- task:
type: Clustering
dataset:
name: MTEB RedditClustering (default)
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: main_score
value: 59.36225093784713
- type: v_measure
value: 59.36225093784713
- type: v_measure_std
value: 3.9911509588570393
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P (default)
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: main_score
value: 64.46282036246124
- type: v_measure
value: 64.46282036246124
- type: v_measure_std
value: 12.49196304240264
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS (default)
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: main_score
value: 21.781
- type: map_at_1
value: 5.103
- type: map_at_10
value: 13.152
- type: map_at_100
value: 15.421000000000001
- type: map_at_1000
value: 15.738
- type: map_at_20
value: 14.313
- type: map_at_3
value: 9.277000000000001
- type: map_at_5
value: 11.079
- type: mrr_at_1
value: 25.2
- type: mrr_at_10
value: 36.30464285714286
- type: mrr_at_100
value: 37.37083205414486
- type: mrr_at_1000
value: 37.41889994963302
- type: mrr_at_20
value: 36.99006600941199
- type: mrr_at_3
value: 33.11666666666667
- type: mrr_at_5
value: 34.971666666666664
- type: nauc_map_at_1000_diff1
value: 13.3829110188465
- type: nauc_map_at_1000_max
value: 26.200548089249203
- type: nauc_map_at_1000_std
value: 15.782390299656376
- type: nauc_map_at_100_diff1
value: 13.434823562595197
- type: nauc_map_at_100_max
value: 26.19757227269967
- type: nauc_map_at_100_std
value: 15.666149403001597
- type: nauc_map_at_10_diff1
value: 13.136752265014085
- type: nauc_map_at_10_max
value: 24.37704176159032
- type: nauc_map_at_10_std
value: 11.875468320642725
- type: nauc_map_at_1_diff1
value: 23.91080785158353
- type: nauc_map_at_1_max
value: 21.714915496600813
- type: nauc_map_at_1_std
value: 4.523659534794796
- type: nauc_map_at_20_diff1
value: 13.08994175195148
- type: nauc_map_at_20_max
value: 25.564250916023035
- type: nauc_map_at_20_std
value: 13.758854620282229
- type: nauc_map_at_3_diff1
value: 15.629634284012711
- type: nauc_map_at_3_max
value: 20.94416328947656
- type: nauc_map_at_3_std
value: 5.443733090008665
- type: nauc_map_at_5_diff1
value: 13.717844004379067
- type: nauc_map_at_5_max
value: 21.93083811259854
- type: nauc_map_at_5_std
value: 7.496869394816883
- type: nauc_mrr_at_1000_diff1
value: 19.466105991639516
- type: nauc_mrr_at_1000_max
value: 23.857199036893714
- type: nauc_mrr_at_1000_std
value: 10.400833057932964
- type: nauc_mrr_at_100_diff1
value: 19.45377482442327
- type: nauc_mrr_at_100_max
value: 23.86931198998342
- type: nauc_mrr_at_100_std
value: 10.43160252915245
- type: nauc_mrr_at_10_diff1
value: 19.595100505906498
- type: nauc_mrr_at_10_max
value: 23.828564831729913
- type: nauc_mrr_at_10_std
value: 10.158332218550582
- type: nauc_mrr_at_1_diff1
value: 23.639623316387265
- type: nauc_mrr_at_1_max
value: 21.91276584516334
- type: nauc_mrr_at_1_std
value: 4.555063005377011
- type: nauc_mrr_at_20_diff1
value: 19.42312083502562
- type: nauc_mrr_at_20_max
value: 23.998031015425354
- type: nauc_mrr_at_20_std
value: 10.507801798326819
- type: nauc_mrr_at_3_diff1
value: 20.50499706447941
- type: nauc_mrr_at_3_max
value: 22.89975536944602
- type: nauc_mrr_at_3_std
value: 8.976243818880809
- type: nauc_mrr_at_5_diff1
value: 19.59735376368769
- type: nauc_mrr_at_5_max
value: 23.079995863526243
- type: nauc_mrr_at_5_std
value: 9.558077494050336
- type: nauc_ndcg_at_1000_diff1
value: 13.411221925319488
- type: nauc_ndcg_at_1000_max
value: 28.874659943874605
- type: nauc_ndcg_at_1000_std
value: 22.92179424488089
- type: nauc_ndcg_at_100_diff1
value: 14.177059117246053
- type: nauc_ndcg_at_100_max
value: 29.49863202457167
- type: nauc_ndcg_at_100_std
value: 23.415432542915244
- type: nauc_ndcg_at_10_diff1
value: 14.034714269886518
- type: nauc_ndcg_at_10_max
value: 26.529324449228014
- type: nauc_ndcg_at_10_std
value: 15.0835036529515
- type: nauc_ndcg_at_1_diff1
value: 23.639623316387265
- type: nauc_ndcg_at_1_max
value: 21.91276584516334
- type: nauc_ndcg_at_1_std
value: 4.555063005377011
- type: nauc_ndcg_at_20_diff1
value: 13.639153726908837
- type: nauc_ndcg_at_20_max
value: 28.34934989257701
- type: nauc_ndcg_at_20_std
value: 18.346102705103505
- type: nauc_ndcg_at_3_diff1
value: 16.310949228363334
- type: nauc_ndcg_at_3_max
value: 21.96244399696209
- type: nauc_ndcg_at_3_std
value: 7.79248819842006
- type: nauc_ndcg_at_5_diff1
value: 14.630417187709366
- type: nauc_ndcg_at_5_max
value: 23.28452419937793
- type: nauc_ndcg_at_5_std
value: 10.132485346479228
- type: nauc_precision_at_1000_diff1
value: 0.4617378903286949
- type: nauc_precision_at_1000_max
value: 23.084163863883607
- type: nauc_precision_at_1000_std
value: 34.74028918125758
- type: nauc_precision_at_100_diff1
value: 7.744924657665058
- type: nauc_precision_at_100_max
value: 28.822902541968237
- type: nauc_precision_at_100_std
value: 35.872958881610344
- type: nauc_precision_at_10_diff1
value: 9.242022361674694
- type: nauc_precision_at_10_max
value: 27.707443555826906
- type: nauc_precision_at_10_std
value: 20.465290637452664
- type: nauc_precision_at_1_diff1
value: 23.639623316387265
- type: nauc_precision_at_1_max
value: 21.91276584516334
- type: nauc_precision_at_1_std
value: 4.555063005377011
- type: nauc_precision_at_20_diff1
value: 7.901785657316664
- type: nauc_precision_at_20_max
value: 29.678603802205057
- type: nauc_precision_at_20_std
value: 25.65946048724345
- type: nauc_precision_at_3_diff1
value: 13.650585769886394
- type: nauc_precision_at_3_max
value: 22.03045956299473
- type: nauc_precision_at_3_std
value: 9.155456520493106
- type: nauc_precision_at_5_diff1
value: 10.200134466214287
- type: nauc_precision_at_5_max
value: 23.308672947117167
- type: nauc_precision_at_5_std
value: 12.695862040385645
- type: nauc_recall_at_1000_diff1
value: 1.7286393025447204
- type: nauc_recall_at_1000_max
value: 23.322719223507704
- type: nauc_recall_at_1000_std
value: 36.358257876511956
- type: nauc_recall_at_100_diff1
value: 8.230846619688952
- type: nauc_recall_at_100_max
value: 28.880569830494963
- type: nauc_recall_at_100_std
value: 36.29115706966346
- type: nauc_recall_at_10_diff1
value: 9.362248846760513
- type: nauc_recall_at_10_max
value: 27.475538879580885
- type: nauc_recall_at_10_std
value: 20.314461649538373
- type: nauc_recall_at_1_diff1
value: 23.91080785158353
- type: nauc_recall_at_1_max
value: 21.714915496600813
- type: nauc_recall_at_1_std
value: 4.523659534794796
- type: nauc_recall_at_20_diff1
value: 8.140101636033602
- type: nauc_recall_at_20_max
value: 29.59131501693498
- type: nauc_recall_at_20_std
value: 25.876120433055316
- type: nauc_recall_at_3_diff1
value: 13.725759049941843
- type: nauc_recall_at_3_max
value: 21.75055584058006
- type: nauc_recall_at_3_std
value: 8.965766944507815
- type: nauc_recall_at_5_diff1
value: 10.366069494614596
- type: nauc_recall_at_5_max
value: 23.031784865881054
- type: nauc_recall_at_5_std
value: 12.411188897743521
- type: ndcg_at_1
value: 25.2
- type: ndcg_at_10
value: 21.781
- type: ndcg_at_100
value: 30.273
- type: ndcg_at_1000
value: 35.768
- type: ndcg_at_20
value: 24.967
- type: ndcg_at_3
value: 20.580000000000002
- type: ndcg_at_5
value: 17.926000000000002
- type: precision_at_1
value: 25.2
- type: precision_at_10
value: 11.4
- type: precision_at_100
value: 2.359
- type: precision_at_1000
value: 0.368
- type: precision_at_20
value: 7.545
- type: precision_at_3
value: 19.3
- type: precision_at_5
value: 15.78
- type: recall_at_1
value: 5.103
- type: recall_at_10
value: 23.083000000000002
- type: recall_at_100
value: 47.882999999999996
- type: recall_at_1000
value: 74.783
- type: recall_at_20
value: 30.592000000000002
- type: recall_at_3
value: 11.753
- type: recall_at_5
value: 15.983
- task:
type: STS
dataset:
name: MTEB SICK-R (default)
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cosine_pearson
value: 83.9841377195369
- type: cosine_spearman
value: 77.44919890597407
- type: euclidean_pearson
value: 81.21238548422511
- type: euclidean_spearman
value: 76.94405730272983
- type: main_score
value: 77.44919890597407
- type: manhattan_pearson
value: 81.16824677968528
- type: manhattan_spearman
value: 76.94296468591867
- type: pearson
value: 83.9841377195369
- type: spearman
value: 77.44919890597407
- task:
type: STS
dataset:
name: MTEB STS12 (default)
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cosine_pearson
value: 81.36071984442052
- type: cosine_spearman
value: 74.2212823495219
- type: euclidean_pearson
value: 78.31139429452078
- type: euclidean_spearman
value: 74.02790834412275
- type: main_score
value: 74.2212823495219
- type: manhattan_pearson
value: 78.26141328104697
- type: manhattan_spearman
value: 74.02545007676329
- type: pearson
value: 81.36071984442052
- type: spearman
value: 74.2212823495219
- task:
type: STS
dataset:
name: MTEB STS13 (default)
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cosine_pearson
value: 85.49925337918731
- type: cosine_spearman
value: 86.12368715292688
- type: euclidean_pearson
value: 85.71147581542367
- type: euclidean_spearman
value: 86.64112317821541
- type: main_score
value: 86.12368715292688
- type: manhattan_pearson
value: 85.58242941611371
- type: manhattan_spearman
value: 86.51041533466731
- type: pearson
value: 85.49925337918731
- type: spearman
value: 86.12368715292688
- task:
type: STS
dataset:
name: MTEB STS14 (default)
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cosine_pearson
value: 82.24735192639226
- type: cosine_spearman
value: 78.88155361224834
- type: euclidean_pearson
value: 80.52048132030517
- type: euclidean_spearman
value: 78.1335955670817
- type: main_score
value: 78.88155361224834
- type: manhattan_pearson
value: 80.48178866605353
- type: manhattan_spearman
value: 78.08994918255844
- type: pearson
value: 82.24735192639226
- type: spearman
value: 78.88155361224834
- task:
type: STS
dataset:
name: MTEB STS15 (default)
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cosine_pearson
value: 86.27381322229758
- type: cosine_spearman
value: 87.5038962579188
- type: euclidean_pearson
value: 86.7575259976948
- type: euclidean_spearman
value: 87.3358778981031
- type: main_score
value: 87.5038962579188
- type: manhattan_pearson
value: 86.72177109814491
- type: manhattan_spearman
value: 87.30593609243358
- type: pearson
value: 86.27381322229758
- type: spearman
value: 87.5038962579188
- task:
type: STS
dataset:
name: MTEB STS16 (default)
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cosine_pearson
value: 82.90364706517789
- type: cosine_spearman
value: 84.25854334490232
- type: euclidean_pearson
value: 83.30065780824273
- type: euclidean_spearman
value: 84.17467271748362
- type: main_score
value: 84.25854334490232
- type: manhattan_pearson
value: 83.21239264085494
- type: manhattan_spearman
value: 84.05456832118482
- type: pearson
value: 82.90364706517789
- type: spearman
value: 84.25854334490232
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 88.88258729094343
- type: cosine_spearman
value: 89.68436656381257
- type: euclidean_pearson
value: 88.23417725579127
- type: euclidean_spearman
value: 87.96688277361433
- type: main_score
value: 89.68436656381257
- type: manhattan_pearson
value: 88.07673471897155
- type: manhattan_spearman
value: 87.7976329721765
- type: pearson
value: 88.88258729094343
- type: spearman
value: 89.68436656381257
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 65.24627744968292
- type: cosine_spearman
value: 65.96283849168346
- type: euclidean_pearson
value: 66.2111925054528
- type: euclidean_spearman
value: 65.83563143944401
- type: main_score
value: 65.96283849168346
- type: manhattan_pearson
value: 66.25664281582083
- type: manhattan_spearman
value: 65.8830797513158
- type: pearson
value: 65.24627744968292
- type: spearman
value: 65.96283849168346
- task:
type: STS
dataset:
name: MTEB STSBenchmark (default)
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cosine_pearson
value: 85.57515090752183
- type: cosine_spearman
value: 85.54441587714372
- type: euclidean_pearson
value: 85.53938106211463
- type: euclidean_spearman
value: 85.28473579067878
- type: main_score
value: 85.54441587714372
- type: manhattan_pearson
value: 85.51025100057596
- type: manhattan_spearman
value: 85.260887707662
- type: pearson
value: 85.57515090752183
- type: spearman
value: 85.54441587714372
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR (default)
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: main_score
value: 82.9058801876062
- type: map
value: 82.9058801876062
- type: mrr
value: 95.256220721907
- type: nAUC_map_diff1
value: 0.13078953297011875
- type: nAUC_map_max
value: 59.173980738758026
- type: nAUC_map_std
value: 73.35735418975649
- type: nAUC_mrr_diff1
value: 46.534353907114514
- type: nAUC_mrr_max
value: 89.56255914950661
- type: nAUC_mrr_std
value: 85.6716185155955
- task:
type: Retrieval
dataset:
name: MTEB SciFact (default)
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: main_score
value: 71.844
- type: map_at_1
value: 57.278
- type: map_at_10
value: 67.109
- type: map_at_100
value: 67.66499999999999
- type: map_at_1000
value: 67.685
- type: map_at_20
value: 67.482
- type: map_at_3
value: 64.16199999999999
- type: map_at_5
value: 65.82900000000001
- type: mrr_at_1
value: 60.0
- type: mrr_at_10
value: 68.19960317460317
- type: mrr_at_100
value: 68.62748949394921
- type: mrr_at_1000
value: 68.64515905414915
- type: mrr_at_20
value: 68.472601010101
- type: mrr_at_3
value: 66.0
- type: mrr_at_5
value: 67.21666666666667
- type: nauc_map_at_1000_diff1
value: 70.04313292027558
- type: nauc_map_at_1000_max
value: 57.24529193476731
- type: nauc_map_at_1000_std
value: -4.8888921470785585
- type: nauc_map_at_100_diff1
value: 70.04624674117014
- type: nauc_map_at_100_max
value: 57.25302539508853
- type: nauc_map_at_100_std
value: -4.907703072069842
- type: nauc_map_at_10_diff1
value: 70.06943109940849
- type: nauc_map_at_10_max
value: 57.39452715929109
- type: nauc_map_at_10_std
value: -4.743417671263566
- type: nauc_map_at_1_diff1
value: 76.61111479875207
- type: nauc_map_at_1_max
value: 52.822124992902374
- type: nauc_map_at_1_std
value: -7.6071857283495445
- type: nauc_map_at_20_diff1
value: 69.95251393140202
- type: nauc_map_at_20_max
value: 57.328356768833146
- type: nauc_map_at_20_std
value: -4.871357691032887
- type: nauc_map_at_3_diff1
value: 69.71499509001714
- type: nauc_map_at_3_max
value: 53.645107897260026
- type: nauc_map_at_3_std
value: -7.908850295935557
- type: nauc_map_at_5_diff1
value: 69.7531280646943
- type: nauc_map_at_5_max
value: 55.71038914997073
- type: nauc_map_at_5_std
value: -6.7813041970848476
- type: nauc_mrr_at_1000_diff1
value: 69.61840192382927
- type: nauc_mrr_at_1000_max
value: 58.419734360225696
- type: nauc_mrr_at_1000_std
value: -1.8503761885586425
- type: nauc_mrr_at_100_diff1
value: 69.6153571701724
- type: nauc_mrr_at_100_max
value: 58.422378816414565
- type: nauc_mrr_at_100_std
value: -1.8731915889302972
- type: nauc_mrr_at_10_diff1
value: 69.5874772943516
- type: nauc_mrr_at_10_max
value: 58.78121978366665
- type: nauc_mrr_at_10_std
value: -1.2843146465927913
- type: nauc_mrr_at_1_diff1
value: 74.35688136934793
- type: nauc_mrr_at_1_max
value: 57.487384980706416
- type: nauc_mrr_at_1_std
value: -1.3005837538340144
- type: nauc_mrr_at_20_diff1
value: 69.53988639045606
- type: nauc_mrr_at_20_max
value: 58.49631860342686
- type: nauc_mrr_at_20_std
value: -1.7220227513588833
- type: nauc_mrr_at_3_diff1
value: 68.94320178615871
- type: nauc_mrr_at_3_max
value: 56.60856449749424
- type: nauc_mrr_at_3_std
value: -3.3432894595086866
- type: nauc_mrr_at_5_diff1
value: 68.94240340867633
- type: nauc_mrr_at_5_max
value: 58.27068018852665
- type: nauc_mrr_at_5_std
value: -2.320192066949136
- type: nauc_ndcg_at_1000_diff1
value: 69.15093538086137
- type: nauc_ndcg_at_1000_max
value: 58.6801221127507
- type: nauc_ndcg_at_1000_std
value: -3.002038837722594
- type: nauc_ndcg_at_100_diff1
value: 69.11507044508373
- type: nauc_ndcg_at_100_max
value: 58.843490113137605
- type: nauc_ndcg_at_100_std
value: -3.2810475322338566
- type: nauc_ndcg_at_10_diff1
value: 68.71920945656667
- type: nauc_ndcg_at_10_max
value: 60.13600198034469
- type: nauc_ndcg_at_10_std
value: -1.6190106644777749
- type: nauc_ndcg_at_1_diff1
value: 74.35688136934793
- type: nauc_ndcg_at_1_max
value: 57.487384980706416
- type: nauc_ndcg_at_1_std
value: -1.3005837538340144
- type: nauc_ndcg_at_20_diff1
value: 68.33714726670162
- type: nauc_ndcg_at_20_max
value: 59.45907982196103
- type: nauc_ndcg_at_20_std
value: -2.5953063304797754
- type: nauc_ndcg_at_3_diff1
value: 67.33605891922716
- type: nauc_ndcg_at_3_max
value: 55.01142849375101
- type: nauc_ndcg_at_3_std
value: -6.5632981093508205
- type: nauc_ndcg_at_5_diff1
value: 67.59450950578172
- type: nauc_ndcg_at_5_max
value: 57.50106057747294
- type: nauc_ndcg_at_5_std
value: -5.415038422866616
- type: nauc_precision_at_1000_diff1
value: -33.21156082089814
- type: nauc_precision_at_1000_max
value: 19.132732038554398
- type: nauc_precision_at_1000_std
value: 44.091281225705714
- type: nauc_precision_at_100_diff1
value: -20.015823755259245
- type: nauc_precision_at_100_max
value: 26.507243354636085
- type: nauc_precision_at_100_std
value: 37.87274756817076
- type: nauc_precision_at_10_diff1
value: 8.35057694800983
- type: nauc_precision_at_10_max
value: 49.60611953844157
- type: nauc_precision_at_10_std
value: 32.18410475820039
- type: nauc_precision_at_1_diff1
value: 74.35688136934793
- type: nauc_precision_at_1_max
value: 57.487384980706416
- type: nauc_precision_at_1_std
value: -1.3005837538340144
- type: nauc_precision_at_20_diff1
value: -3.0872665961524612
- type: nauc_precision_at_20_max
value: 40.5565038905005
- type: nauc_precision_at_20_std
value: 32.15291813716766
- type: nauc_precision_at_3_diff1
value: 34.627722605371545
- type: nauc_precision_at_3_max
value: 49.65219072739979
- type: nauc_precision_at_3_std
value: 7.7588985130719434
- type: nauc_precision_at_5_diff1
value: 22.06911561993657
- type: nauc_precision_at_5_max
value: 49.09578970278826
- type: nauc_precision_at_5_std
value: 16.038789872070705
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 64.77257569694551
- type: nauc_recall_at_100_max
value: 65.07269574496497
- type: nauc_recall_at_100_std
value: -10.979947534569218
- type: nauc_recall_at_10_diff1
value: 62.14297161941494
- type: nauc_recall_at_10_max
value: 70.41353364022896
- type: nauc_recall_at_10_std
value: 9.172932719542075
- type: nauc_recall_at_1_diff1
value: 76.61111479875207
- type: nauc_recall_at_1_max
value: 52.822124992902374
- type: nauc_recall_at_1_std
value: -7.6071857283495445
- type: nauc_recall_at_20_diff1
value: 57.631464811333224
- type: nauc_recall_at_20_max
value: 67.83558221740536
- type: nauc_recall_at_20_std
value: 3.110691973832695
- type: nauc_recall_at_3_diff1
value: 60.39078444139112
- type: nauc_recall_at_3_max
value: 51.122425596651574
- type: nauc_recall_at_3_std
value: -10.307895490015559
- type: nauc_recall_at_5_diff1
value: 59.703727953513145
- type: nauc_recall_at_5_max
value: 59.81893786534298
- type: nauc_recall_at_5_std
value: -6.231017907901268
- type: ndcg_at_1
value: 60.0
- type: ndcg_at_10
value: 71.844
- type: ndcg_at_100
value: 74.278
- type: ndcg_at_1000
value: 74.74199999999999
- type: ndcg_at_20
value: 72.99
- type: ndcg_at_3
value: 66.721
- type: ndcg_at_5
value: 69.137
- type: precision_at_1
value: 60.0
- type: precision_at_10
value: 9.6
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.067
- type: precision_at_3
value: 26.111
- type: precision_at_5
value: 17.267
- type: recall_at_1
value: 57.278
- type: recall_at_10
value: 85.344
- type: recall_at_100
value: 96.5
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 89.589
- type: recall_at_3
value: 71.45
- type: recall_at_5
value: 77.361
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions (default)
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cosine_accuracy
value: 99.8019801980198
- type: cosine_accuracy_threshold
value: 74.77510571479797
- type: cosine_ap
value: 95.30006120252773
- type: cosine_f1
value: 89.75265017667844
- type: cosine_f1_threshold
value: 72.93492555618286
- type: cosine_precision
value: 90.62181447502549
- type: cosine_recall
value: 88.9
- type: dot_accuracy
value: 99.74554455445545
- type: dot_accuracy_threshold
value: 794.2790985107422
- type: dot_ap
value: 93.33073289508414
- type: dot_f1
value: 87.11779448621553
- type: dot_f1_threshold
value: 793.5191631317139
- type: dot_precision
value: 87.33668341708542
- type: dot_recall
value: 86.9
- type: euclidean_accuracy
value: 99.7960396039604
- type: euclidean_accuracy_threshold
value: 238.72876167297363
- type: euclidean_ap
value: 95.04815354196363
- type: euclidean_f1
value: 89.53252032520325
- type: euclidean_f1_threshold
value: 241.42813682556152
- type: euclidean_precision
value: 91.01239669421489
- type: euclidean_recall
value: 88.1
- type: main_score
value: 95.30006120252773
- type: manhattan_accuracy
value: 99.7960396039604
- type: manhattan_accuracy_threshold
value: 5224.44953918457
- type: manhattan_ap
value: 95.02798265540767
- type: manhattan_f1
value: 89.4552723638181
- type: manhattan_f1_threshold
value: 5434.450531005859
- type: manhattan_precision
value: 89.41058941058941
- type: manhattan_recall
value: 89.5
- type: max_accuracy
value: 99.8019801980198
- type: max_ap
value: 95.30006120252773
- type: max_f1
value: 89.75265017667844
- type: max_precision
value: 91.01239669421489
- type: max_recall
value: 89.5
- type: similarity_accuracy
value: 99.8019801980198
- type: similarity_accuracy_threshold
value: 74.77510571479797
- type: similarity_ap
value: 95.30006120252773
- type: similarity_f1
value: 89.75265017667844
- type: similarity_f1_threshold
value: 72.93492555618286
- type: similarity_precision
value: 90.62181447502549
- type: similarity_recall
value: 88.9
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering (default)
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: main_score
value: 66.76593843797666
- type: v_measure
value: 66.76593843797666
- type: v_measure_std
value: 3.5421488096435416
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P (default)
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: main_score
value: 38.90007255920144
- type: v_measure
value: 38.90007255920144
- type: v_measure_std
value: 1.440894289494648
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions (default)
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: main_score
value: 52.71807785910519
- type: map
value: 52.71807785910519
- type: mrr
value: 53.51011427298192
- type: nAUC_map_diff1
value: 38.489341755206404
- type: nAUC_map_max
value: 12.810459097227756
- type: nAUC_map_std
value: 10.001723368468545
- type: nAUC_mrr_diff1
value: 38.1795784067288
- type: nAUC_mrr_max
value: 13.876071274342735
- type: nAUC_mrr_std
value: 10.809361649584433
- task:
type: Summarization
dataset:
name: MTEB SummEval (default)
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cosine_pearson
value: 31.51422308323083
- type: cosine_spearman
value: 31.22821719703179
- type: dot_pearson
value: 30.692806438778554
- type: dot_spearman
value: 30.440095026481913
- type: main_score
value: 31.22821719703179
- type: pearson
value: 31.51422308323083
- type: spearman
value: 31.22821719703179
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID (default)
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: main_score
value: 79.38199999999999
- type: map_at_1
value: 0.258
- type: map_at_10
value: 2.077
- type: map_at_100
value: 12.062000000000001
- type: map_at_1000
value: 28.717
- type: map_at_20
value: 3.6630000000000003
- type: map_at_3
value: 0.7040000000000001
- type: map_at_5
value: 1.114
- type: mrr_at_1
value: 96.0
- type: mrr_at_10
value: 97.66666666666667
- type: mrr_at_100
value: 97.66666666666667
- type: mrr_at_1000
value: 97.66666666666667
- type: mrr_at_20
value: 97.66666666666667
- type: mrr_at_3
value: 97.66666666666667
- type: mrr_at_5
value: 97.66666666666667
- type: nauc_map_at_1000_diff1
value: -19.606457542469276
- type: nauc_map_at_1000_max
value: 62.23126542837836
- type: nauc_map_at_1000_std
value: 78.11491433681955
- type: nauc_map_at_100_diff1
value: 1.056950862100428
- type: nauc_map_at_100_max
value: 43.14707718269215
- type: nauc_map_at_100_std
value: 54.99119932336741
- type: nauc_map_at_10_diff1
value: 31.26313513848752
- type: nauc_map_at_10_max
value: 18.729050164831303
- type: nauc_map_at_10_std
value: 12.501346100150942
- type: nauc_map_at_1_diff1
value: 50.67428371303766
- type: nauc_map_at_1_max
value: 8.26350705716926
- type: nauc_map_at_1_std
value: -2.802747360156509
- type: nauc_map_at_20_diff1
value: 23.85177292094862
- type: nauc_map_at_20_max
value: 24.907498374862385
- type: nauc_map_at_20_std
value: 23.15361092830954
- type: nauc_map_at_3_diff1
value: 44.34113488392741
- type: nauc_map_at_3_max
value: 16.13816628219856
- type: nauc_map_at_3_std
value: 1.64493293742063
- type: nauc_map_at_5_diff1
value: 43.35667417997146
- type: nauc_map_at_5_max
value: 16.651525778549175
- type: nauc_map_at_5_std
value: 5.344297729807275
- type: nauc_mrr_at_1000_diff1
value: 65.01934106976137
- type: nauc_mrr_at_1000_max
value: 74.5231425903695
- type: nauc_mrr_at_1000_std
value: 84.12698412698381
- type: nauc_mrr_at_100_diff1
value: 65.01934106976137
- type: nauc_mrr_at_100_max
value: 74.5231425903695
- type: nauc_mrr_at_100_std
value: 84.12698412698381
- type: nauc_mrr_at_10_diff1
value: 65.01934106976137
- type: nauc_mrr_at_10_max
value: 74.5231425903695
- type: nauc_mrr_at_10_std
value: 84.12698412698381
- type: nauc_mrr_at_1_diff1
value: 63.81886087768457
- type: nauc_mrr_at_1_max
value: 77.70774976657333
- type: nauc_mrr_at_1_std
value: 86.11111111111124
- type: nauc_mrr_at_20_diff1
value: 65.01934106976137
- type: nauc_mrr_at_20_max
value: 74.5231425903695
- type: nauc_mrr_at_20_std
value: 84.12698412698381
- type: nauc_mrr_at_3_diff1
value: 65.01934106976137
- type: nauc_mrr_at_3_max
value: 74.5231425903695
- type: nauc_mrr_at_3_std
value: 84.12698412698381
- type: nauc_mrr_at_5_diff1
value: 65.01934106976137
- type: nauc_mrr_at_5_max
value: 74.5231425903695
- type: nauc_mrr_at_5_std
value: 84.12698412698381
- type: nauc_ndcg_at_1000_diff1
value: -12.207934630430895
- type: nauc_ndcg_at_1000_max
value: 63.27131989733247
- type: nauc_ndcg_at_1000_std
value: 77.77862783776057
- type: nauc_ndcg_at_100_diff1
value: -31.139043418906777
- type: nauc_ndcg_at_100_max
value: 56.29288690229761
- type: nauc_ndcg_at_100_std
value: 80.54207709212822
- type: nauc_ndcg_at_10_diff1
value: -21.623075757241335
- type: nauc_ndcg_at_10_max
value: 42.00930185115019
- type: nauc_ndcg_at_10_std
value: 63.90085820733794
- type: nauc_ndcg_at_1_diff1
value: 27.03957293721711
- type: nauc_ndcg_at_1_max
value: 18.687865072917816
- type: nauc_ndcg_at_1_std
value: 40.65606746354093
- type: nauc_ndcg_at_20_diff1
value: -27.059567337111528
- type: nauc_ndcg_at_20_max
value: 44.873490488692845
- type: nauc_ndcg_at_20_std
value: 68.27056244238835
- type: nauc_ndcg_at_3_diff1
value: -2.2768439107759253
- type: nauc_ndcg_at_3_max
value: 33.16972612805963
- type: nauc_ndcg_at_3_std
value: 49.35785810423734
- type: nauc_ndcg_at_5_diff1
value: -8.380892599544165
- type: nauc_ndcg_at_5_max
value: 39.7045491756542
- type: nauc_ndcg_at_5_std
value: 56.662696632820044
- type: nauc_precision_at_1000_diff1
value: -39.853246552685256
- type: nauc_precision_at_1000_max
value: 45.82687391914263
- type: nauc_precision_at_1000_std
value: 51.6573155072073
- type: nauc_precision_at_100_diff1
value: -35.334152199143055
- type: nauc_precision_at_100_max
value: 57.74163988146608
- type: nauc_precision_at_100_std
value: 78.83424294782806
- type: nauc_precision_at_10_diff1
value: -29.572269138136193
- type: nauc_precision_at_10_max
value: 45.16249504588279
- type: nauc_precision_at_10_std
value: 63.92716685466912
- type: nauc_precision_at_1_diff1
value: 63.81886087768457
- type: nauc_precision_at_1_max
value: 77.70774976657333
- type: nauc_precision_at_1_std
value: 86.11111111111124
- type: nauc_precision_at_20_diff1
value: -31.155129521710613
- type: nauc_precision_at_20_max
value: 46.072522169609606
- type: nauc_precision_at_20_std
value: 64.29857883516294
- type: nauc_precision_at_3_diff1
value: -5.644268209909603
- type: nauc_precision_at_3_max
value: 54.62437037830888
- type: nauc_precision_at_3_std
value: 52.27021040974535
- type: nauc_precision_at_5_diff1
value: -15.560278135078049
- type: nauc_precision_at_5_max
value: 50.21344816658272
- type: nauc_precision_at_5_std
value: 58.94711332326674
- type: nauc_recall_at_1000_diff1
value: -8.016557237167058
- type: nauc_recall_at_1000_max
value: 58.857938362714165
- type: nauc_recall_at_1000_std
value: 66.83850522737738
- type: nauc_recall_at_100_diff1
value: 15.447588986377317
- type: nauc_recall_at_100_max
value: 37.515788055189084
- type: nauc_recall_at_100_std
value: 42.326000614078026
- type: nauc_recall_at_10_diff1
value: 34.99067421432679
- type: nauc_recall_at_10_max
value: 13.792789030946933
- type: nauc_recall_at_10_std
value: 7.066206327262477
- type: nauc_recall_at_1_diff1
value: 50.67428371303766
- type: nauc_recall_at_1_max
value: 8.26350705716926
- type: nauc_recall_at_1_std
value: -2.802747360156509
- type: nauc_recall_at_20_diff1
value: 31.277397618992136
- type: nauc_recall_at_20_max
value: 20.296127261717054
- type: nauc_recall_at_20_std
value: 16.117931287068437
- type: nauc_recall_at_3_diff1
value: 46.303571802817025
- type: nauc_recall_at_3_max
value: 14.03073426897129
- type: nauc_recall_at_3_std
value: -0.39592906337357797
- type: nauc_recall_at_5_diff1
value: 45.51206018811467
- type: nauc_recall_at_5_max
value: 12.263182926616867
- type: nauc_recall_at_5_std
value: 1.5451403387758214
- type: ndcg_at_1
value: 87.0
- type: ndcg_at_10
value: 79.38199999999999
- type: ndcg_at_100
value: 59.941
- type: ndcg_at_1000
value: 53.581999999999994
- type: ndcg_at_20
value: 74.244
- type: ndcg_at_3
value: 84.05
- type: ndcg_at_5
value: 82.328
- type: precision_at_1
value: 96.0
- type: precision_at_10
value: 85.2
- type: precision_at_100
value: 61.519999999999996
- type: precision_at_1000
value: 23.328
- type: precision_at_20
value: 78.4
- type: precision_at_3
value: 90.667
- type: precision_at_5
value: 88.4
- type: recall_at_1
value: 0.258
- type: recall_at_10
value: 2.225
- type: recall_at_100
value: 15.190999999999999
- type: recall_at_1000
value: 50.656
- type: recall_at_20
value: 4.063
- type: recall_at_3
value: 0.722
- type: recall_at_5
value: 1.168
- task:
type: Retrieval
dataset:
name: MTEB Touche2020 (default)
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: main_score
value: 24.254
- type: map_at_1
value: 2.355
- type: map_at_10
value: 9.554
- type: map_at_100
value: 14.856
- type: map_at_1000
value: 16.320999999999998
- type: map_at_20
value: 11.594
- type: map_at_3
value: 5.624
- type: map_at_5
value: 6.948
- type: mrr_at_1
value: 28.57142857142857
- type: mrr_at_10
value: 45.30855199222546
- type: mrr_at_100
value: 46.29196367191565
- type: mrr_at_1000
value: 46.31499833524485
- type: mrr_at_20
value: 46.113797167218536
- type: mrr_at_3
value: 42.17687074829932
- type: mrr_at_5
value: 43.70748299319728
- type: nauc_map_at_1000_diff1
value: 16.20923402096991
- type: nauc_map_at_1000_max
value: -1.0790035381754648
- type: nauc_map_at_1000_std
value: 7.195462252108266
- type: nauc_map_at_100_diff1
value: 18.389136986949936
- type: nauc_map_at_100_max
value: -2.05569038009456
- type: nauc_map_at_100_std
value: 2.571693024788773
- type: nauc_map_at_10_diff1
value: 21.066136452964642
- type: nauc_map_at_10_max
value: 1.5731034935019352
- type: nauc_map_at_10_std
value: -10.470562156435545
- type: nauc_map_at_1_diff1
value: 18.809274247757674
- type: nauc_map_at_1_max
value: -8.68104031396317
- type: nauc_map_at_1_std
value: -30.619138463973307
- type: nauc_map_at_20_diff1
value: 23.36148432932364
- type: nauc_map_at_20_max
value: -0.38560029617230923
- type: nauc_map_at_20_std
value: -6.8825311118744485
- type: nauc_map_at_3_diff1
value: 18.9370153117886
- type: nauc_map_at_3_max
value: 2.2032967783435375
- type: nauc_map_at_3_std
value: -12.532694022066659
- type: nauc_map_at_5_diff1
value: 21.434904521858602
- type: nauc_map_at_5_max
value: 6.094611630406942
- type: nauc_map_at_5_std
value: -12.492795788667474
- type: nauc_mrr_at_1000_diff1
value: 11.961046636239269
- type: nauc_mrr_at_1000_max
value: -15.748297693665677
- type: nauc_mrr_at_1000_std
value: -12.067130971523385
- type: nauc_mrr_at_100_diff1
value: 11.95534277650038
- type: nauc_mrr_at_100_max
value: -15.684486171307041
- type: nauc_mrr_at_100_std
value: -11.98247014226321
- type: nauc_mrr_at_10_diff1
value: 12.191520381511925
- type: nauc_mrr_at_10_max
value: -16.510285123987302
- type: nauc_mrr_at_10_std
value: -11.93784570526233
- type: nauc_mrr_at_1_diff1
value: 18.162553375605516
- type: nauc_mrr_at_1_max
value: -18.920009881475387
- type: nauc_mrr_at_1_std
value: -31.201005281857086
- type: nauc_mrr_at_20_diff1
value: 11.85035482221006
- type: nauc_mrr_at_20_max
value: -16.18704935368085
- type: nauc_mrr_at_20_std
value: -11.424991900511088
- type: nauc_mrr_at_3_diff1
value: 14.733201594965836
- type: nauc_mrr_at_3_max
value: -11.75899459749356
- type: nauc_mrr_at_3_std
value: -11.499870896820976
- type: nauc_mrr_at_5_diff1
value: 12.874017458219845
- type: nauc_mrr_at_5_max
value: -13.642689819875791
- type: nauc_mrr_at_5_std
value: -11.64117086557618
- type: nauc_ndcg_at_1000_diff1
value: -6.849400123979281
- type: nauc_ndcg_at_1000_max
value: -3.8209628417621393
- type: nauc_ndcg_at_1000_std
value: 31.393629472927504
- type: nauc_ndcg_at_100_diff1
value: 5.4656320972286485
- type: nauc_ndcg_at_100_max
value: -11.571250999652408
- type: nauc_ndcg_at_100_std
value: 16.5511179303082
- type: nauc_ndcg_at_10_diff1
value: 9.553502614400788
- type: nauc_ndcg_at_10_max
value: -14.08266102380929
- type: nauc_ndcg_at_10_std
value: -5.404201943794988
- type: nauc_ndcg_at_1_diff1
value: 11.37824691229176
- type: nauc_ndcg_at_1_max
value: -21.31215334708879
- type: nauc_ndcg_at_1_std
value: -29.749958184219334
- type: nauc_ndcg_at_20_diff1
value: 13.396975021395857
- type: nauc_ndcg_at_20_max
value: -14.5189405742469
- type: nauc_ndcg_at_20_std
value: -1.6276921520570502
- type: nauc_ndcg_at_3_diff1
value: 2.3132968948746226
- type: nauc_ndcg_at_3_max
value: -11.351646560904848
- type: nauc_ndcg_at_3_std
value: -0.15036952995361091
- type: nauc_ndcg_at_5_diff1
value: 6.214320727021392
- type: nauc_ndcg_at_5_max
value: -9.797994041679638
- type: nauc_ndcg_at_5_std
value: -3.3742904276844223
- type: nauc_precision_at_1000_diff1
value: -32.78708155144845
- type: nauc_precision_at_1000_max
value: 34.81622247650308
- type: nauc_precision_at_1000_std
value: 47.996245254718744
- type: nauc_precision_at_100_diff1
value: -10.867559709952797
- type: nauc_precision_at_100_max
value: 6.681915188055671
- type: nauc_precision_at_100_std
value: 61.989390090979356
- type: nauc_precision_at_10_diff1
value: 6.511211593484189
- type: nauc_precision_at_10_max
value: -16.842566662697454
- type: nauc_precision_at_10_std
value: 5.002600740433903
- type: nauc_precision_at_1_diff1
value: 18.162553375605516
- type: nauc_precision_at_1_max
value: -18.920009881475387
- type: nauc_precision_at_1_std
value: -31.201005281857086
- type: nauc_precision_at_20_diff1
value: 9.640744611970522
- type: nauc_precision_at_20_max
value: -18.27653996056668
- type: nauc_precision_at_20_std
value: 22.021814503656543
- type: nauc_precision_at_3_diff1
value: 6.916201107284145
- type: nauc_precision_at_3_max
value: -0.039381527098944095
- type: nauc_precision_at_3_std
value: 9.096821181866671
- type: nauc_precision_at_5_diff1
value: 9.032683328748616
- type: nauc_precision_at_5_max
value: -3.5989814795848223
- type: nauc_precision_at_5_std
value: 2.506947866544208
- type: nauc_recall_at_1000_diff1
value: -27.92405572104993
- type: nauc_recall_at_1000_max
value: 14.256848434706395
- type: nauc_recall_at_1000_std
value: 69.3546817240148
- type: nauc_recall_at_100_diff1
value: 6.613753533249129
- type: nauc_recall_at_100_max
value: -8.405822616363144
- type: nauc_recall_at_100_std
value: 29.430588706591397
- type: nauc_recall_at_10_diff1
value: 18.481730784371077
- type: nauc_recall_at_10_max
value: -7.763172381505888
- type: nauc_recall_at_10_std
value: -7.48570052741164
- type: nauc_recall_at_1_diff1
value: 18.809274247757674
- type: nauc_recall_at_1_max
value: -8.68104031396317
- type: nauc_recall_at_1_std
value: -30.619138463973307
- type: nauc_recall_at_20_diff1
value: 20.639977762281493
- type: nauc_recall_at_20_max
value: -11.301201172125623
- type: nauc_recall_at_20_std
value: 0.38755705583239786
- type: nauc_recall_at_3_diff1
value: 18.279383297820562
- type: nauc_recall_at_3_max
value: 5.287795698059438
- type: nauc_recall_at_3_std
value: -3.7312167565958316
- type: nauc_recall_at_5_diff1
value: 21.115852302465356
- type: nauc_recall_at_5_max
value: 5.318139212101227
- type: nauc_recall_at_5_std
value: -7.792885381250281
- type: ndcg_at_1
value: 25.509999999999998
- type: ndcg_at_10
value: 24.254
- type: ndcg_at_100
value: 34.660000000000004
- type: ndcg_at_1000
value: 45.798
- type: ndcg_at_20
value: 24.988
- type: ndcg_at_3
value: 29.273
- type: ndcg_at_5
value: 25.453
- type: precision_at_1
value: 28.571
- type: precision_at_10
value: 21.02
- type: precision_at_100
value: 7.122000000000001
- type: precision_at_1000
value: 1.435
- type: precision_at_20
value: 16.326999999999998
- type: precision_at_3
value: 31.293
- type: precision_at_5
value: 24.898
- type: recall_at_1
value: 2.355
- type: recall_at_10
value: 15.397
- type: recall_at_100
value: 43.647000000000006
- type: recall_at_1000
value: 77.089
- type: recall_at_20
value: 22.792
- type: recall_at_3
value: 6.847
- type: recall_at_5
value: 9.136
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification (default)
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 72.7734375
- type: ap
value: 15.655230461083708
- type: ap_weighted
value: 15.655230461083708
- type: f1
value: 56.31497978454638
- type: f1_weighted
value: 78.70509613747345
- type: main_score
value: 72.7734375
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification (default)
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 72.56366723259762
- type: f1
value: 72.90413275122202
- type: f1_weighted
value: 72.19948169084057
- type: main_score
value: 72.56366723259762
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering (default)
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: main_score
value: 56.90970017457857
- type: v_measure
value: 56.90970017457857
- type: v_measure_std
value: 1.5885885070403738
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015 (default)
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cosine_accuracy
value: 85.7006616200751
- type: cosine_accuracy_threshold
value: 75.78572630882263
- type: cosine_ap
value: 72.87577990245127
- type: cosine_f1
value: 67.36422521175885
- type: cosine_f1_threshold
value: 70.15678882598877
- type: cosine_precision
value: 63.80368098159509
- type: cosine_recall
value: 71.34564643799473
- type: dot_accuracy
value: 83.60851165285807
- type: dot_accuracy_threshold
value: 744.7918891906738
- type: dot_ap
value: 64.82619159813649
- type: dot_f1
value: 62.62379263968699
- type: dot_f1_threshold
value: 696.7735290527344
- type: dot_precision
value: 58.350421508316245
- type: dot_recall
value: 67.57255936675462
- type: euclidean_accuracy
value: 85.84371460928652
- type: euclidean_accuracy_threshold
value: 220.4747200012207
- type: euclidean_ap
value: 72.47837433257799
- type: euclidean_f1
value: 67.2811059907834
- type: euclidean_f1_threshold
value: 240.81902503967285
- type: euclidean_precision
value: 65.34062655395326
- type: euclidean_recall
value: 69.34036939313984
- type: main_score
value: 72.87577990245127
- type: manhattan_accuracy
value: 85.83179352685224
- type: manhattan_accuracy_threshold
value: 4910.404205322266
- type: manhattan_ap
value: 72.44111617709422
- type: manhattan_f1
value: 67.09989806320081
- type: manhattan_f1_threshold
value: 5333.793640136719
- type: manhattan_precision
value: 64.88417939871857
- type: manhattan_recall
value: 69.47229551451187
- type: max_accuracy
value: 85.84371460928652
- type: max_ap
value: 72.87577990245127
- type: max_f1
value: 67.36422521175885
- type: max_precision
value: 65.34062655395326
- type: max_recall
value: 71.34564643799473
- type: similarity_accuracy
value: 85.7006616200751
- type: similarity_accuracy_threshold
value: 75.78572630882263
- type: similarity_ap
value: 72.87577990245127
- type: similarity_f1
value: 67.36422521175885
- type: similarity_f1_threshold
value: 70.15678882598877
- type: similarity_precision
value: 63.80368098159509
- type: similarity_recall
value: 71.34564643799473
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus (default)
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cosine_accuracy
value: 88.88112702293631
- type: cosine_accuracy_threshold
value: 71.48405313491821
- type: cosine_ap
value: 85.88088882163336
- type: cosine_f1
value: 78.2251744598276
- type: cosine_f1_threshold
value: 70.09605169296265
- type: cosine_precision
value: 75.8997755087262
- type: cosine_recall
value: 80.69756698490914
- type: dot_accuracy
value: 88.04672643303451
- type: dot_accuracy_threshold
value: 700.6264686584473
- type: dot_ap
value: 83.52072844458456
- type: dot_f1
value: 76.24239256244634
- type: dot_f1_threshold
value: 664.9115562438965
- type: dot_precision
value: 74.0123233055455
- type: dot_recall
value: 78.61102556205728
- type: euclidean_accuracy
value: 88.72588970388482
- type: euclidean_accuracy_threshold
value: 226.53303146362305
- type: euclidean_ap
value: 85.51788295919707
- type: euclidean_f1
value: 77.73453426739316
- type: euclidean_f1_threshold
value: 238.7503147125244
- type: euclidean_precision
value: 74.94818097348296
- type: euclidean_recall
value: 80.73606405913151
- type: main_score
value: 85.88088882163336
- type: manhattan_accuracy
value: 88.68902084061008
- type: manhattan_accuracy_threshold
value: 5034.079742431641
- type: manhattan_ap
value: 85.49952903626239
- type: manhattan_f1
value: 77.74326743888625
- type: manhattan_f1_threshold
value: 5334.531021118164
- type: manhattan_precision
value: 73.98289171708741
- type: manhattan_recall
value: 81.90637511549123
- type: max_accuracy
value: 88.88112702293631
- type: max_ap
value: 85.88088882163336
- type: max_f1
value: 78.2251744598276
- type: max_precision
value: 75.8997755087262
- type: max_recall
value: 81.90637511549123
- type: similarity_accuracy
value: 88.88112702293631
- type: similarity_accuracy_threshold
value: 71.48405313491821
- type: similarity_ap
value: 85.88088882163336
- type: similarity_f1
value: 78.2251744598276
- type: similarity_f1_threshold
value: 70.09605169296265
- type: similarity_precision
value: 75.8997755087262
- type: similarity_recall
value: 80.69756698490914
---
# cde-small-v1
<div style="background-color: #f8f9fa; border-left: 6px solid #007bff; padding: 10px 20px; margin: 20px; font-family: Arial, sans-serif; line-height: 1.6;">
<p>The <strong>cde-small-v1</strong> model has been deprecated. We highly recommend transitioning to the improved <strong>cde-small-v2</strong> model for enhanced performance and support.</p>
<p>For more details and to access the latest version, please visit the <a href="https://huggingface.co/jxm/cde-small-v2" target="_blank" style="color: #007bff; text-decoration: none;">cde-small-v2 model page</a>.</p>
</div>
<a href="https://github.com/jxmorris12/cde">GitHub</a>
Our new model naturally integrates "context tokens" into the embedding process. As of October 1st, 2024, `cde-small-v1` is the best small model (under 400M params) on the [MTEB leaderboard](https://huggingface.co/spaces/mteb/leaderboard) for text embedding models, with an average score of 65.00.
👉 <b><a href="https://colab.research.google.com/drive/1r8xwbp7_ySL9lP-ve4XMJAHjidB9UkbL?usp=sharing">Try on Colab</a></b>
<br>
👉 <b><a href="https://arxiv.org/abs/2410.02525">Contextual Document Embeddings (ArXiv)</a></b>

<br>
<hr>
# How to use `cde-small-v1`
Our embedding model needs to be used in *two stages*. The first stage is to gather some dataset information by embedding a subset of the corpus using our "first-stage" model. The second stage is to actually embed queries and documents, conditioning on the corpus information from the first stage. Note that we can do the first stage offline and only use the second-stage weights at inference time.
## With Transformers
<details>
<summary>Click to learn how to use cde-small-v1 with Transformers</summary>
### Loading the model
Our model can be loaded using `transformers` out-of-the-box with "trust remote code" enabled. We use the default BERT uncased tokenizer:
```python
import transformers
model = transformers.AutoModel.from_pretrained("jxm/cde-small-v1", trust_remote_code=True)
tokenizer = transformers.AutoTokenizer.from_pretrained("bert-base-uncased")
```
#### Note on prefixes
*Nota bene*: Like all state-of-the-art embedding models, our model was trained with task-specific prefixes. To do retrieval, you can prepend the following strings to queries & documents:
```python
query_prefix = "search_query: "
document_prefix = "search_document: "
```
### First stage
```python
minicorpus_size = model.config.transductive_corpus_size
minicorpus_docs = [ ... ] # Put some strings here that are representative of your corpus, for example by calling random.sample(corpus, k=minicorpus_size)
assert len(minicorpus_docs) == minicorpus_size # You must use exactly this many documents in the minicorpus. You can oversample if your corpus is smaller.
minicorpus_docs = tokenizer(
[document_prefix + doc for doc in minicorpus_docs],
truncation=True,
padding=True,
max_length=512,
return_tensors="pt"
).to(model.device)
import torch
from tqdm.autonotebook import tqdm
batch_size = 32
dataset_embeddings = []
for i in tqdm(range(0, len(minicorpus_docs["input_ids"]), batch_size)):
minicorpus_docs_batch = {k: v[i:i+batch_size] for k,v in minicorpus_docs.items()}
with torch.no_grad():
dataset_embeddings.append(
model.first_stage_model(**minicorpus_docs_batch)
)
dataset_embeddings = torch.cat(dataset_embeddings)
```
### Running the second stage
Now that we have obtained "dataset embeddings" we can embed documents and queries like normal. Remember to use the document prefix for documents:
```python
docs = [ ... ]  # Put your document strings here
docs = tokenizer(
[document_prefix + doc for doc in docs],
truncation=True,
padding=True,
max_length=512,
return_tensors="pt"
).to(model.device)
with torch.no_grad():
doc_embeddings = model.second_stage_model(
input_ids=docs["input_ids"],
attention_mask=docs["attention_mask"],
dataset_embeddings=dataset_embeddings,
)
doc_embeddings /= doc_embeddings.norm(p=2, dim=1, keepdim=True)
```
and the query prefix for queries:
```python
queries = queries.select(range(16))["text"]  # assumes queries is a Hugging Face datasets.Dataset; any list of query strings works here
queries = tokenizer(
[query_prefix + query for query in queries],
truncation=True,
padding=True,
max_length=512,
return_tensors="pt"
).to(model.device)
with torch.no_grad():
query_embeddings = model.second_stage_model(
input_ids=queries["input_ids"],
attention_mask=queries["attention_mask"],
dataset_embeddings=dataset_embeddings,
)
query_embeddings /= query_embeddings.norm(p=2, dim=1, keepdim=True)
```
These embeddings can be compared using the dot product, since they are normalized; a minimal sketch of that comparison follows.
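For instance, a small sketch of the scoring step (this assumes the `query_embeddings` and `doc_embeddings` tensors produced by the snippets above; the dot product of L2-normalized vectors equals their cosine similarity):
```python
# Score every query against every document with a single matrix multiply.
scores = query_embeddings @ doc_embeddings.T  # shape: (num_queries, num_docs)

# Retrieve the best-scoring documents per query.
top_scores, top_indices = scores.topk(k=min(5, scores.shape[1]), dim=1)
print(top_scores)
print(top_indices)
```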
</details>
### What if I don't know what my corpus will be ahead of time?
If you can't obtain corpus information ahead of time, you still have to pass *something* as the dataset embeddings. Our model will still work in this case, just not quite as well: without real corpus information, performance drops from 65.0 to 63.8 on MTEB. We provide [some random strings](https://huggingface.co/jxm/cde-small-v1/resolve/main/random_strings.txt) that worked well for us and can be used as a substitute for corpus sampling; a sketch of that fallback is shown below.
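A minimal sketch of that fallback, assuming the `requests` library is available, the file contains at least `minicorpus_size` lines, and `model` is the first-stage setup from the Transformers example above (variable names here are illustrative):
```python
import requests

# Fetch the provided random strings and use them as a stand-in minicorpus.
url = "https://huggingface.co/jxm/cde-small-v1/resolve/main/random_strings.txt"
random_strings = requests.get(url).text.splitlines()

minicorpus_size = model.config.transductive_corpus_size
minicorpus_docs = random_strings[:minicorpus_size]

# Now embed `minicorpus_docs` with the first-stage model, exactly as in the
# "First stage" snippet above, to obtain `dataset_embeddings`.
```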
## With Sentence Transformers
<details open="">
<summary>Click to learn how to use cde-small-v1 with Sentence Transformers</summary>
### Loading the model
Our model can be loaded using `sentence-transformers` out-of-the-box with "trust remote code" enabled:
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("jxm/cde-small-v1", trust_remote_code=True)
```
#### Note on prefixes
*Nota bene*: Like all state-of-the-art embedding models, our model was trained with task-specific prefixes. To do retrieval, you can use `prompt_name="query"` and `prompt_name="document"` in the `encode` method of the model when embedding queries and documents, respectively.
### First stage
```python
minicorpus_size = model[0].config.transductive_corpus_size
minicorpus_docs = [ ... ] # Put some strings here that are representative of your corpus, for example by calling random.sample(corpus, k=minicorpus_size)
assert len(minicorpus_docs) == minicorpus_size # You must use exactly this many documents in the minicorpus. You can oversample if your corpus is smaller.
dataset_embeddings = model.encode(
minicorpus_docs,
prompt_name="document",
convert_to_tensor=True
)
```
### Running the second stage
Now that we have obtained "dataset embeddings" we can embed documents and queries like normal. Remember to use the document prompt for documents:
```python
docs = [...]
queries = [...]
doc_embeddings = model.encode(
docs,
prompt_name="document",
dataset_embeddings=dataset_embeddings,
convert_to_tensor=True,
)
query_embeddings = model.encode(
queries,
prompt_name="query",
dataset_embeddings=dataset_embeddings,
convert_to_tensor=True,
)
```
These embeddings can be compared using cosine similarity via `model.similarity`:
```python
similarities = model.similarity(query_embeddings, doc_embeddings)
topk_values, topk_indices = similarities.topk(5)
```
<details>
<summary>Click here for a full copy-paste ready example</summary>
```python
from sentence_transformers import SentenceTransformer
from datasets import load_dataset
# 1. Load the Sentence Transformer model
model = SentenceTransformer("jxm/cde-small-v1", trust_remote_code=True)
context_docs_size = model[0].config.transductive_corpus_size # 512
# 2. Load the dataset: context dataset, docs, and queries
dataset = load_dataset("sentence-transformers/natural-questions", split="train")
dataset.shuffle(seed=42)  # note: Dataset.shuffle is not in-place; assign the result if you actually want shuffled data
# 10 queries, 512 context docs, 2000 docs
queries = dataset["query"][:10]
docs = dataset["answer"][:2000]
context_docs = dataset["answer"][-context_docs_size:] # Last 512 docs
# 3. First stage: embed the context docs
dataset_embeddings = model.encode(
context_docs,
prompt_name="document",
convert_to_tensor=True,
)
# 4. Second stage: embed the docs and queries
doc_embeddings = model.encode(
docs,
prompt_name="document",
dataset_embeddings=dataset_embeddings,
convert_to_tensor=True,
)
query_embeddings = model.encode(
queries,
prompt_name="query",
dataset_embeddings=dataset_embeddings,
convert_to_tensor=True,
)
# 5. Compute the similarity between the queries and docs
similarities = model.similarity(query_embeddings, doc_embeddings)
topk_values, topk_indices = similarities.topk(5)
print(topk_values)
print(topk_indices)
"""
tensor([[0.5495, 0.5426, 0.5423, 0.5292, 0.5286],
[0.6357, 0.6334, 0.6177, 0.5862, 0.5794],
[0.7648, 0.5452, 0.5000, 0.4959, 0.4881],
[0.6802, 0.5225, 0.5178, 0.5160, 0.5075],
[0.6947, 0.5843, 0.5619, 0.5344, 0.5298],
[0.7742, 0.7742, 0.7742, 0.7231, 0.6224],
[0.8853, 0.6667, 0.5829, 0.5795, 0.5769],
[0.6911, 0.6127, 0.6003, 0.5986, 0.5936],
[0.6796, 0.6053, 0.6000, 0.5911, 0.5884],
[0.7624, 0.5589, 0.5428, 0.5278, 0.5275]], device='cuda:0')
tensor([[ 0, 296, 234, 1651, 1184],
[1542, 466, 438, 1207, 1911],
[ 2, 1562, 632, 1852, 382],
[ 3, 694, 932, 1765, 662],
[ 4, 35, 747, 26, 432],
[ 534, 175, 5, 1495, 575],
[ 6, 1802, 1875, 747, 21],
[ 7, 1913, 1936, 640, 6],
[ 8, 747, 167, 1318, 1743],
[ 9, 1583, 1145, 219, 357]], device='cuda:0')
"""
# As you can see, almost every query_i has document_i as the most similar document.
# 6. Print the top-k results
for query_idx, top_doc_idx in enumerate(topk_indices[:, 0]):
print(f"Query {query_idx}: {queries[query_idx]}")
print(f"Top Document: {docs[top_doc_idx]}")
print()
"""
Query 0: when did richmond last play in a preliminary final
Top Document: Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tigers took over the game as it progressed and scored seven straight goals at one point. They eventually would win by 48 points – 16.12 (108) to Adelaide's 8.12 (60) – to end their 37-year flag drought.[22] Dustin Martin also became the first player to win a Premiership medal, the Brownlow Medal and the Norm Smith Medal in the same season, while Damien Hardwick was named AFL Coaches Association Coach of the Year. Richmond's jump from 13th to premiers also marked the biggest jump from one AFL season to the next.
Query 1: who sang what in the world's come over you
Top Document: Life's What You Make It (Talk Talk song) "Life's What You Make It" is a song by the English band Talk Talk. It was released as a single in 1986, the first from the band's album The Colour of Spring. The single was a hit in the UK, peaking at No. 16, and charted in numerous other countries, often reaching the Top 20.
Query 2: who produces the most wool in the world
Top Document: Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets.
Query 3: where does alaska the last frontier take place
Top Document: Alaska: The Last Frontier Alaska: The Last Frontier is an American reality cable television series on the Discovery Channel, currently in its 7th season of broadcast. The show documents the extended Kilcher family, descendants of Swiss immigrants and Alaskan pioneers, Yule and Ruth Kilcher, at their homestead 11 miles outside of Homer.[1] By living without plumbing or modern heating, the clan chooses to subsist by farming, hunting and preparing for the long winters.[2] The Kilcher family are relatives of the singer Jewel,[1][3] who has appeared on the show.[4]
Query 4: a day to remember all i want cameos
Top Document: All I Want (A Day to Remember song) The music video for the song, which was filmed in October 2010,[4] was released on January 6, 2011.[5] It features cameos of numerous popular bands and musicians. The cameos are: Tom Denney (A Day to Remember's former guitarist), Pete Wentz, Winston McCall of Parkway Drive, The Devil Wears Prada, Bring Me the Horizon, Sam Carter of Architects, Tim Lambesis of As I Lay Dying, Silverstein, Andrew WK, August Burns Red, Seventh Star, Matt Heafy of Trivium, Vic Fuentes of Pierce the Veil, Mike Herrera of MxPx, and Set Your Goals.[5] Rock Sound called the video "quite excellent".[5]
Query 5: what does the red stripes mean on the american flag
Top Document: Flag of the United States The flag of the United States of America, often referred to as the American flag, is the national flag of the United States. It consists of thirteen equal horizontal stripes of red (top and bottom) alternating with white, with a blue rectangle in the canton (referred to specifically as the "union") bearing fifty small, white, five-pointed stars arranged in nine offset horizontal rows, where rows of six stars (top and bottom) alternate with rows of five stars. The 50 stars on the flag represent the 50 states of the United States of America, and the 13 stripes represent the thirteen British colonies that declared independence from the Kingdom of Great Britain, and became the first states in the U.S.[1] Nicknames for the flag include The Stars and Stripes,[2] Old Glory,[3] and The Star-Spangled Banner.
Query 6: where did they film diary of a wimpy kid
Top Document: Diary of a Wimpy Kid (film) Filming of Diary of a Wimpy Kid was in Vancouver and wrapped up on October 16, 2009.
Query 7: where was beasts of the southern wild filmed
Top Document: Beasts of the Southern Wild The film's fictional setting, "Isle de Charles Doucet", known to its residents as the Bathtub, was inspired by several isolated and independent fishing communities threatened by erosion, hurricanes and rising sea levels in Louisiana's Terrebonne Parish, most notably the rapidly eroding Isle de Jean Charles. It was filmed in Terrebonne Parish town Montegut.[5]
Query 8: what part of the country are you likely to find the majority of the mollisols
Top Document: Mollisol Mollisols occur in savannahs and mountain valleys (such as Central Asia, or the North American Great Plains). These environments have historically been strongly influenced by fire and abundant pedoturbation from organisms such as ants and earthworms. It was estimated that in 2003, only 14 to 26 percent of grassland ecosystems still remained in a relatively natural state (that is, they were not used for agriculture due to the fertility of the A horizon). Globally, they represent ~7% of ice-free land area. As the world's most agriculturally productive soil order, the Mollisols represent one of the more economically important soil orders.
Query 9: when did fosters home for imaginary friends start
Top Document: Foster's Home for Imaginary Friends McCracken conceived the series after adopting two dogs from an animal shelter and applying the concept to imaginary friends. The show first premiered on Cartoon Network on August 13, 2004, as a 90-minute television film. On August 20, it began its normal run of twenty-to-thirty-minute episodes on Fridays, at 7 pm. The series finished its run on May 3, 2009, with a total of six seasons and seventy-nine episodes. McCracken left Cartoon Network shortly after the series ended. Reruns have aired on Boomerang from August 11, 2012 to November 3, 2013 and again from June 1, 2014 to April 3, 2017.
"""
```
</details>
### Colab demo
We've set up a short demo in a Colab notebook showing how you might use our model:
[Try our model in Colab](https://colab.research.google.com/drive/1r8xwbp7_ySL9lP-ve4XMJAHjidB9UkbL?usp=sharing).
### Acknowledgments
Early experiments on CDE were done with support from [Nomic](https://www.nomic.ai/) and [Hyperbolic](https://hyperbolic.xyz/). We're especially indebted to Nomic for [open-sourcing their efficient BERT implementation and contrastive pre-training data](https://www.nomic.ai/blog/posts/nomic-embed-text-v1), which proved vital in the development of CDE.
### Cite us
Used our model, method, or architecture? Want to cite us? Here's the ArXiv citation information:
```
@misc{morris2024contextualdocumentembeddings,
title={Contextual Document Embeddings},
author={John X. Morris and Alexander M. Rush},
year={2024},
eprint={2410.02525},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2410.02525},
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"MEDAL",
"SCIFACT"
] |
Alibaba-NLP/gte-Qwen1.5-7B-instruct | Alibaba-NLP | sentence-similarity | [
"sentence-transformers",
"safetensors",
"qwen2",
"text-generation",
"mteb",
"transformers",
"Qwen",
"sentence-similarity",
"custom_code",
"arxiv:2308.03281",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-04-20T04:24:58 | 2025-01-11T07:10:24 | 1,363 | 102 | ---
license: apache-2.0
tags:
- mteb
- sentence-transformers
- transformers
- Qwen
- sentence-similarity
model-index:
- name: gte-qwen1.5-7b
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 83.16417910447761
- type: ap
value: 49.37655308937739
- type: f1
value: 77.52987230462615
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 96.6959
- type: ap
value: 94.90885739242472
- type: f1
value: 96.69477648952649
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 62.168
- type: f1
value: 60.411431278343755
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 36.415
- type: map_at_10
value: 53.505
- type: map_at_100
value: 54.013
- type: map_at_1000
value: 54.013
- type: map_at_3
value: 48.459
- type: map_at_5
value: 51.524
- type: mrr_at_1
value: 36.842000000000006
- type: mrr_at_10
value: 53.679
- type: mrr_at_100
value: 54.17999999999999
- type: mrr_at_1000
value: 54.17999999999999
- type: mrr_at_3
value: 48.613
- type: mrr_at_5
value: 51.696
- type: ndcg_at_1
value: 36.415
- type: ndcg_at_10
value: 62.644999999999996
- type: ndcg_at_100
value: 64.60000000000001
- type: ndcg_at_1000
value: 64.60000000000001
- type: ndcg_at_3
value: 52.44799999999999
- type: ndcg_at_5
value: 57.964000000000006
- type: precision_at_1
value: 36.415
- type: precision_at_10
value: 9.161
- type: precision_at_100
value: 0.996
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 21.337
- type: precision_at_5
value: 15.476999999999999
- type: recall_at_1
value: 36.415
- type: recall_at_10
value: 91.607
- type: recall_at_100
value: 99.644
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 64.011
- type: recall_at_5
value: 77.383
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 56.40183100758549
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 51.44814171373338
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 66.00208703259058
- type: mrr
value: 78.95165545442553
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.12591694410098
- type: cos_sim_spearman
value: 81.11570369802254
- type: euclidean_pearson
value: 80.91709076204458
- type: euclidean_spearman
value: 81.11570369802254
- type: manhattan_pearson
value: 80.71719561024605
- type: manhattan_spearman
value: 81.21510355327713
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 81.67857142857142
- type: f1
value: 80.84103272994895
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 49.008657468552016
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 45.05901064421589
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 32.694
- type: map_at_10
value: 43.895
- type: map_at_100
value: 45.797
- type: map_at_1000
value: 45.922000000000004
- type: map_at_3
value: 40.141
- type: map_at_5
value: 42.077
- type: mrr_at_1
value: 40.2
- type: mrr_at_10
value: 50.11
- type: mrr_at_100
value: 51.101
- type: mrr_at_1000
value: 51.13100000000001
- type: mrr_at_3
value: 47.735
- type: mrr_at_5
value: 48.922
- type: ndcg_at_1
value: 40.2
- type: ndcg_at_10
value: 50.449999999999996
- type: ndcg_at_100
value: 56.85
- type: ndcg_at_1000
value: 58.345
- type: ndcg_at_3
value: 45.261
- type: ndcg_at_5
value: 47.298
- type: precision_at_1
value: 40.2
- type: precision_at_10
value: 9.742
- type: precision_at_100
value: 1.6480000000000001
- type: precision_at_1000
value: 0.214
- type: precision_at_3
value: 21.841
- type: precision_at_5
value: 15.68
- type: recall_at_1
value: 32.694
- type: recall_at_10
value: 62.751999999999995
- type: recall_at_100
value: 88.619
- type: recall_at_1000
value: 97.386
- type: recall_at_3
value: 47.087
- type: recall_at_5
value: 53.108999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 27.849
- type: map_at_10
value: 37.938
- type: map_at_100
value: 39.211
- type: map_at_1000
value: 39.333
- type: map_at_3
value: 35.314
- type: map_at_5
value: 36.666
- type: mrr_at_1
value: 34.904
- type: mrr_at_10
value: 43.869
- type: mrr_at_100
value: 44.614
- type: mrr_at_1000
value: 44.662
- type: mrr_at_3
value: 41.815000000000005
- type: mrr_at_5
value: 42.943
- type: ndcg_at_1
value: 34.904
- type: ndcg_at_10
value: 43.605
- type: ndcg_at_100
value: 48.339999999999996
- type: ndcg_at_1000
value: 50.470000000000006
- type: ndcg_at_3
value: 39.835
- type: ndcg_at_5
value: 41.364000000000004
- type: precision_at_1
value: 34.904
- type: precision_at_10
value: 8.222999999999999
- type: precision_at_100
value: 1.332
- type: precision_at_1000
value: 0.183
- type: precision_at_3
value: 19.575
- type: precision_at_5
value: 13.58
- type: recall_at_1
value: 27.849
- type: recall_at_10
value: 53.635
- type: recall_at_100
value: 73.932
- type: recall_at_1000
value: 87.29599999999999
- type: recall_at_3
value: 42.019
- type: recall_at_5
value: 46.58
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 29.182999999999996
- type: map_at_10
value: 41.233
- type: map_at_100
value: 42.52
- type: map_at_1000
value: 42.589
- type: map_at_3
value: 37.284
- type: map_at_5
value: 39.586
- type: mrr_at_1
value: 33.793
- type: mrr_at_10
value: 44.572
- type: mrr_at_100
value: 45.456
- type: mrr_at_1000
value: 45.497
- type: mrr_at_3
value: 41.275
- type: mrr_at_5
value: 43.278
- type: ndcg_at_1
value: 33.793
- type: ndcg_at_10
value: 47.823
- type: ndcg_at_100
value: 52.994
- type: ndcg_at_1000
value: 54.400000000000006
- type: ndcg_at_3
value: 40.82
- type: ndcg_at_5
value: 44.426
- type: precision_at_1
value: 33.793
- type: precision_at_10
value: 8.312999999999999
- type: precision_at_100
value: 1.191
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 18.662
- type: precision_at_5
value: 13.668
- type: recall_at_1
value: 29.182999999999996
- type: recall_at_10
value: 64.14999999999999
- type: recall_at_100
value: 86.533
- type: recall_at_1000
value: 96.492
- type: recall_at_3
value: 45.7
- type: recall_at_5
value: 54.330999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 24.389
- type: map_at_10
value: 33.858
- type: map_at_100
value: 35.081
- type: map_at_1000
value: 35.161
- type: map_at_3
value: 30.793
- type: map_at_5
value: 32.336
- type: mrr_at_1
value: 27.006000000000004
- type: mrr_at_10
value: 36.378
- type: mrr_at_100
value: 37.345
- type: mrr_at_1000
value: 37.405
- type: mrr_at_3
value: 33.578
- type: mrr_at_5
value: 34.991
- type: ndcg_at_1
value: 27.006000000000004
- type: ndcg_at_10
value: 39.612
- type: ndcg_at_100
value: 45.216
- type: ndcg_at_1000
value: 47.12
- type: ndcg_at_3
value: 33.566
- type: ndcg_at_5
value: 36.105
- type: precision_at_1
value: 27.006000000000004
- type: precision_at_10
value: 6.372999999999999
- type: precision_at_100
value: 0.968
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 14.501
- type: precision_at_5
value: 10.169
- type: recall_at_1
value: 24.389
- type: recall_at_10
value: 55.131
- type: recall_at_100
value: 80.315
- type: recall_at_1000
value: 94.284
- type: recall_at_3
value: 38.643
- type: recall_at_5
value: 44.725
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 15.845999999999998
- type: map_at_10
value: 25.019000000000002
- type: map_at_100
value: 26.478
- type: map_at_1000
value: 26.598
- type: map_at_3
value: 21.595
- type: map_at_5
value: 23.335
- type: mrr_at_1
value: 20.274
- type: mrr_at_10
value: 29.221000000000004
- type: mrr_at_100
value: 30.354999999999997
- type: mrr_at_1000
value: 30.419
- type: mrr_at_3
value: 26.161
- type: mrr_at_5
value: 27.61
- type: ndcg_at_1
value: 20.274
- type: ndcg_at_10
value: 31.014000000000003
- type: ndcg_at_100
value: 37.699
- type: ndcg_at_1000
value: 40.363
- type: ndcg_at_3
value: 24.701999999999998
- type: ndcg_at_5
value: 27.261999999999997
- type: precision_at_1
value: 20.274
- type: precision_at_10
value: 6.219
- type: precision_at_100
value: 1.101
- type: precision_at_1000
value: 0.146
- type: precision_at_3
value: 12.231
- type: precision_at_5
value: 9.129
- type: recall_at_1
value: 15.845999999999998
- type: recall_at_10
value: 45.358
- type: recall_at_100
value: 74.232
- type: recall_at_1000
value: 92.985
- type: recall_at_3
value: 28.050000000000004
- type: recall_at_5
value: 34.588
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 33.808
- type: map_at_10
value: 46.86
- type: map_at_100
value: 48.237
- type: map_at_1000
value: 48.331
- type: map_at_3
value: 42.784
- type: map_at_5
value: 45.015
- type: mrr_at_1
value: 41.771
- type: mrr_at_10
value: 52.35300000000001
- type: mrr_at_100
value: 53.102000000000004
- type: mrr_at_1000
value: 53.132999999999996
- type: mrr_at_3
value: 49.663000000000004
- type: mrr_at_5
value: 51.27
- type: ndcg_at_1
value: 41.771
- type: ndcg_at_10
value: 53.562
- type: ndcg_at_100
value: 58.809999999999995
- type: ndcg_at_1000
value: 60.23
- type: ndcg_at_3
value: 47.514
- type: ndcg_at_5
value: 50.358999999999995
- type: precision_at_1
value: 41.771
- type: precision_at_10
value: 10.038
- type: precision_at_100
value: 1.473
- type: precision_at_1000
value: 0.17600000000000002
- type: precision_at_3
value: 22.875
- type: precision_at_5
value: 16.477
- type: recall_at_1
value: 33.808
- type: recall_at_10
value: 67.721
- type: recall_at_100
value: 89.261
- type: recall_at_1000
value: 98.042
- type: recall_at_3
value: 50.807
- type: recall_at_5
value: 58.162000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 28.105000000000004
- type: map_at_10
value: 40.354
- type: map_at_100
value: 41.921
- type: map_at_1000
value: 42.021
- type: map_at_3
value: 36.532
- type: map_at_5
value: 38.671
- type: mrr_at_1
value: 34.475
- type: mrr_at_10
value: 45.342
- type: mrr_at_100
value: 46.300000000000004
- type: mrr_at_1000
value: 46.343
- type: mrr_at_3
value: 42.637
- type: mrr_at_5
value: 44.207
- type: ndcg_at_1
value: 34.475
- type: ndcg_at_10
value: 46.945
- type: ndcg_at_100
value: 52.939
- type: ndcg_at_1000
value: 54.645999999999994
- type: ndcg_at_3
value: 41.065000000000005
- type: ndcg_at_5
value: 43.832
- type: precision_at_1
value: 34.475
- type: precision_at_10
value: 8.892999999999999
- type: precision_at_100
value: 1.377
- type: precision_at_1000
value: 0.17099999999999999
- type: precision_at_3
value: 20.091
- type: precision_at_5
value: 14.452000000000002
- type: recall_at_1
value: 28.105000000000004
- type: recall_at_10
value: 61.253
- type: recall_at_100
value: 85.92
- type: recall_at_1000
value: 96.799
- type: recall_at_3
value: 45.094
- type: recall_at_5
value: 52.455
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 24.613833333333332
- type: map_at_10
value: 34.763
- type: map_at_100
value: 36.17066666666667
- type: map_at_1000
value: 36.2905
- type: map_at_3
value: 31.53541666666666
- type: map_at_5
value: 33.29216666666667
- type: mrr_at_1
value: 29.48725
- type: mrr_at_10
value: 38.92066666666667
- type: mrr_at_100
value: 39.88725000000001
- type: mrr_at_1000
value: 39.9435
- type: mrr_at_3
value: 36.284083333333335
- type: mrr_at_5
value: 37.73941666666667
- type: ndcg_at_1
value: 29.48725
- type: ndcg_at_10
value: 40.635083333333334
- type: ndcg_at_100
value: 46.479416666666665
- type: ndcg_at_1000
value: 48.63308333333334
- type: ndcg_at_3
value: 35.19483333333333
- type: ndcg_at_5
value: 37.68016666666667
- type: precision_at_1
value: 29.48725
- type: precision_at_10
value: 7.406499999999998
- type: precision_at_100
value: 1.2225833333333334
- type: precision_at_1000
value: 0.16108333333333336
- type: precision_at_3
value: 16.53375
- type: precision_at_5
value: 11.919416666666665
- type: recall_at_1
value: 24.613833333333332
- type: recall_at_10
value: 53.91766666666666
- type: recall_at_100
value: 79.18
- type: recall_at_1000
value: 93.85133333333333
- type: recall_at_3
value: 38.866166666666665
- type: recall_at_5
value: 45.21275000000001
- type: map_at_1
value: 12.328999999999999
- type: map_at_10
value: 20.078
- type: map_at_100
value: 21.166999999999998
- type: map_at_1000
value: 21.308
- type: map_at_3
value: 17.702
- type: map_at_5
value: 18.725
- type: mrr_at_1
value: 13.678
- type: mrr_at_10
value: 21.859
- type: mrr_at_100
value: 22.816
- type: mrr_at_1000
value: 22.926
- type: mrr_at_3
value: 19.378
- type: mrr_at_5
value: 20.385
- type: ndcg_at_1
value: 13.678
- type: ndcg_at_10
value: 24.993000000000002
- type: ndcg_at_100
value: 30.464999999999996
- type: ndcg_at_1000
value: 33.916000000000004
- type: ndcg_at_3
value: 19.966
- type: ndcg_at_5
value: 21.712999999999997
- type: precision_at_1
value: 13.678
- type: precision_at_10
value: 4.473
- type: precision_at_100
value: 0.784
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 9.181000000000001
- type: precision_at_5
value: 6.506
- type: recall_at_1
value: 12.328999999999999
- type: recall_at_10
value: 38.592
- type: recall_at_100
value: 63.817
- type: recall_at_1000
value: 89.67500000000001
- type: recall_at_3
value: 24.726
- type: recall_at_5
value: 28.959000000000003
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 25.106
- type: map_at_10
value: 33.367999999999995
- type: map_at_100
value: 34.586
- type: map_at_1000
value: 34.681
- type: map_at_3
value: 31.022
- type: map_at_5
value: 32.548
- type: mrr_at_1
value: 28.374
- type: mrr_at_10
value: 36.521
- type: mrr_at_100
value: 37.55
- type: mrr_at_1000
value: 37.614999999999995
- type: mrr_at_3
value: 34.509
- type: mrr_at_5
value: 35.836
- type: ndcg_at_1
value: 28.374
- type: ndcg_at_10
value: 37.893
- type: ndcg_at_100
value: 43.694
- type: ndcg_at_1000
value: 46.001999999999995
- type: ndcg_at_3
value: 33.825
- type: ndcg_at_5
value: 36.201
- type: precision_at_1
value: 28.374
- type: precision_at_10
value: 5.966
- type: precision_at_100
value: 0.9650000000000001
- type: precision_at_1000
value: 0.124
- type: precision_at_3
value: 14.774999999999999
- type: precision_at_5
value: 10.459999999999999
- type: recall_at_1
value: 25.106
- type: recall_at_10
value: 48.607
- type: recall_at_100
value: 74.66000000000001
- type: recall_at_1000
value: 91.562
- type: recall_at_3
value: 37.669999999999995
- type: recall_at_5
value: 43.484
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 13.755
- type: map_at_10
value: 20.756
- type: map_at_100
value: 22.05
- type: map_at_1000
value: 22.201
- type: map_at_3
value: 18.243000000000002
- type: map_at_5
value: 19.512
- type: mrr_at_1
value: 16.93
- type: mrr_at_10
value: 24.276
- type: mrr_at_100
value: 25.349
- type: mrr_at_1000
value: 25.441000000000003
- type: mrr_at_3
value: 21.897
- type: mrr_at_5
value: 23.134
- type: ndcg_at_1
value: 16.93
- type: ndcg_at_10
value: 25.508999999999997
- type: ndcg_at_100
value: 31.777
- type: ndcg_at_1000
value: 35.112
- type: ndcg_at_3
value: 20.896
- type: ndcg_at_5
value: 22.857
- type: precision_at_1
value: 16.93
- type: precision_at_10
value: 4.972
- type: precision_at_100
value: 0.963
- type: precision_at_1000
value: 0.145
- type: precision_at_3
value: 10.14
- type: precision_at_5
value: 7.536
- type: recall_at_1
value: 13.755
- type: recall_at_10
value: 36.46
- type: recall_at_100
value: 64.786
- type: recall_at_1000
value: 88.287
- type: recall_at_3
value: 23.681
- type: recall_at_5
value: 28.615000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 26.99
- type: map_at_10
value: 38.009
- type: map_at_100
value: 39.384
- type: map_at_1000
value: 39.481
- type: map_at_3
value: 34.593
- type: map_at_5
value: 36.449999999999996
- type: mrr_at_1
value: 31.81
- type: mrr_at_10
value: 41.943000000000005
- type: mrr_at_100
value: 42.914
- type: mrr_at_1000
value: 42.962
- type: mrr_at_3
value: 39.179
- type: mrr_at_5
value: 40.798
- type: ndcg_at_1
value: 31.81
- type: ndcg_at_10
value: 44.086
- type: ndcg_at_100
value: 50.026
- type: ndcg_at_1000
value: 51.903999999999996
- type: ndcg_at_3
value: 38.23
- type: ndcg_at_5
value: 40.926
- type: precision_at_1
value: 31.81
- type: precision_at_10
value: 7.761
- type: precision_at_100
value: 1.205
- type: precision_at_1000
value: 0.148
- type: precision_at_3
value: 17.537
- type: precision_at_5
value: 12.649
- type: recall_at_1
value: 26.99
- type: recall_at_10
value: 58.467
- type: recall_at_100
value: 83.93
- type: recall_at_1000
value: 96.452
- type: recall_at_3
value: 42.685
- type: recall_at_5
value: 49.341
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 25.312
- type: map_at_10
value: 35.788
- type: map_at_100
value: 37.616
- type: map_at_1000
value: 37.86
- type: map_at_3
value: 32.422000000000004
- type: map_at_5
value: 34.585
- type: mrr_at_1
value: 30.631999999999998
- type: mrr_at_10
value: 40.604
- type: mrr_at_100
value: 41.745
- type: mrr_at_1000
value: 41.788
- type: mrr_at_3
value: 37.582
- type: mrr_at_5
value: 39.499
- type: ndcg_at_1
value: 30.631999999999998
- type: ndcg_at_10
value: 42.129
- type: ndcg_at_100
value: 48.943
- type: ndcg_at_1000
value: 51.089
- type: ndcg_at_3
value: 36.658
- type: ndcg_at_5
value: 39.818999999999996
- type: precision_at_1
value: 30.631999999999998
- type: precision_at_10
value: 7.904999999999999
- type: precision_at_100
value: 1.664
- type: precision_at_1000
value: 0.256
- type: precision_at_3
value: 16.996
- type: precision_at_5
value: 12.727
- type: recall_at_1
value: 25.312
- type: recall_at_10
value: 54.886
- type: recall_at_100
value: 84.155
- type: recall_at_1000
value: 96.956
- type: recall_at_3
value: 40.232
- type: recall_at_5
value: 48.204
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 19.147
- type: map_at_10
value: 33.509
- type: map_at_100
value: 35.573
- type: map_at_1000
value: 35.769
- type: map_at_3
value: 27.983999999999998
- type: map_at_5
value: 31.012
- type: mrr_at_1
value: 43.844
- type: mrr_at_10
value: 56.24
- type: mrr_at_100
value: 56.801
- type: mrr_at_1000
value: 56.826
- type: mrr_at_3
value: 53.290000000000006
- type: mrr_at_5
value: 55.13
- type: ndcg_at_1
value: 43.844
- type: ndcg_at_10
value: 43.996
- type: ndcg_at_100
value: 50.965
- type: ndcg_at_1000
value: 53.927
- type: ndcg_at_3
value: 37.263000000000005
- type: ndcg_at_5
value: 39.553
- type: precision_at_1
value: 43.844
- type: precision_at_10
value: 13.687
- type: precision_at_100
value: 2.139
- type: precision_at_1000
value: 0.269
- type: precision_at_3
value: 28.122000000000003
- type: precision_at_5
value: 21.303
- type: recall_at_1
value: 19.147
- type: recall_at_10
value: 50.449999999999996
- type: recall_at_100
value: 74.00099999999999
- type: recall_at_1000
value: 90.098
- type: recall_at_3
value: 33.343
- type: recall_at_5
value: 40.744
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 8.773
- type: map_at_10
value: 21.172
- type: map_at_100
value: 30.244
- type: map_at_1000
value: 32.127
- type: map_at_3
value: 14.510000000000002
- type: map_at_5
value: 17.483
- type: mrr_at_1
value: 68.25
- type: mrr_at_10
value: 77.33
- type: mrr_at_100
value: 77.529
- type: mrr_at_1000
value: 77.536
- type: mrr_at_3
value: 75.708
- type: mrr_at_5
value: 76.72099999999999
- type: ndcg_at_1
value: 60.0
- type: ndcg_at_10
value: 48.045
- type: ndcg_at_100
value: 51.620999999999995
- type: ndcg_at_1000
value: 58.843999999999994
- type: ndcg_at_3
value: 52.922000000000004
- type: ndcg_at_5
value: 50.27
- type: precision_at_1
value: 68.25
- type: precision_at_10
value: 37.625
- type: precision_at_100
value: 11.774999999999999
- type: precision_at_1000
value: 2.395
- type: precision_at_3
value: 55.25
- type: precision_at_5
value: 47.599999999999994
- type: recall_at_1
value: 8.773
- type: recall_at_10
value: 27.332
- type: recall_at_100
value: 55.48499999999999
- type: recall_at_1000
value: 79.886
- type: recall_at_3
value: 15.823
- type: recall_at_5
value: 20.523
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 54.52999999999999
- type: f1
value: 47.396628088963645
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 85.397
- type: map_at_10
value: 90.917
- type: map_at_100
value: 91.109
- type: map_at_1000
value: 91.121
- type: map_at_3
value: 90.045
- type: map_at_5
value: 90.602
- type: mrr_at_1
value: 92.00399999999999
- type: mrr_at_10
value: 95.39999999999999
- type: mrr_at_100
value: 95.41
- type: mrr_at_1000
value: 95.41
- type: mrr_at_3
value: 95.165
- type: mrr_at_5
value: 95.348
- type: ndcg_at_1
value: 92.00399999999999
- type: ndcg_at_10
value: 93.345
- type: ndcg_at_100
value: 93.934
- type: ndcg_at_1000
value: 94.108
- type: ndcg_at_3
value: 92.32000000000001
- type: ndcg_at_5
value: 92.899
- type: precision_at_1
value: 92.00399999999999
- type: precision_at_10
value: 10.839
- type: precision_at_100
value: 1.1440000000000001
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 34.298
- type: precision_at_5
value: 21.128
- type: recall_at_1
value: 85.397
- type: recall_at_10
value: 96.375
- type: recall_at_100
value: 98.518
- type: recall_at_1000
value: 99.515
- type: recall_at_3
value: 93.59100000000001
- type: recall_at_5
value: 95.134
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 27.36
- type: map_at_10
value: 46.847
- type: map_at_100
value: 49.259
- type: map_at_1000
value: 49.389
- type: map_at_3
value: 41.095
- type: map_at_5
value: 44.084
- type: mrr_at_1
value: 51.852
- type: mrr_at_10
value: 61.67
- type: mrr_at_100
value: 62.395999999999994
- type: mrr_at_1000
value: 62.414
- type: mrr_at_3
value: 59.465
- type: mrr_at_5
value: 60.584
- type: ndcg_at_1
value: 51.852
- type: ndcg_at_10
value: 55.311
- type: ndcg_at_100
value: 62.6
- type: ndcg_at_1000
value: 64.206
- type: ndcg_at_3
value: 51.159
- type: ndcg_at_5
value: 52.038
- type: precision_at_1
value: 51.852
- type: precision_at_10
value: 15.370000000000001
- type: precision_at_100
value: 2.282
- type: precision_at_1000
value: 0.258
- type: precision_at_3
value: 34.721999999999994
- type: precision_at_5
value: 24.846
- type: recall_at_1
value: 27.36
- type: recall_at_10
value: 63.932
- type: recall_at_100
value: 89.824
- type: recall_at_1000
value: 98.556
- type: recall_at_3
value: 47.227999999999994
- type: recall_at_5
value: 53.724000000000004
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 40.655
- type: map_at_10
value: 63.824999999999996
- type: map_at_100
value: 64.793
- type: map_at_1000
value: 64.848
- type: map_at_3
value: 60.221000000000004
- type: map_at_5
value: 62.474
- type: mrr_at_1
value: 81.31
- type: mrr_at_10
value: 86.509
- type: mrr_at_100
value: 86.677
- type: mrr_at_1000
value: 86.682
- type: mrr_at_3
value: 85.717
- type: mrr_at_5
value: 86.21
- type: ndcg_at_1
value: 81.31
- type: ndcg_at_10
value: 72.251
- type: ndcg_at_100
value: 75.536
- type: ndcg_at_1000
value: 76.558
- type: ndcg_at_3
value: 67.291
- type: ndcg_at_5
value: 70.045
- type: precision_at_1
value: 81.31
- type: precision_at_10
value: 15.082999999999998
- type: precision_at_100
value: 1.764
- type: precision_at_1000
value: 0.19
- type: precision_at_3
value: 42.971
- type: precision_at_5
value: 27.956999999999997
- type: recall_at_1
value: 40.655
- type: recall_at_10
value: 75.41499999999999
- type: recall_at_100
value: 88.224
- type: recall_at_1000
value: 94.943
- type: recall_at_3
value: 64.456
- type: recall_at_5
value: 69.892
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 95.58120000000001
- type: ap
value: 93.0407063004784
- type: f1
value: 95.57849992996822
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 22.031
- type: map_at_10
value: 34.628
- type: map_at_100
value: 35.833
- type: map_at_1000
value: 35.881
- type: map_at_3
value: 30.619000000000003
- type: map_at_5
value: 32.982
- type: mrr_at_1
value: 22.736
- type: mrr_at_10
value: 35.24
- type: mrr_at_100
value: 36.381
- type: mrr_at_1000
value: 36.424
- type: mrr_at_3
value: 31.287
- type: mrr_at_5
value: 33.617000000000004
- type: ndcg_at_1
value: 22.736
- type: ndcg_at_10
value: 41.681000000000004
- type: ndcg_at_100
value: 47.371
- type: ndcg_at_1000
value: 48.555
- type: ndcg_at_3
value: 33.553
- type: ndcg_at_5
value: 37.771
- type: precision_at_1
value: 22.736
- type: precision_at_10
value: 6.625
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 14.331
- type: precision_at_5
value: 10.734
- type: recall_at_1
value: 22.031
- type: recall_at_10
value: 63.378
- type: recall_at_100
value: 89.47699999999999
- type: recall_at_1000
value: 98.48400000000001
- type: recall_at_3
value: 41.388000000000005
- type: recall_at_5
value: 51.522999999999996
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.75239398084815
- type: f1
value: 95.51228043205194
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 84.25900592795259
- type: f1
value: 62.14790420114562
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 78.47007397444519
- type: f1
value: 76.92133583932912
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 78.19098856758575
- type: f1
value: 78.10820805879119
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 44.37013684222983
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 42.003012591979704
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.70743071063257
- type: mrr
value: 33.938337390083994
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 6.369
- type: map_at_10
value: 14.313
- type: map_at_100
value: 18.329
- type: map_at_1000
value: 20.017
- type: map_at_3
value: 10.257
- type: map_at_5
value: 12.264999999999999
- type: mrr_at_1
value: 49.536
- type: mrr_at_10
value: 58.464000000000006
- type: mrr_at_100
value: 59.016000000000005
- type: mrr_at_1000
value: 59.053
- type: mrr_at_3
value: 56.294999999999995
- type: mrr_at_5
value: 57.766
- type: ndcg_at_1
value: 47.678
- type: ndcg_at_10
value: 38.246
- type: ndcg_at_100
value: 35.370000000000005
- type: ndcg_at_1000
value: 44.517
- type: ndcg_at_3
value: 43.368
- type: ndcg_at_5
value: 41.892
- type: precision_at_1
value: 49.536
- type: precision_at_10
value: 28.235
- type: precision_at_100
value: 9.014999999999999
- type: precision_at_1000
value: 2.257
- type: precision_at_3
value: 40.557
- type: precision_at_5
value: 36.409000000000006
- type: recall_at_1
value: 6.369
- type: recall_at_10
value: 19.195999999999998
- type: recall_at_100
value: 37.042
- type: recall_at_1000
value: 69.203
- type: recall_at_3
value: 11.564
- type: recall_at_5
value: 15.264
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 39.323
- type: map_at_10
value: 54.608999999999995
- type: map_at_100
value: 55.523
- type: map_at_1000
value: 55.544000000000004
- type: map_at_3
value: 50.580000000000005
- type: map_at_5
value: 53.064
- type: mrr_at_1
value: 44.263999999999996
- type: mrr_at_10
value: 57.416
- type: mrr_at_100
value: 58.037000000000006
- type: mrr_at_1000
value: 58.05200000000001
- type: mrr_at_3
value: 54.330999999999996
- type: mrr_at_5
value: 56.302
- type: ndcg_at_1
value: 44.263999999999996
- type: ndcg_at_10
value: 61.785999999999994
- type: ndcg_at_100
value: 65.40599999999999
- type: ndcg_at_1000
value: 65.859
- type: ndcg_at_3
value: 54.518
- type: ndcg_at_5
value: 58.53699999999999
- type: precision_at_1
value: 44.263999999999996
- type: precision_at_10
value: 9.652
- type: precision_at_100
value: 1.169
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 24.15
- type: precision_at_5
value: 16.848
- type: recall_at_1
value: 39.323
- type: recall_at_10
value: 80.663
- type: recall_at_100
value: 96.072
- type: recall_at_1000
value: 99.37700000000001
- type: recall_at_3
value: 62.23
- type: recall_at_5
value: 71.379
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 72.02499999999999
- type: map_at_10
value: 86.14500000000001
- type: map_at_100
value: 86.764
- type: map_at_1000
value: 86.776
- type: map_at_3
value: 83.249
- type: map_at_5
value: 85.083
- type: mrr_at_1
value: 82.83
- type: mrr_at_10
value: 88.70599999999999
- type: mrr_at_100
value: 88.791
- type: mrr_at_1000
value: 88.791
- type: mrr_at_3
value: 87.815
- type: mrr_at_5
value: 88.435
- type: ndcg_at_1
value: 82.84
- type: ndcg_at_10
value: 89.61200000000001
- type: ndcg_at_100
value: 90.693
- type: ndcg_at_1000
value: 90.752
- type: ndcg_at_3
value: 86.96199999999999
- type: ndcg_at_5
value: 88.454
- type: precision_at_1
value: 82.84
- type: precision_at_10
value: 13.600000000000001
- type: precision_at_100
value: 1.543
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 38.092999999999996
- type: precision_at_5
value: 25.024
- type: recall_at_1
value: 72.02499999999999
- type: recall_at_10
value: 96.21600000000001
- type: recall_at_100
value: 99.76
- type: recall_at_1000
value: 99.996
- type: recall_at_3
value: 88.57000000000001
- type: recall_at_5
value: 92.814
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 73.37297191949929
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 72.50752304246946
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.4479999999999995
- type: map_at_10
value: 17.268
- type: map_at_100
value: 20.502000000000002
- type: map_at_1000
value: 20.904
- type: map_at_3
value: 11.951
- type: map_at_5
value: 14.494000000000002
- type: mrr_at_1
value: 31.900000000000002
- type: mrr_at_10
value: 45.084999999999994
- type: mrr_at_100
value: 46.145
- type: mrr_at_1000
value: 46.164
- type: mrr_at_3
value: 41.6
- type: mrr_at_5
value: 43.76
- type: ndcg_at_1
value: 31.900000000000002
- type: ndcg_at_10
value: 27.694000000000003
- type: ndcg_at_100
value: 39.016
- type: ndcg_at_1000
value: 44.448
- type: ndcg_at_3
value: 26.279999999999998
- type: ndcg_at_5
value: 22.93
- type: precision_at_1
value: 31.900000000000002
- type: precision_at_10
value: 14.399999999999999
- type: precision_at_100
value: 3.082
- type: precision_at_1000
value: 0.436
- type: precision_at_3
value: 24.667
- type: precision_at_5
value: 20.200000000000003
- type: recall_at_1
value: 6.4479999999999995
- type: recall_at_10
value: 29.243000000000002
- type: recall_at_100
value: 62.547
- type: recall_at_1000
value: 88.40299999999999
- type: recall_at_3
value: 14.988000000000001
- type: recall_at_5
value: 20.485
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 80.37839336866843
- type: cos_sim_spearman
value: 79.14737320486729
- type: euclidean_pearson
value: 78.74010870392799
- type: euclidean_spearman
value: 79.1472505448557
- type: manhattan_pearson
value: 78.76735626972086
- type: manhattan_spearman
value: 79.18509055331465
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.98947740740309
- type: cos_sim_spearman
value: 76.52068694652895
- type: euclidean_pearson
value: 81.10952542010847
- type: euclidean_spearman
value: 76.52162808897668
- type: manhattan_pearson
value: 81.13752577872523
- type: manhattan_spearman
value: 76.55073892851847
- type: cos_sim_pearson
value: 84.99292517797305
- type: cos_sim_spearman
value: 76.52287451692155
- type: euclidean_pearson
value: 81.11616055544546
- type: euclidean_spearman
value: 76.525387473028
- type: manhattan_pearson
value: 81.14367598670032
- type: manhattan_spearman
value: 76.55571799438607
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 88.14795728641734
- type: cos_sim_spearman
value: 88.62720469210905
- type: euclidean_pearson
value: 87.96160445129142
- type: euclidean_spearman
value: 88.62615925428736
- type: manhattan_pearson
value: 87.86760858379527
- type: manhattan_spearman
value: 88.5613166629411
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 85.06444249948838
- type: cos_sim_spearman
value: 83.32346434965837
- type: euclidean_pearson
value: 83.86264166785146
- type: euclidean_spearman
value: 83.32323156068114
- type: manhattan_pearson
value: 83.87253909108084
- type: manhattan_spearman
value: 83.42760090819642
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.00847937091636
- type: cos_sim_spearman
value: 87.50432670473445
- type: euclidean_pearson
value: 87.21611485565168
- type: euclidean_spearman
value: 87.50387351928698
- type: manhattan_pearson
value: 87.30690660623411
- type: manhattan_spearman
value: 87.61147161393255
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.51456553517488
- type: cos_sim_spearman
value: 86.39208323626035
- type: euclidean_pearson
value: 85.74698473006475
- type: euclidean_spearman
value: 86.3892506146807
- type: manhattan_pearson
value: 85.77493611949014
- type: manhattan_spearman
value: 86.42961510735024
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.63402051628222
- type: cos_sim_spearman
value: 87.78994504115502
- type: euclidean_pearson
value: 88.44861926968403
- type: euclidean_spearman
value: 87.80670473078185
- type: manhattan_pearson
value: 88.4773722010208
- type: manhattan_spearman
value: 87.85175600656768
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 65.9659729672951
- type: cos_sim_spearman
value: 66.39891735341361
- type: euclidean_pearson
value: 68.040150710449
- type: euclidean_spearman
value: 66.41777234484414
- type: manhattan_pearson
value: 68.16264809387305
- type: manhattan_spearman
value: 66.31608161700346
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 86.91024857159385
- type: cos_sim_spearman
value: 87.35031011815016
- type: euclidean_pearson
value: 86.94569462996033
- type: euclidean_spearman
value: 87.34929703462852
- type: manhattan_pearson
value: 86.94404111225616
- type: manhattan_spearman
value: 87.37827218003393
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.89077927002596
- type: mrr
value: 96.94650937297997
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 57.994
- type: map_at_10
value: 70.07100000000001
- type: map_at_100
value: 70.578
- type: map_at_1000
value: 70.588
- type: map_at_3
value: 67.228
- type: map_at_5
value: 68.695
- type: mrr_at_1
value: 61.333000000000006
- type: mrr_at_10
value: 71.342
- type: mrr_at_100
value: 71.739
- type: mrr_at_1000
value: 71.75
- type: mrr_at_3
value: 69.389
- type: mrr_at_5
value: 70.322
- type: ndcg_at_1
value: 61.333000000000006
- type: ndcg_at_10
value: 75.312
- type: ndcg_at_100
value: 77.312
- type: ndcg_at_1000
value: 77.50200000000001
- type: ndcg_at_3
value: 70.72
- type: ndcg_at_5
value: 72.616
- type: precision_at_1
value: 61.333000000000006
- type: precision_at_10
value: 10.167
- type: precision_at_100
value: 1.117
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 28.111000000000004
- type: precision_at_5
value: 18.333
- type: recall_at_1
value: 57.994
- type: recall_at_10
value: 89.944
- type: recall_at_100
value: 98.667
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 77.694
- type: recall_at_5
value: 82.339
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81485148514851
- type: cos_sim_ap
value: 95.99339654021689
- type: cos_sim_f1
value: 90.45971329708354
- type: cos_sim_precision
value: 89.44281524926686
- type: cos_sim_recall
value: 91.5
- type: dot_accuracy
value: 99.81485148514851
- type: dot_ap
value: 95.990792367539
- type: dot_f1
value: 90.54187192118228
- type: dot_precision
value: 89.2233009708738
- type: dot_recall
value: 91.9
- type: euclidean_accuracy
value: 99.81386138613861
- type: euclidean_ap
value: 95.99403827746491
- type: euclidean_f1
value: 90.45971329708354
- type: euclidean_precision
value: 89.44281524926686
- type: euclidean_recall
value: 91.5
- type: manhattan_accuracy
value: 99.81485148514851
- type: manhattan_ap
value: 96.06741547889861
- type: manhattan_f1
value: 90.55666003976144
- type: manhattan_precision
value: 90.01976284584981
- type: manhattan_recall
value: 91.10000000000001
- type: max_accuracy
value: 99.81485148514851
- type: max_ap
value: 96.06741547889861
- type: max_f1
value: 90.55666003976144
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 79.0667992003181
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 49.57086425048946
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 53.929415255105894
- type: mrr
value: 54.93889790764791
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.050700527286658
- type: cos_sim_spearman
value: 31.46077656458546
- type: dot_pearson
value: 31.056448416258263
- type: dot_spearman
value: 31.435272601921042
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.23500000000000001
- type: map_at_10
value: 1.812
- type: map_at_100
value: 10.041
- type: map_at_1000
value: 24.095
- type: map_at_3
value: 0.643
- type: map_at_5
value: 1.0
- type: mrr_at_1
value: 86.0
- type: mrr_at_10
value: 92.0
- type: mrr_at_100
value: 92.0
- type: mrr_at_1000
value: 92.0
- type: mrr_at_3
value: 91.667
- type: mrr_at_5
value: 91.667
- type: ndcg_at_1
value: 79.0
- type: ndcg_at_10
value: 72.72
- type: ndcg_at_100
value: 55.82899999999999
- type: ndcg_at_1000
value: 50.72
- type: ndcg_at_3
value: 77.715
- type: ndcg_at_5
value: 75.036
- type: precision_at_1
value: 86.0
- type: precision_at_10
value: 77.60000000000001
- type: precision_at_100
value: 56.46
- type: precision_at_1000
value: 22.23
- type: precision_at_3
value: 82.667
- type: precision_at_5
value: 80.4
- type: recall_at_1
value: 0.23500000000000001
- type: recall_at_10
value: 2.046
- type: recall_at_100
value: 13.708
- type: recall_at_1000
value: 47.451
- type: recall_at_3
value: 0.6709999999999999
- type: recall_at_5
value: 1.078
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.252
- type: map_at_10
value: 7.958
- type: map_at_100
value: 12.293
- type: map_at_1000
value: 13.832
- type: map_at_3
value: 4.299
- type: map_at_5
value: 5.514
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 42.329
- type: mrr_at_100
value: 43.506
- type: mrr_at_1000
value: 43.506
- type: mrr_at_3
value: 38.775999999999996
- type: mrr_at_5
value: 39.592
- type: ndcg_at_1
value: 28.571
- type: ndcg_at_10
value: 20.301
- type: ndcg_at_100
value: 30.703999999999997
- type: ndcg_at_1000
value: 43.155
- type: ndcg_at_3
value: 22.738
- type: ndcg_at_5
value: 20.515
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 17.347
- type: precision_at_100
value: 6.327000000000001
- type: precision_at_1000
value: 1.443
- type: precision_at_3
value: 22.448999999999998
- type: precision_at_5
value: 19.184
- type: recall_at_1
value: 2.252
- type: recall_at_10
value: 13.206999999999999
- type: recall_at_100
value: 40.372
- type: recall_at_1000
value: 78.071
- type: recall_at_3
value: 5.189
- type: recall_at_5
value: 7.338
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 78.75399999999999
- type: ap
value: 19.666483622175363
- type: f1
value: 61.575187470329176
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 66.00452744765137
- type: f1
value: 66.18291586829227
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 51.308747717084316
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.81069321094355
- type: cos_sim_ap
value: 79.3576921453847
- type: cos_sim_f1
value: 71.75811286328685
- type: cos_sim_precision
value: 70.89878959567345
- type: cos_sim_recall
value: 72.63852242744063
- type: dot_accuracy
value: 87.79877212850927
- type: dot_ap
value: 79.35550320857683
- type: dot_f1
value: 71.78153446033811
- type: dot_precision
value: 70.76923076923077
- type: dot_recall
value: 72.82321899736148
- type: euclidean_accuracy
value: 87.80473266972642
- type: euclidean_ap
value: 79.35792655436586
- type: euclidean_f1
value: 71.75672148264161
- type: euclidean_precision
value: 70.99690082644628
- type: euclidean_recall
value: 72.53298153034301
- type: manhattan_accuracy
value: 87.76300888120642
- type: manhattan_ap
value: 79.33615959143606
- type: manhattan_f1
value: 71.73219978746015
- type: manhattan_precision
value: 72.23113964686998
- type: manhattan_recall
value: 71.2401055408971
- type: max_accuracy
value: 87.81069321094355
- type: max_ap
value: 79.35792655436586
- type: max_f1
value: 71.78153446033811
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.3778864439011
- type: cos_sim_ap
value: 86.79005637312795
- type: cos_sim_f1
value: 79.14617791685293
- type: cos_sim_precision
value: 76.66714780600462
- type: cos_sim_recall
value: 81.79088389282414
- type: dot_accuracy
value: 89.37206504443668
- type: dot_ap
value: 86.78770290102123
- type: dot_f1
value: 79.14741392159786
- type: dot_precision
value: 76.6897746967071
- type: dot_recall
value: 81.76778564829073
- type: euclidean_accuracy
value: 89.37594597741297
- type: euclidean_ap
value: 86.7900899669397
- type: euclidean_f1
value: 79.13920845898953
- type: euclidean_precision
value: 76.62028692956528
- type: euclidean_recall
value: 81.8293809670465
- type: manhattan_accuracy
value: 89.38758877634183
- type: manhattan_ap
value: 86.78862564973224
- type: manhattan_f1
value: 79.1130985653065
- type: manhattan_precision
value: 76.6592041597458
- type: manhattan_recall
value: 81.72928857406838
- type: max_accuracy
value: 89.38758877634183
- type: max_ap
value: 86.7900899669397
- type: max_f1
value: 79.14741392159786
- task:
type: STS
dataset:
name: MTEB AFQMC
type: C-MTEB/AFQMC
config: default
split: validation
revision: b44c3b011063adb25877c13823db83bb193913c4
metrics:
- type: cos_sim_pearson
value: 50.01571015887356
- type: cos_sim_spearman
value: 58.47419994907958
- type: euclidean_pearson
value: 55.63582004345212
- type: euclidean_spearman
value: 58.47514484211099
- type: manhattan_pearson
value: 55.58487268871911
- type: manhattan_spearman
value: 58.411916843600075
- task:
type: STS
dataset:
name: MTEB ATEC
type: C-MTEB/ATEC
config: default
split: test
revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
metrics:
- type: cos_sim_pearson
value: 44.99231617937922
- type: cos_sim_spearman
value: 55.459227458516416
- type: euclidean_pearson
value: 52.98483376548224
- type: euclidean_spearman
value: 55.45938733128155
- type: manhattan_pearson
value: 52.946854805143964
- type: manhattan_spearman
value: 55.4272663113618
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 52.946000000000005
- type: f1
value: 49.299873931232725
- task:
type: STS
dataset:
name: MTEB BQ
type: C-MTEB/BQ
config: default
split: test
revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
metrics:
- type: cos_sim_pearson
value: 74.66979530294986
- type: cos_sim_spearman
value: 77.59153258548018
- type: euclidean_pearson
value: 76.5862988380262
- type: euclidean_spearman
value: 77.59094368703879
- type: manhattan_pearson
value: 76.6034419552102
- type: manhattan_spearman
value: 77.6000715948404
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringP2P
type: C-MTEB/CLSClusteringP2P
config: default
split: test
revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
metrics:
- type: v_measure
value: 47.20931915009524
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringS2S
type: C-MTEB/CLSClusteringS2S
config: default
split: test
revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
metrics:
- type: v_measure
value: 45.787353610995474
- task:
type: Reranking
dataset:
name: MTEB CMedQAv1
type: C-MTEB/CMedQAv1-reranking
config: default
split: test
revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
metrics:
- type: map
value: 86.37146026784607
- type: mrr
value: 88.52309523809524
- task:
type: Reranking
dataset:
name: MTEB CMedQAv2
type: C-MTEB/CMedQAv2-reranking
config: default
split: test
revision: 23d186750531a14a0357ca22cd92d712fd512ea0
metrics:
- type: map
value: 87.40699302584699
- type: mrr
value: 89.51591269841269
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: map_at_1
value: 24.465
- type: map_at_10
value: 36.689
- type: map_at_100
value: 38.605000000000004
- type: map_at_1000
value: 38.718
- type: map_at_3
value: 32.399
- type: map_at_5
value: 34.784
- type: mrr_at_1
value: 37.234
- type: mrr_at_10
value: 45.634
- type: mrr_at_100
value: 46.676
- type: mrr_at_1000
value: 46.717
- type: mrr_at_3
value: 42.94
- type: mrr_at_5
value: 44.457
- type: ndcg_at_1
value: 37.234
- type: ndcg_at_10
value: 43.469
- type: ndcg_at_100
value: 51.048
- type: ndcg_at_1000
value: 52.925999999999995
- type: ndcg_at_3
value: 37.942
- type: ndcg_at_5
value: 40.253
- type: precision_at_1
value: 37.234
- type: precision_at_10
value: 9.745
- type: precision_at_100
value: 1.5879999999999999
- type: precision_at_1000
value: 0.183
- type: precision_at_3
value: 21.505
- type: precision_at_5
value: 15.729000000000001
- type: recall_at_1
value: 24.465
- type: recall_at_10
value: 54.559999999999995
- type: recall_at_100
value: 85.97200000000001
- type: recall_at_1000
value: 98.32499999999999
- type: recall_at_3
value: 38.047
- type: recall_at_5
value: 45.08
- task:
type: PairClassification
dataset:
name: MTEB Cmnli
type: C-MTEB/CMNLI
config: default
split: validation
revision: 41bc36f332156f7adc9e38f53777c959b2ae9766
metrics:
- type: cos_sim_accuracy
value: 84.50992182802165
- type: cos_sim_ap
value: 91.81488661281966
- type: cos_sim_f1
value: 85.46855802524294
- type: cos_sim_precision
value: 81.82207014542344
- type: cos_sim_recall
value: 89.4552256254384
- type: dot_accuracy
value: 84.50992182802165
- type: dot_ap
value: 91.80547588176556
- type: dot_f1
value: 85.46492111446794
- type: dot_precision
value: 81.95278969957081
- type: dot_recall
value: 89.29155950432546
- type: euclidean_accuracy
value: 84.49789536981359
- type: euclidean_ap
value: 91.81495039620808
- type: euclidean_f1
value: 85.46817317373308
- type: euclidean_precision
value: 81.93908193908193
- type: euclidean_recall
value: 89.31494037877017
- type: manhattan_accuracy
value: 84.46181599518941
- type: manhattan_ap
value: 91.85400573633447
- type: manhattan_f1
value: 85.54283809312146
- type: manhattan_precision
value: 81.51207115628971
- type: manhattan_recall
value: 89.99298573766659
- type: max_accuracy
value: 84.50992182802165
- type: max_ap
value: 91.85400573633447
- type: max_f1
value: 85.54283809312146
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: map_at_1
value: 68.072
- type: map_at_10
value: 76.82900000000001
- type: map_at_100
value: 77.146
- type: map_at_1000
value: 77.14999999999999
- type: map_at_3
value: 74.939
- type: map_at_5
value: 76.009
- type: mrr_at_1
value: 68.282
- type: mrr_at_10
value: 76.818
- type: mrr_at_100
value: 77.13600000000001
- type: mrr_at_1000
value: 77.14
- type: mrr_at_3
value: 74.956
- type: mrr_at_5
value: 76.047
- type: ndcg_at_1
value: 68.282
- type: ndcg_at_10
value: 80.87299999999999
- type: ndcg_at_100
value: 82.191
- type: ndcg_at_1000
value: 82.286
- type: ndcg_at_3
value: 77.065
- type: ndcg_at_5
value: 78.965
- type: precision_at_1
value: 68.282
- type: precision_at_10
value: 9.452
- type: precision_at_100
value: 1.002
- type: precision_at_1000
value: 0.101
- type: precision_at_3
value: 27.889000000000003
- type: precision_at_5
value: 17.682000000000002
- type: recall_at_1
value: 68.072
- type: recall_at_10
value: 93.467
- type: recall_at_100
value: 99.157
- type: recall_at_1000
value: 99.895
- type: recall_at_3
value: 83.14
- type: recall_at_5
value: 87.67099999999999
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: map_at_1
value: 26.107999999999997
- type: map_at_10
value: 78.384
- type: map_at_100
value: 81.341
- type: map_at_1000
value: 81.384
- type: map_at_3
value: 54.462999999999994
- type: map_at_5
value: 68.607
- type: mrr_at_1
value: 88.94999999999999
- type: mrr_at_10
value: 92.31
- type: mrr_at_100
value: 92.379
- type: mrr_at_1000
value: 92.38300000000001
- type: mrr_at_3
value: 91.85799999999999
- type: mrr_at_5
value: 92.146
- type: ndcg_at_1
value: 88.94999999999999
- type: ndcg_at_10
value: 86.00999999999999
- type: ndcg_at_100
value: 89.121
- type: ndcg_at_1000
value: 89.534
- type: ndcg_at_3
value: 84.69200000000001
- type: ndcg_at_5
value: 83.678
- type: precision_at_1
value: 88.94999999999999
- type: precision_at_10
value: 41.065000000000005
- type: precision_at_100
value: 4.781
- type: precision_at_1000
value: 0.488
- type: precision_at_3
value: 75.75
- type: precision_at_5
value: 63.93
- type: recall_at_1
value: 26.107999999999997
- type: recall_at_10
value: 87.349
- type: recall_at_100
value: 97.14699999999999
- type: recall_at_1000
value: 99.287
- type: recall_at_3
value: 56.601
- type: recall_at_5
value: 73.381
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: map_at_1
value: 50.7
- type: map_at_10
value: 61.312999999999995
- type: map_at_100
value: 61.88399999999999
- type: map_at_1000
value: 61.9
- type: map_at_3
value: 58.983
- type: map_at_5
value: 60.238
- type: mrr_at_1
value: 50.7
- type: mrr_at_10
value: 61.312999999999995
- type: mrr_at_100
value: 61.88399999999999
- type: mrr_at_1000
value: 61.9
- type: mrr_at_3
value: 58.983
- type: mrr_at_5
value: 60.238
- type: ndcg_at_1
value: 50.7
- type: ndcg_at_10
value: 66.458
- type: ndcg_at_100
value: 69.098
- type: ndcg_at_1000
value: 69.539
- type: ndcg_at_3
value: 61.637
- type: ndcg_at_5
value: 63.92099999999999
- type: precision_at_1
value: 50.7
- type: precision_at_10
value: 8.260000000000002
- type: precision_at_100
value: 0.946
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 23.1
- type: precision_at_5
value: 14.979999999999999
- type: recall_at_1
value: 50.7
- type: recall_at_10
value: 82.6
- type: recall_at_100
value: 94.6
- type: recall_at_1000
value: 98.1
- type: recall_at_3
value: 69.3
- type: recall_at_5
value: 74.9
- task:
type: Classification
dataset:
name: MTEB IFlyTek
type: C-MTEB/IFlyTek-classification
config: default
split: validation
revision: 421605374b29664c5fc098418fe20ada9bd55f8a
metrics:
- type: accuracy
value: 53.76683339746056
- type: f1
value: 40.026100192683714
- task:
type: Classification
dataset:
name: MTEB JDReview
type: C-MTEB/JDReview-classification
config: default
split: test
revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
metrics:
- type: accuracy
value: 88.19887429643526
- type: ap
value: 59.02998120976959
- type: f1
value: 83.3659125921227
- task:
type: STS
dataset:
name: MTEB LCQMC
type: C-MTEB/LCQMC
config: default
split: test
revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
metrics:
- type: cos_sim_pearson
value: 72.53955204856854
- type: cos_sim_spearman
value: 76.28996886746215
- type: euclidean_pearson
value: 75.31184890026394
- type: euclidean_spearman
value: 76.28984471300522
- type: manhattan_pearson
value: 75.36930361638623
- type: manhattan_spearman
value: 76.34021995551348
- task:
type: Reranking
dataset:
name: MTEB MMarcoReranking
type: C-MTEB/Mmarco-reranking
config: default
split: dev
revision: None
metrics:
- type: map
value: 23.63666512532725
- type: mrr
value: 22.49642857142857
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: map_at_1
value: 60.645
- type: map_at_10
value: 69.733
- type: map_at_100
value: 70.11699999999999
- type: map_at_1000
value: 70.135
- type: map_at_3
value: 67.585
- type: map_at_5
value: 68.904
- type: mrr_at_1
value: 62.765
- type: mrr_at_10
value: 70.428
- type: mrr_at_100
value: 70.77
- type: mrr_at_1000
value: 70.785
- type: mrr_at_3
value: 68.498
- type: mrr_at_5
value: 69.69
- type: ndcg_at_1
value: 62.765
- type: ndcg_at_10
value: 73.83
- type: ndcg_at_100
value: 75.593
- type: ndcg_at_1000
value: 76.05199999999999
- type: ndcg_at_3
value: 69.66499999999999
- type: ndcg_at_5
value: 71.929
- type: precision_at_1
value: 62.765
- type: precision_at_10
value: 9.117
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 26.323
- type: precision_at_5
value: 16.971
- type: recall_at_1
value: 60.645
- type: recall_at_10
value: 85.907
- type: recall_at_100
value: 93.947
- type: recall_at_1000
value: 97.531
- type: recall_at_3
value: 74.773
- type: recall_at_5
value: 80.16799999999999
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.25084061869536
- type: f1
value: 73.65064492827022
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.2595830531271
- type: f1
value: 77.15217273559321
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: map_at_1
value: 52.400000000000006
- type: map_at_10
value: 58.367000000000004
- type: map_at_100
value: 58.913000000000004
- type: map_at_1000
value: 58.961
- type: map_at_3
value: 56.882999999999996
- type: map_at_5
value: 57.743
- type: mrr_at_1
value: 52.400000000000006
- type: mrr_at_10
value: 58.367000000000004
- type: mrr_at_100
value: 58.913000000000004
- type: mrr_at_1000
value: 58.961
- type: mrr_at_3
value: 56.882999999999996
- type: mrr_at_5
value: 57.743
- type: ndcg_at_1
value: 52.400000000000006
- type: ndcg_at_10
value: 61.329
- type: ndcg_at_100
value: 64.264
- type: ndcg_at_1000
value: 65.669
- type: ndcg_at_3
value: 58.256
- type: ndcg_at_5
value: 59.813
- type: precision_at_1
value: 52.400000000000006
- type: precision_at_10
value: 7.07
- type: precision_at_100
value: 0.851
- type: precision_at_1000
value: 0.096
- type: precision_at_3
value: 20.732999999999997
- type: precision_at_5
value: 13.200000000000001
- type: recall_at_1
value: 52.400000000000006
- type: recall_at_10
value: 70.7
- type: recall_at_100
value: 85.1
- type: recall_at_1000
value: 96.39999999999999
- type: recall_at_3
value: 62.2
- type: recall_at_5
value: 66.0
- task:
type: Classification
dataset:
name: MTEB MultilingualSentiment
type: C-MTEB/MultilingualSentiment-classification
config: default
split: validation
revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
metrics:
- type: accuracy
value: 77.42333333333333
- type: f1
value: 77.24849313989888
- task:
type: PairClassification
dataset:
name: MTEB Ocnli
type: C-MTEB/OCNLI
config: default
split: validation
revision: 66e76a618a34d6d565d5538088562851e6daa7ec
metrics:
- type: cos_sim_accuracy
value: 80.12994044396319
- type: cos_sim_ap
value: 85.21793541189636
- type: cos_sim_f1
value: 81.91489361702128
- type: cos_sim_precision
value: 75.55753791257806
- type: cos_sim_recall
value: 89.44033790918691
- type: dot_accuracy
value: 80.12994044396319
- type: dot_ap
value: 85.22568672443236
- type: dot_f1
value: 81.91489361702128
- type: dot_precision
value: 75.55753791257806
- type: dot_recall
value: 89.44033790918691
- type: euclidean_accuracy
value: 80.12994044396319
- type: euclidean_ap
value: 85.21643342357407
- type: euclidean_f1
value: 81.8830242510699
- type: euclidean_precision
value: 74.48096885813149
- type: euclidean_recall
value: 90.91869060190075
- type: manhattan_accuracy
value: 80.5630752571738
- type: manhattan_ap
value: 85.27682975032671
- type: manhattan_f1
value: 82.03883495145631
- type: manhattan_precision
value: 75.92093441150045
- type: manhattan_recall
value: 89.22914466737065
- type: max_accuracy
value: 80.5630752571738
- type: max_ap
value: 85.27682975032671
- type: max_f1
value: 82.03883495145631
- task:
type: Classification
dataset:
name: MTEB OnlineShopping
type: C-MTEB/OnlineShopping-classification
config: default
split: test
revision: e610f2ebd179a8fda30ae534c3878750a96db120
metrics:
- type: accuracy
value: 94.47999999999999
- type: ap
value: 92.81177660844013
- type: f1
value: 94.47045470502114
- task:
type: STS
dataset:
name: MTEB PAWSX
type: C-MTEB/PAWSX
config: default
split: test
revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
metrics:
- type: cos_sim_pearson
value: 46.13154582182421
- type: cos_sim_spearman
value: 50.21718723757444
- type: euclidean_pearson
value: 49.41535243569054
- type: euclidean_spearman
value: 50.21831909208907
- type: manhattan_pearson
value: 49.50756578601167
- type: manhattan_spearman
value: 50.229118655684566
- task:
type: STS
dataset:
name: MTEB QBQTC
type: C-MTEB/QBQTC
config: default
split: test
revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7
metrics:
- type: cos_sim_pearson
value: 30.787794367421956
- type: cos_sim_spearman
value: 31.81774306987836
- type: euclidean_pearson
value: 29.809436608089495
- type: euclidean_spearman
value: 31.817379098812165
- type: manhattan_pearson
value: 30.377027186607787
- type: manhattan_spearman
value: 32.42286865176827
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 61.29839896616376
- type: cos_sim_spearman
value: 67.36328213286453
- type: euclidean_pearson
value: 64.33899267794008
- type: euclidean_spearman
value: 67.36552580196211
- type: manhattan_pearson
value: 65.20010308796022
- type: manhattan_spearman
value: 67.50982972902
- task:
type: STS
dataset:
name: MTEB STSB
type: C-MTEB/STSB
config: default
split: test
revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
metrics:
- type: cos_sim_pearson
value: 81.23278996774297
- type: cos_sim_spearman
value: 81.369375466486
- type: euclidean_pearson
value: 79.91030863727944
- type: euclidean_spearman
value: 81.36824495466793
- type: manhattan_pearson
value: 79.88047052896854
- type: manhattan_spearman
value: 81.3369604332008
- task:
type: Reranking
dataset:
name: MTEB T2Reranking
type: C-MTEB/T2Reranking
config: default
split: dev
revision: 76631901a18387f85eaa53e5450019b87ad58ef9
metrics:
- type: map
value: 68.109205221286
- type: mrr
value: 78.40703619520477
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: map_at_1
value: 26.704
- type: map_at_10
value: 75.739
- type: map_at_100
value: 79.606
- type: map_at_1000
value: 79.666
- type: map_at_3
value: 52.803
- type: map_at_5
value: 65.068
- type: mrr_at_1
value: 88.48899999999999
- type: mrr_at_10
value: 91.377
- type: mrr_at_100
value: 91.474
- type: mrr_at_1000
value: 91.47800000000001
- type: mrr_at_3
value: 90.846
- type: mrr_at_5
value: 91.18
- type: ndcg_at_1
value: 88.48899999999999
- type: ndcg_at_10
value: 83.581
- type: ndcg_at_100
value: 87.502
- type: ndcg_at_1000
value: 88.1
- type: ndcg_at_3
value: 84.433
- type: ndcg_at_5
value: 83.174
- type: precision_at_1
value: 88.48899999999999
- type: precision_at_10
value: 41.857
- type: precision_at_100
value: 5.039
- type: precision_at_1000
value: 0.517
- type: precision_at_3
value: 73.938
- type: precision_at_5
value: 62.163000000000004
- type: recall_at_1
value: 26.704
- type: recall_at_10
value: 83.092
- type: recall_at_100
value: 95.659
- type: recall_at_1000
value: 98.779
- type: recall_at_3
value: 54.678000000000004
- type: recall_at_5
value: 68.843
- task:
type: Classification
dataset:
name: MTEB TNews
type: C-MTEB/TNews-classification
config: default
split: validation
revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
metrics:
- type: accuracy
value: 51.235
- type: f1
value: 48.14373844331604
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringP2P
type: C-MTEB/ThuNewsClusteringP2P
config: default
split: test
revision: 5798586b105c0434e4f0fe5e767abe619442cf93
metrics:
- type: v_measure
value: 87.42930040493792
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringS2S
type: C-MTEB/ThuNewsClusteringS2S
config: default
split: test
revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
metrics:
- type: v_measure
value: 87.90254094650042
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: map_at_1
value: 54.900000000000006
- type: map_at_10
value: 64.92
- type: map_at_100
value: 65.424
- type: map_at_1000
value: 65.43900000000001
- type: map_at_3
value: 63.132999999999996
- type: map_at_5
value: 64.208
- type: mrr_at_1
value: 54.900000000000006
- type: mrr_at_10
value: 64.92
- type: mrr_at_100
value: 65.424
- type: mrr_at_1000
value: 65.43900000000001
- type: mrr_at_3
value: 63.132999999999996
- type: mrr_at_5
value: 64.208
- type: ndcg_at_1
value: 54.900000000000006
- type: ndcg_at_10
value: 69.41199999999999
- type: ndcg_at_100
value: 71.824
- type: ndcg_at_1000
value: 72.301
- type: ndcg_at_3
value: 65.79700000000001
- type: ndcg_at_5
value: 67.713
- type: precision_at_1
value: 54.900000000000006
- type: precision_at_10
value: 8.33
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 24.5
- type: precision_at_5
value: 15.620000000000001
- type: recall_at_1
value: 54.900000000000006
- type: recall_at_10
value: 83.3
- type: recall_at_100
value: 94.5
- type: recall_at_1000
value: 98.4
- type: recall_at_3
value: 73.5
- type: recall_at_5
value: 78.10000000000001
- task:
type: Classification
dataset:
name: MTEB Waimai
type: C-MTEB/waimai-classification
config: default
split: test
revision: 339287def212450dcaa9df8c22bf93e9980c7023
metrics:
- type: accuracy
value: 88.63
- type: ap
value: 73.78658340897097
- type: f1
value: 87.16764294033919
---
## gte-Qwen1.5-7B-instruct
**gte-Qwen1.5-7B-instruct** is the latest addition to the gte embedding family. The model is built on top of the [Qwen1.5-7B](https://huggingface.co/Qwen/Qwen1.5-7B) LLM and draws on its strong natural language processing capabilities. Enhanced through our embedding training techniques, the model incorporates several key advancements:
- Integration of bidirectional attention mechanisms, enriching its contextual understanding.
- Instruction tuning, applied solely on the query side for streamlined efficiency.
- Comprehensive training on a large, multilingual text corpus spanning diverse domains and scenarios. This training leverages both weakly supervised and supervised data, ensuring the model's applicability across numerous languages and a wide array of downstream tasks.
We also present [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) and [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5),
English embedding models that achieve state-of-the-art scores on the MTEB benchmark within the same model size category and support a context length of up to 8192 tokens.
## Model Information
- Model Size: 7B
- Embedding Dimension: 4096
- Max Input Tokens: 32k
## Requirements
```
transformers>=4.39.2
flash_attn>=2.5.6
```
## Usage
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("Alibaba-NLP/gte-Qwen1.5-7B-instruct", trust_remote_code=True)
# In case you want to reduce the maximum length:
model.max_seq_length = 8192
queries = [
"how much protein should a female eat",
"summit define",
]
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
query_embeddings = model.encode(queries, prompt_name="query")
document_embeddings = model.encode(documents)
scores = (query_embeddings @ document_embeddings.T) * 100
print(scores.tolist())
# [[70.00668334960938, 8.184843063354492], [14.62419319152832, 77.71407318115234]]
```
See [config_sentence_transformers.json](config_sentence_transformers.json) for all pre-built prompt names. Otherwise, you can pass a custom prompt of your choice with `model.encode(queries, prompt="Instruct: ...\nQuery: ")`.
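For instance, a minimal sketch of the custom-prompt route, continuing the snippet above (the task description is illustrative, not a prescribed prompt):
```python
# Illustrative task description; any one-sentence instruction can be used here.
custom_prompt = (
    "Instruct: Given a web search query, retrieve relevant passages that answer the query\n"
    "Query: "
)
# The prompt is prepended to each query before encoding; documents are encoded without it.
query_embeddings = model.encode(queries, prompt=custom_prompt)
document_embeddings = model.encode(documents)
scores = (query_embeddings @ document_embeddings.T) * 100
```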
### Transformers
```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def last_token_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
if left_padding:
return last_hidden_states[:, -1]
else:
sequence_lengths = attention_mask.sum(dim=1) - 1
batch_size = last_hidden_states.shape[0]
return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]
def get_detailed_instruct(task_description: str, query: str) -> str:
return f'Instruct: {task_description}\nQuery: {query}'
# Each query must come with a one-sentence instruction that describes the task
task = 'Given a web search query, retrieve relevant passages that answer the query'
queries = [
get_detailed_instruct(task, 'how much protein should a female eat'),
get_detailed_instruct(task, 'summit define')
]
# No need to add instruction for retrieval documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
input_texts = queries + documents
tokenizer = AutoTokenizer.from_pretrained('Alibaba-NLP/gte-Qwen1.5-7B-instruct', trust_remote_code=True)
model = AutoModel.from_pretrained('Alibaba-NLP/gte-Qwen1.5-7B-instruct', trust_remote_code=True)
max_length = 8192
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
# [[70.00666809082031, 8.184867858886719], [14.62420654296875, 77.71405792236328]]
```
## Evaluation
### MTEB & C-MTEB
You can use the [scripts/eval_mteb.py](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct/blob/main/scripts/eval_mteb.py) script to reproduce the following results of **gte-Qwen1.5-7B-instruct** on MTEB (English) / C-MTEB (Chinese):
| Model Name | MTEB(56) | C-MTEB(35) |
|:----:|:---:|:---:|
| [bge-base-en-1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 | - |
| [bge-large-en-1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 | - |
| [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 | - |
| [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 | - |
| [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 | - |
| [acge_text_embedding](https://huggingface.co/aspire/acge_text_embedding) | - | 69.07 |
| [stella-mrl-large-zh-v3.5-1792d](https://huggingface.co/infgrad/stella-mrl-large-zh-v3.5-1792d) | - | 68.55 |
| [gte-large-zh](https://huggingface.co/thenlper/gte-large-zh) | - | 66.72 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 | 56.21 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 | 58.81 |
| [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 | 60.81 |
| [**gte-Qwen1.5-7B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 | 69.52 |
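For a quick sanity check without the repository script, a minimal sketch using the generic `mteb` library is given below; this is an assumption on our part (not the authors' `eval_mteb.py`), and the task selection is purely illustrative:
```python
# Quick MTEB sanity check with the generic `mteb` package (assumed installed).
# This is not the repository's eval_mteb.py; task choice is illustrative only.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Alibaba-NLP/gte-Qwen1.5-7B-instruct", trust_remote_code=True)
evaluation = MTEB(tasks=["STS12", "SciFact"])
results = evaluation.run(model, output_folder="results/gte-Qwen1.5-7B-instruct")
print(results)
```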
## Citation
If you find our paper or models helpful, please consider citing:
```
@article{li2023towards,
title={Towards general text embeddings with multi-stage contrastive learning},
author={Li, Zehan and Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Pengjun and Zhang, Meishan},
journal={arXiv preprint arXiv:2308.03281},
year={2023}
}
``` | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
mini1013/master_item_top_el_flat | mini1013 | text-classification | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:klue/roberta-base",
"base_model:finetune:klue/roberta-base",
"model-index",
"region:us"
] | 2025-01-26T08:15:23 | 2025-01-26T08:15:48 | 1,345 | 0 | ---
base_model: klue/roberta-base
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 500666 차량용 가습기 소형 미니 사무실 탁상용 반중력 공기 물방울 향수 아로마 테라피 8 시간 작동 청정기 직송 500ml 선택01
black (#M)홈>생활/건강>자동차용품>편의용품>차량용가습기 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 차량용 가습기
- text: 해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이 (#M)디지털/가전>주방가전>믹서기 Naverstore
> 가전 > 주방가전 > 믹서기/블렌더 > 믹서기
- text: '[ 8/31입고예정] LG전자 24MP400 24인치모니터 IPS패널 FHD 슬림베젤 LED 모니터 컴퓨터모니터 사무용 인강용모니터 (#M)디지털/가전>모니터
Naverstore > 컴퓨터 > 모니터 > 화면크기별 > 26인치 이하'
- text: 콘에어 핸디형 스팀다리미 모음전 02. GS25PKK - 초강력 핸디스팀다리미 (#M)가전·컴퓨터>생활가전>다리미·미싱·기타>스팀다리미
Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 다리미·미싱·기타 > 스팀다리미
- text: '[ 가130만원대]LG 디오스 오브제컬렉션 냉장고 S834BW12 832L 1. S834BW12 11st > 가전/디지털 > 냉장고
> 양문형 > 양문형;(#M)11st>냉장고>양문형>양문형 11st > 가전/디지털 > 냉장고 > 양문형 > 양문형'
inference: true
model-index:
- name: SetFit with klue/roberta-base
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.9081549631816543
name: Accuracy
---
# SetFit with klue/roberta-base
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [klue/roberta-base](https://huggingface.co/klue/roberta-base) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
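A minimal inference sketch, assuming the standard `setfit` API (`SetFitModel.from_pretrained` / `predict`); the example text is taken from the widget entries above:
```python
# Minimal inference sketch using the standard SetFit API (`pip install setfit` assumed).
from setfit import SetFitModel

# Load the fine-tuned classifier from the Hub.
model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")

# Predict one of the 232 category labels for a product title (example from the widget above).
preds = model.predict([
    "해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이",
])
print(preds)
```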
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [klue/roberta-base](https://huggingface.co/klue/roberta-base)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 232 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 187.0 | <ul><li>'키친아트 전기후라이팬 사각 대형잔치팬 피자팬 빨간뚜껑후라이팬 잔치팬-KPP-6627 (#M)디지털/가전>주방가전>전기팬 GFK > Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기팬'</li><li>'코스트코 잔치팬 해마루 대형 사각 명절 전기 후라이팬 TC-3000 (#M)디지털/가전>주방가전>전기그릴 GFK > traverse > Naverstore > 가전 > 주방가전 > 전기그릴/팬'</li><li>'대원 특대형 사각 큰집잔치팬 전기팬 설날 추석 전부치는 후라이팬 DWP-530A (#M)디지털/가전>주방가전>전기팬 Naverstore > 가전 > 주방가전 > 전기그릴/팬'</li></ul> |
| 87.0 | <ul><li>'건조기배기호스 파이프 연장 배기관 주방 내경 호환 B. 11-10CM 어댑터 (#M)세탁기/건조기>세탁기 건조기 세트>세탁기 건조기 세트 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 세탁기 건조기 세트 > 세탁기 건조기 세트'</li><li>'건조기 세탁기 받침대 스토퍼 진동 소음 밀림방지패드 (#M)디지털/가전>생활가전>세탁기>일반세탁기 GFK > traverse > Naverstore > 가전 > 세탁기/건조기 > 일반세탁기'</li><li>'세탁기 받침대 4P 진동 소음 수평 높이 조절 냉장고 대형 4개 세트 (#M)디지털/가전>생활가전>세탁기>일반세탁기 GFK > traverse > Naverstore > 가전 > 세탁기/건조기 > 일반세탁기'</li></ul> |
| 37.0 | <ul><li>'바툼 회전 미니 온풍기 탁상용 소형 책상용 BTMH600 (#M)디지털/가전>계절가전>온풍기>전기온풍기 GFK > live > Naverstore > Shop Live > 테크 > 20241119 > 11:00 ~ 13:00'</li><li>'신일 전기히터 바닥용 탁상용 미니온풍기 [SEH-P20] (#M)계절가전>온풍기>전기온풍기 GFK > traverse > 11st > 가전/디지털 > 계절가전 > 온풍기'</li><li>'소싱 웜베이비 미니 온풍기 / 회전온풍기/ 탁상용 가정용 캠핑용 500W 베이비핑크 (#M)홈>전체상품 Naverstore > 디지털/가전 > 계절가전 > 온풍기'</li></ul> |
| 153.0 | <ul><li>'SK매직 GRA-850SRLNG(도시가스) SK매직 GRA-850SR LNG(도시가스) (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 가스레인지 > 스탠드형'</li><li>'SK매직 GRA-850SR (#M)홈>디지털/가전>주방가전>가스레인지>일반가스레인지 Naverstore > 가전 > 주방가전 > 가스레인지 > 스탠드형'</li><li>'(SK매직) 원터치 점화 가스레인지(2구) 레드 GRAC290R-본 LNG(도시가스) (#M)가전·컴퓨터>주방가전>전기·가스레인지>가스레인지 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기·가스레인지 > 가스레인지'</li></ul> |
| 167.0 | <ul><li>'한일전기 세이프티 UV 살균 식기건조기 세이프티 UV 살균 식기건조기+NPay 5천원 (#M)디지털/가전>주방가전>식기세척/건조기>식기건조기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기 > 살균건조기'</li><li>'칼 도마 살균기 도마 3종+칼5종 세트 살균 소독 분리형 슬림 칼5종+살균기 화이트에디션 세트 (#M)홈>디지털/가전>주방가전>식기세척/건조기>식기건조기 Naverstore > 가전 > 생활가전 > 살균소독기 > 살균건조기'</li><li>'락앤락 텀블러 살균 건조기 락앤락 텀블러 살균 건조기_그레이 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 위생관리 > 식기건조기'</li></ul> |
| 194.0 | <ul><li>'휴롬 착즙기 H430 저속착즙 H72ST-BFS02WH 코스트코 (#M)디지털/가전>주방가전>쥬서기/녹즙기 GFK > traverse > Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li><li>'휴롬 H300L 그레이 딥그린 코랄 딥그린 (#M)디지털/가전>주방가전>쥬서기/녹즙기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li><li>'제니퍼룸 스텐 착즙기 화이트 JO-M8101WH (#M)디지털/가전>주방가전>쥬서기/녹즙기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li></ul> |
| 210.0 | <ul><li>'QR코드 바코드스캐너 거치대포함 2D 1D 유무선 2D무선-블랙 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너'</li><li>'유무선 바코드스캐너 QR코드 서점바코드 가성비바코드스캐너 거치대포함 무선바코드스캐너 마트바코드 1D무선-아이보리 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너 > 일반 스캐너'</li><li>'유무선 바코드스캐너 QR코드 서점바코드 가성비바코드스캐너 거치대포함 무선바코드스캐너 마트바코드 2D유선-블랙 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너 > 일반 스캐너'</li></ul> |
| 143.0 | <ul><li>'필립스 헤어 드라이어 (BHD004/19) 필립스 헤어 드라이어 (BHD004/19) (#M)홈>헤어케어>헤어기기>헤어드라이기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 헤어드라이기'</li><li>'프리미엄케어 볼륨&케어 HV-7461K0 HV7461 [볼륨 마사지 디퓨저 / 파워모터 / 3 LotteOn > 뷰티 > 뷰티기기 > 헤어스타일러 LotteOn > 뷰티 > 뷰티기기 > 헤어스타일러 > 헤어드라이어'</li><li>'헤어드라이기추천 2000W 미니 가정용 전문가용 드라이기 비달사순 접이식 휴대용 여행용 모이스트랩 접이식 1201K (#M)디지털/가전>이미용가전>헤어기기>드라이어 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 드라이어'</li></ul> |
| 18.0 | <ul><li>'미니가습기필터 스위스윙거 가습기 램프 캔 레인우 호환가습기필터 110mm X 8mm (레인보우가습기용) (#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li><li>'가습기필터 미니 8mm 10mm 스프링 필터 신제품 더블 제트 공기 가습기 USB 대용량 가정 자동차 가습기필터 미니 8mm 10mm 스프링 필터 신제품 더블 제트 공기 가습기 USB 대용량 가정 자동차_05 spray humidif (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li><li>'가습기필터 미니 8mm 10mm 스프링 필터 100ML 가습기 아로마 에센셜 오일 디퓨저, 향수 디퓨져 가습기필터 미니 8mm 10mm 스프링 필터 100ML 가습기 아로마 에센셜 오일 디퓨저, 향수 디퓨져_08 2pcs Jasmine (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li></ul> |
| 179.0 | <ul><li>'도깨비 미니 와플메이커 MWM2200 와플메이커 (#M)11st>주방가전>전기쿠커>전기찜기 11st > 가전/디지털 > 주방가전 > 전기쿠커 > 전기찜기'</li><li>'쿠폰가 27.900 [GR-WAFPK] 쿠진아트 와플팬(GR-4NKR/GR-5KR/CGR-10KR 호환) (#M)디지털/가전>주방가전>와플제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 와플'</li><li>'(한성커머스)키친아트 렉스2구 크로플 와플기계 디저트메이커 KP-21JT 와플메이커 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li></ul> |
| 3.0 | <ul><li>'인텔 코어i5-13세대 13600K (랩터레이크) / 신품 벌크 / 쿨러X (#M)디지털/가전>PC부품>CPU GFK > Naverstore > 컴퓨터 > 부품 > CPU'</li><li>'인텔 코어i5-10세대 10400 (코멧레이크S) 벌크 쿨러포함 (#M)11st>PC부품>CPU>코어i5 11st > 가전/디지털 > PC부품 > CPU > 코어i5'</li><li>'인텔 CPU i5 4690 하스웰 리프레시 (#M)디지털/가전>PC부품>CPU Naverstore > 컴퓨터 > 부품 > CPU > 인텔'</li></ul> |
| 99.0 | <ul><li>'◆ GRAND SALE & ◆ 부라더미싱 TR14A /초급자추천모델 자동실끼우기 /수강증+서적 (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li><li>'브랜드 1위 혼스 미니재봉틀 HSSM-1201 한땀한땀 프로 한땀한땀 프로(핑크) (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li><li>'코스날 미니재봉틀 미니미싱 초간편 핸드미싱 휴대용 가정용 미싱기 아답터 받침대 추가가능 미니재봉틀 (아답터있음)+받침대 (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li></ul> |
| 25.0 | <ul><li>'신일 전기 컨벡터 SEH-P4000SS 컨벡터히터 동파방지 라디에이터 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 난방가전 > 라디에이터 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 난방가전 > 라디에이터'</li><li>'흥신 캠핑라디에이터 오르씨 500W 9월 캠핑용 난로 난방 캠핑용품 ORRCY-21 올블랙(가방제외) (#M)디지털/가전>계절가전>라디에이터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 라디에이터'</li><li>'흥신 라디에이터 오르씨 가정용에디션 국산 사무실 화장실 전기난로 7핀 13핀(1500W/4평) (#M)디지털/가전>계절가전>라디에이터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 라디에이터'</li></ul> |
| 96.0 | <ul><li>'미닉스 미니 건조기 PRO 3kg 수건 속옷 양말 클래식베이지 (#M)디지털/가전>생활가전>건조기/탈수기>의류건조기 GFK > Naverstore > 가전 > 세탁기/건조기 > 의류건조기'</li><li>'위닉스 컴팩트 4KG 건조기 HS2E400-MGK (#M)가전·컴퓨터>TV·냉장고·세탁기>세탁기·건조기>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 세탁기·건조기 > 그외 브랜드'</li><li>'[미닉스]미니 건조기 PRO 3kg 소형 빨래 원룸 자취 아기옷 클래식베이지 (#M)디지털/가전>생활가전>건조기/탈수기>의류건조기 Naverstore > 가전 > 세탁기/건조기 > 의류건조기'</li></ul> |
| 100.0 | <ul><li>'[SUMSEI] 섬세이 에어샤워 2세대 / 바디드라이어 자갈 블랙_1. 에어샤워 (#M)디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li><li>'보랄 에어타운 바디드라이어 BR-1320DR 전신건조기 (#M)홈>디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li><li>'에어드롭 헤어&바디드라이어 (고급형 HTM-2011) 고급형 (색상 : 그레이)_설치 필요 (#M)디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li></ul> |
| 21.0 | <ul><li>'LG 공기청정기 AS303DWFA NS홈 LG 공기청정기 AS303DWFA 무료배송 NS홈 (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li><li>'[LG전자]LG AS062PYHAR 에어로퍼니처 원형[32600111] 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li><li>'LG 퓨리케어 에어로타워 오브제(온풍겸용)FS061PSSA,FS061PGSA 네이처 그린 (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li></ul> |
| 2.0 | <ul><li>'게이밍 조립 컴퓨터 세트 조립PC 롤 발로란트 오버워치 배그 바른컴퓨터 본체 풀세트F11 본체 + 모니터 풀세트_F11 홈>디지털/가전>PC>조립/베어본PC;홈>[게임용 & 사무용 풀세트 PC];홈>전체상품;(#M)홈>[게임용 & 사무용 컴퓨터 PC] Naverstore > 컴퓨터 > 데스크탑 > 조립/반조립PC(베어본)'</li><li>'Beelink-미니 S PC 윈도우 11, 인텔 11th 셀러론 N5095 8GB DDR4 128GB/256GB SSD 데스크탑 게임용 컴퓨터 VS U59 GK 미니 J4125 Beelink-미니 S PC 윈도우 11 인텔 11th 셀러론 N5095 8GB DDR4_CHINA_16GB DDR4 256GB SSD+미국 (#M)가전·컴퓨터>노트북·데스크탑>브랜드PC·올인원>미니PC·기타 Tmon > 가전·디지털 > 가전·컴퓨터 > 노트북·데스크탑 > 브랜드PC·올인원 > 미니PC·기타'</li><li>'인텔 NUC 누크 11세대 타이거캐년 i5 프로세서 미니PC 베어본 NUC11TNKi5 (#M)11st>데스크톱>조립/베이본PC>코어 i5 11st > 가전/디지털 > 데스크톱 > 조립/베이본PC > 코어 i5'</li></ul> |
| 171.0 | <ul><li>'가디브 무지외반증 교정기 발가락링 엄지 발가락 통증 1등급 의료기기 대(15일 무료체험)_교정용 (#M)생활/건강>발건강용품>발가락교정기 GFK > Naverstore > 건강/의료용품 > 발건강용품'</li><li>'LG전자 오브제 컬렉션 양문형 냉장고 S634BB35Q (OK) MinSellAmount (#M)주방가전>냉장고/냉동고>양문형냉장고 Gmarket > 가전 > 주방가전 > 냉장고/냉동고 > 양문형냉장고'</li><li>'삼성전자 양문형 냉장고 RS84B5041M9 (846L) 서울지역 (#M)11st>냉장고>양문형>양문형 11st > 가전/디지털 > 냉장고 > 양문형 > 양문형'</li></ul> |
| 112.0 | <ul><li>'프로크리에이트 질감 인물화 브러쉬 9종 (+튜토리얼) 질감 인물화 브러쉬 9종 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li><li>'2024 굿노트 다이어리 날짜형 속지 아이패드 갤럭시탭 먼슬리 위클리 하이퍼링크 플래너 PDF 베이지블로썸_모눈+타임(월월)_첫구매자 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 유틸리티'</li><li>'[1분발송]리훈 오늘곰부 굿노트 스터디플래너 다이어리 속지 아이패드 양식 노타빌리티 PDF 필기 1.오늘곰부_오른손잡이용 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 유틸리티'</li></ul> |
| 180.0 | <ul><li>'키친아트 요거트메이커 용기8개 온도설정 디지털D3081 (#M)홈>디지털/가전>주방가전>요구르트제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li><li>'주코 라미 요거트메이커 ZY-ZC501M 주코 라미 요거트메이커 ZY-ZC501M (#M)디지털/가전>주방가전>요구르트제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li><li>'키친아트 요거트메이커 그릭요거트만들기 기계 요거메이트 요구르트제조기 옵션1. 500ml (#M)홈>🔴 디지털가전 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li></ul> |
| 60.0 | <ul><li>'[ 가 118만원✅SSD 무상업글] 삼성 갤럭시북2 프로 NT930XEW-A51A 엔씨디 빠르고 가벼운 휴대용 대학생 사무용 문서작업 튼튼한 최신 인텔12세대 13.3 노트북 실버 컬러 (W-A51AS)_무선 마우스+파우치+액정보호 필름+키스킨_NVMe 500G 개봉장착+256G 추가동봉 (#M)홈>▼ 추천 노트북>가벼운 노트북 추천 Naverstore > 컴퓨터 > 노트북 > 삼성갤럭시북'</li><li>'삼성전자 노트북 플러스2 NT550XDA-K14A 정품Win11탑재 인강용 사무용 재택 노트북 화이트(NVMe 128GB+RAM 4GB) (#M)11st>노트북>삼성전자>AMD 11st > 가전/디지털 > 노트북 > 삼성전자 > AMD'</li><li>'[LG] 노트북 가성비부터 최고사양 노트북모음. 002.LG울트라PC 15UD40R-GX56K (#M)가전·컴퓨터>노트북·데스크탑>노트북>일반·사무용 Tmon > 가전·디지털 > 가전·컴퓨터 > 노트북·데스크탑 > 노트북 > 일반·사무용'</li></ul> |
| 138.0 | <ul><li>'[세정액 2개] 브라운 전기면도기 최신 시리즈9 PRO PLUS 충전 세척스테이션 구성 그라파이트[9F65]+세정액2개[BO31] (#M)이미용가전>전기면도기>남성용 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 전기면도기 > 남성용'</li><li>'손흥민에디션 질레트 랩스 딥클렌징바 면도기 (핸들+1입면도날+거치대+쉐이빙젤) (#M)디지털/가전>이미용가전>면도기소모품>기타면도기소모품 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 면도기/면도용품'</li><li>'질레트 프로쉴드 옐로우 파워 면도기 (핸들+1입날) [저자극+밀착 면도] 질레트 프로쉴드 옐로우 파워 면도기 (핸들+1입날) [저자극+밀착 면도] 홈>바디케어>데오/제모>면도기;홈;홈>남성>쉐이빙>면도기/면도날;홈>바디케어>제모용품>면도기;홈>바디케어>제모용품>면도기/제모의료기기;(#M)홈>바디케어>제모/왁싱>남성 쉐이빙 OLIVEYOUNG > 남성 > 쉐이빙 > 면도기/면도날'</li></ul> |
| 6.0 | <ul><li>'이엠텍 지포스 RTX 4060 STORM X Dual D6 8GB.~ (#M)PC부품>그래픽카드>지포스(nVidia) GFK > traverse > 11st > 가전/디지털 > PC부품 > 그래픽카드 > 지포스(nVidia)'</li><li>'노트북 DDR3 4G PC3 10600S 램 삼성 정품 (#M)디지털/가전>PC부품>RAM>노트북용 GFK > Naverstore > 컴퓨터 > 부품 > RAM'</li><li>'삼성전자 DDR4 16GB PC4 - 21300(2666V) 데스크탑 메모리 삼성 16GB 21300(2666V) (#M)디지털/가전>PC부품>RAM>데스크탑용 GFK > Naverstore > 컴퓨터 > 부품 > RAM > 데스크탑용'</li></ul> |
| 79.0 | <ul><li>'에버넷 디지털도어락 현관문도어락 현관도어락 터치키 번호키 EN250-N EN250N(카드키 없음) (#M)디지털/가전>생활가전>디지털도어록>보조키형 Naverstore > 가전 > 생활가전 > 디지털도어록 > 보조키형'</li><li>'도어락 스티커 카드키 태그 RFID RF 디지털 도어록 터치 13.56Mhz 라벨 스티커 태그 05.메탈 스티커 태그B(No.100T) (#M)홈>RFID 태그&카드👍 Naverstore > 가전 > 생활가전 > 디지털도어록 > 보조키형'</li><li>'삼성도어락카드키 SDS 스티커 부착형 카드키 아파트 현관 삼성 도어락카드키 부착형 (화이트) 랜덤발송 홈>카드키;홈>전체상품;(#M)홈>도어락 카드키 Naverstore > 가전 > 생활가전 > 디지털도어록 > 주키형'</li></ul> |
| 11.0 | <ul><li>'파워 파워서플라이 컴퓨터파워 앱코 SUITMASTER SETTLER 700W 화이트 벌크 (#M)디지털/가전>PC부품>파워서플라이>ATX파워 GFK > Naverstore > 컴퓨터 > 부품 > 파워서플라이 > ATX파워'</li><li>'darkFlash UPMOST 850W 80PLUS GOLD FULL MODULAR 블랙 (#M)11st>PC부품>파워>ATX파워 11st > 가전/디지털 > PC부품 > 파워 > ATX파워'</li><li>'오랄비 스테이지스 파워 어린이 전동칫솔 유아 겨울왕국 D12K 겨울왕국 전동칫솔 (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > naver_plus_traverse > Naverstore > 가전 > 욕실가전 > 전동칫솔'</li></ul> |
| 216.0 | <ul><li>'BS 니콘정품 Z30 16-50mm KIT 새상품 (#M)디지털/가전>카메라/캠코더용품>미러리스디카 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 미러리스카메라'</li><li>'파나소닉 루믹스 DC-S9 + S 18-40mm KIT 정품/TR 다크 올리브 (#M)카메라/주변기기>미러리스카메라>미러리스카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 미러리스카메라 > 미러리스카메라'</li><li>'시그마 (Sigma) SIGMA 풀 사이즈 미러리스 SLR 카메라 fp 바디 (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>DSLR GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > DSLR'</li></ul> |
| 43.0 | <ul><li>'신일 컨백션 전기히터 컨벡터 컨벡션 온열기 난로 가정용 사무실 리모컨 온도조절 안전 SEH-C310 (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li><li>'신일 전기 컨벡션 히터 컨벡터 동파방지 벽걸이라디에이터 대류식난방기 T15HSS 신일 컨벡터 T15HSS (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li><li>'밀 MILL 북유럽 가정용 전기 컨벡터 히터 타이머 온풍기 전기난로 MILL1900TMAX (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li></ul> |
| 53.0 | <ul><li>'IPTIME EFM네트웍스 아이피타임 A3000U 무선랜카드 NPAYMALL (#M)디지털/가전>네트워크장비>랜카드>무선랜카드 Naverstore > 컴퓨터 > 주변기기 > 랜카드 > 무선'</li><li>'EFM ipTIME A3000UA USB 무선 랜카드 (#M)홈>디지털/가전>네트워크장비>랜카드>무선랜카드 Naverstore > 컴퓨터 > 주변기기 > 랜카드 > 무선'</li><li>'EFM ipTIME U1G-C USB 3.0 기가비트 랜카드 (#M)컴퓨터 주변기기>네트워크장비>LAN카드 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 네트워크장비 > LAN카드'</li></ul> |
| 117.0 | <ul><li>'IFI ZEN Air DAC '</li><li>'아이리버 SE300 포터블 하이엔드 DAP.R-2R DAC . Class A AMP (#M)음향가전>기타 음향기기>음향기기 기타 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 기타 음향기기 > 음향기기 기타'</li><li>'오닉스 Onix Mystic XP1 DAC-AMP [한국총판] 해외배송 (설 연휴 이후 발송)_뮤직파이 공구 Mystic XP1 (#M)디지털/가전>음향가전>DAC GFK > traverse > Naverstore > 디지털 > 음향기기 > 플레이어 > 기타'</li></ul> |
| 183.0 | <ul><li>'LG전자 엘지 B101W14 B101S14 일반냉장고 소형 미니 입원실 원룸 사무실 B101S14(샤인) (#M)11st>냉장고>일반형>일반형 11st > 가전/디지털 > 냉장고 > 일반형 > 일반형'</li><li>'윈텍 WC-32CGN 레트로 냉장고 무소음 그린 32L 음료냉장고 가정용 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li><li>'소형 냉장고 기숙사 중형 미니 사무실 가정용 간식보관 모텔 스마트 07)더블도어/80A168/실버/과일케이스 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li></ul> |
| 211.0 | <ul><li>'아우스 V-REX 컴퓨터 게이밍의자 발받침 높이 각도조절 게임용 PC방 의자 화이트 (#M)가구/인테리어>서재/사무용가구>의자>목받침의자 GFK > Naverstore > 디지털 > 게이밍 > 게이밍가구 > 게이밍의자'</li><li>'Qwertykeys QK65v2 추가 파츠 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > traverse > Naverstore > 컴퓨터 > 부품 > 튜닝용품 > 기타튜닝용품'</li><li>'레노버 샤오신패드 프로 12.7 8+128GB Pad Pro 2023년 내수롬 8+128GB 그레이 (#M)디지털/가전>태블릿PC GFK > traverse > Naverstore > 컴퓨터 > 노트북 > 태블릿PC'</li></ul> |
| 225.0 | <ul><li>'프리즘 LED 스탠드 PL-1400 충전식 무선 시력보호 듀얼헤드 각도조절 책상 조명 (#M)디지털/가전>생활가전>스탠드>LED스탠드 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전 > 기타생활가전'</li><li>'Holy Stone ID 13.9g 리모트 외장 발신기 드론 등록 제도 대응 국토 교통성 대응 모델 5시간 (#M)SSG.COM>카메라/캠코더>촬영용 드론 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 촬영용 드론'</li><li>'라미 3WAY 대형 카메라 스마트폰 삼각대 RM-MT180 (4단/180cm) 라미 삼각대 RM-MT180 PRO(3단) (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > traverse > Naverstore > 디지털 > 카메라 > 삼각대/헤드 > 삼각대'</li></ul> |
| 4.0 | <ul><li>'랜선 랜케이블 인터넷선 UTP LAN 선 다이렉트 인터넷 연결선 CAT.5E 0.3m 7.CAT8 SFTP (40G) 고품질_2m 블랙 홈>케이블(영상,음성,통신)>랜 케이블;(#M)홈>디지털/가전>PC부품>PC케이블>랜케이블 Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li><li>'키크론 프리미엄 기계식 키보드 항공 케이블 코일 USB C타입키크론 항공케이블 스트레이트_퍼플 (#M)가전·컴퓨터>PC부품·주변기기>기타 부품 Tmon > 가전·디지털 > 가전·컴퓨터 > PC부품·주변기기 > 기타 부품'</li><li>'마하링크 스테레오 AUX 고급형 케이블 1M ML-STH010 (#M)디지털/가전>PC부품>PC케이블>오디오케이블 Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li></ul> |
| 98.0 | <ul><li>'카리스 자외선 살균기 소독기 KRS-989 10리터 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li><li>'(기념 중) 국산 다용도 이동 공간살균기(아래 동영상 참조) 집먼지진드기퇴치 세균박멸등 특허등록 CE인증 자외선 UVC led 엔퓨텍 XD-2D04 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li><li>'모스티브 탁상용 철제류 네일 살균기 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li></ul> |
| 26.0 | <ul><li>'부산보일러 린나이 RC610-N-15KFN 친환경 콘덴싱 창원김해울산양산 설치 교체 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 계절가전 > 보일러'</li><li>'대성보일러 DNC1-15D 서울 의정부 남양주 강북구 도봉구 노원구 수리 교체 당일 설치 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 계절가전 > 보일러'</li><li>'삼양 구동기 CEC VA-200 / 지멘스 구동기 삼양 커넥터 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 보일러'</li></ul> |
| 106.0 | <ul><li>'부성핫슈 핸드드라이어 BSHD-2807 드라이 손건조기 업소용 초강력 손건조 화이트(WD-07) (#M)홈>디지털/가전>생활가전>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li><li>'모두의만물 초고속핸드드라이어 HTM-350 전면LED 강력한바람 온풍 2,1000W 일반 HTM-350[2100W] (#M)디지털/가전>생활가전>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li><li>'다이슨 에어블레이드 핸드드라이어 V / 니켈 1번-왼쪽_선택안함 홈>다이슨 핸드드라이어;(#M)홈>환경위생>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li></ul> |
| 68.0 | <ul><li>'[로지텍] Logitech C920 PRO HD WebCam 웹캠 화상카메라 벌크 택배 병행 당일출고 C920 (#M)디지털/가전>멀티미디어장비>웹캠 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li><li>'앱코 ABKO, APC720 Lite HD 웹캠 화상카메라 캠 컴퓨터카메라 (#M)디지털/가전>멀티미디어장비>웹캠 Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li><li>'프리에이티브 고해상도 웹캠 AF500FHD 500만화소 풀HD 줌 온라인 수업 구루미캠 1080P 60FPS 하이엔드 AFC80FHD (#M)디지털/가전>멀티미디어장비>웹캠 Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li></ul> |
| 101.0 | <ul><li>'빅버튼 유선전화기사무실 회사 집 가정용 발신자표시 선택1 : OID-500 (#M)홈>전체상품 Naverstore > 가전 > 생활가전 > 전화기 > 유선'</li><li>'전화기선 키폰 수화기선 줄 코드 전화선 케이블 송수화기선 전화기선-검정 (#M)디지털/가전>생활가전>전화기>유선전화기 GFK > Naverstore > 가전 > 생활가전 > 전화기'</li><li>'맥슨 유선 전화기 집 사무실 일반전화기 옛날 (#M)디지털/가전>생활가전>전화기>유선전화기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전 > 기타생활가전'</li></ul> |
| 156.0 | <ul><li>'[초강력세척] 비안크루세 가정용 야채 과일 초음파 세척기 '</li><li>'클로베리 프리미엄 과일야채 살균세척기 '</li><li>'리비다 채칼 전동 자동 만능 오토 돌돌이 슬라이서 야채 양배추 당근 감자 무 채써는기계 (#M)디지털/가전>주방가전>기타주방가전 GFK > traverse > Naverstore > 가전 > 주방가전'</li></ul> |
| 41.0 | <ul><li>'매장판 온라인 단독 오엘라 제습기 01. 오엘라 소형 제습기 SD01 (#M)가전·컴퓨터>계절가전>제습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 제습기'</li><li>'ThinkAir DL12 제습기 (#M)11st>계절가전>제습기>가정용 11st > 가전/디지털 > 계절가전 > 제습기 > 가정용'</li><li>'삼성 제습기 1등급 인버터 원룸 미니 AY18CG7500GED 베이지 18L 23년 신형 세이지 그린 (#M)홈>✨23년 NEW 제습기✨ Naverstore > 가전 > 계절가전 > 제습기'</li></ul> |
| 173.0 | <ul><li>'(+1.8L 컨테이너 볼 추가 증정) 콘체 X5 초강력 블렌더 카페믹서기 업소용 블렌더 티타늄코팅 칼날 '</li><li>'신일 대용량믹서기 4500ml 스텐레스/김장/대형/업소용 '</li><li>'최신형 vitamix 바이타믹스 콰이어트원 블랜더+추가볼 (에어레이팅볼 선택) /정품 '</li></ul> |
| 27.0 | <ul><li>'가습기 미니가습기 가열실가습기 천연가습기 대용량가습기 복합식가습기 안티 중력 800ML UV 공기 청정기 가습기 미니가습기 가열실가습기 천연가습기 대용량가습기 복합식가습기 안티 중력 800ML UV 공기 청정기_02 800ml Light Green (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'가습기 불멍 타워형 복합식 대용량 생수병 거실용 미니 아로마 케어 침실 용 대형 룸 (2L 워터 탱크) 쿨 미스트 탑 필 (에센셜 오일 디퓨저 포함) 가습기 불멍 타워형 복합식 대용량 생수병 거실용 미니 아로마_white_JP 플러그 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'9L 대용량 복합식 가열식 THE완벽한가습기 AMH 9000 /23년형 상부급수 통세척 2 원대 프리미엄 무선 물걸레청소기 글라이드S AMC-2500 전용거치대+세탁 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li></ul> |
| 97.0 | <ul><li>'에어베리 스마트 의류관리기 2set 세트 구성 향기 1대+살균 1대+향기블럭3개+제습겔1팩_코코브리즈 (3개) (#M)홈>스마트 의류관리기 Naverstore > 가전 > 세탁기/건조기 > 의류관리기'</li><li>'LG 올 뉴 스타일러 오브제컬렉션 SC5GMR81H 상의 5벌 + 하의 1벌 블랙틴트미러 (GD) (#M)세탁기/건조기>의류관리기>의류관리기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 의류관리기'</li><li>'LG S5BBU 스타일러 5벌+바지 1벌 / KN (#M)세탁기/건조기>의류관리기>의류관리기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 의류관리기'</li></ul> |
| 170.0 | <ul><li>'자일렉 가정용 소프트 아이스크림메이커 ZL-214S '</li><li>'소프트아이스크림기계 메이커 업소용 상하목장 카페 테이블 요거트아이스크림 머신 콘 정품AS '</li><li>'브레빌 아이스크림 메이커 스마트 스쿱 BCI600 (#M)디지털/가전>주방가전>아이스크림제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 아이스크림'</li></ul> |
| 158.0 | <ul><li>'쁘띠냉장고 수납물 선반 위 냉장고 상단 공간선반 주방선반 조가비 층칸막이 냉장고에서 T19-밀리터리그린 화이트 헐렁헐값_선택하세요 (#M)홈>디지털/가전>주방가전>냉장고>일반형냉장고 Naverstore > 가전 > 냉장고 > 3,4도어'</li><li>'저온창고 도어 문 Haier 냉장고 씰 스트립 고무 링 마그네틱 흡입 원래 액세서리 범용 가죽 클로저 134 단일 도어 "업그레이드 두껍게-강한 자기 매력" (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li><li>'저온창고 도어 문 Haier 냉장고 씰 스트립 고무 링 마그네틱 흡입 원래 액세서리 범용 가죽 클로저 134 옆집 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li></ul> |
| 8.0 | <ul><li>'[공식몰/ ] GIGABYTE B760M DS3H D4 피씨디렉트 (#M)11st>PC부품>메인보드>인텔 CPU용 11st > 가전/디지털 > PC부품 > 메인보드 > 인텔 CPU용'</li><li>'[공식몰/ ] GIGABYTE B760M AORUS ELITE 피씨디렉트 (#M)11st>PC부품>메인보드>인텔 CPU용 11st > 가전/디지털 > PC부품 > 메인보드 > 인텔 CPU용'</li><li>'[ASRock] B660M Pro RS D4 디앤디컴 (인텔B660/M-ATX) (#M)디지털/가전>PC부품>메인보드>인텔CPU용 GFK > Naverstore > 컴퓨터 > 부품 > 메인보드 > 인텔용'</li></ul> |
| 95.0 | <ul><li>'삼성전자 삼성 VC33M3120LU 싸이클론 진공청소기 안티탱글 3중청정클린 슬라이드핸들 (#M)디지털/가전>생활가전>청소기>유선청소기 Naverstore > 가전 > 청소기 > 진공청소기'</li><li>'[LG 공식판매점] 슈퍼 싸이킹 III 청소기 K83RG (#M)홈>생활가전>싸이킹 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 유선청소기'</li><li>'LG전자 유선 최강흡입 통돌이 진공청소기 홈>디지털/가전>생활가전>청소기>유선청소기;(#M)홈>전체상품 Naverstore > 가전 > 청소기 > 유선청소기'</li></ul> |
| 67.0 | <ul><li>'4k HDMI USB 2.0 캡쳐보드 화면 녹화 obs 게임 스크린 캡처 방송 닌텐도 스위치 02_USB 3.0 (#M)디지털/가전>멀티미디어장비>영상편집카드>영상편집 GFK > Naverstore > 가전 > 영상가전 > 액세서리 > 영상편집카드'</li><li>'엠비에프 MBF-UHCP-C '</li><li>'AVerMedia GC553 외장형 캡쳐카드 4K 캡쳐보드 '</li></ul> |
| 29.0 | <ul><li>'캠핑 선풍기 캠핑용 써큘레이터 무선 충전식 무드등 차박 탁상용선풍기 캠핑선풍기+수납가방 (#M)홈>디지털/가전>계절가전>선풍기>탁상형선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 미니선풍기'</li><li>'프롬비 사일런트 스톰 저소음 무선 휴대용선풍기 FA135 SilentStorm(거치대형) 인디핑크 (#M)디지털/가전>계절가전>선풍기>휴대용선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 휴대용'</li><li>'신일 캠핑용선풍기 캠핑선풍기 무선 휴대용 야외용 충전식 12인치 선풍기 캠핑장 12인치+가방 / 무선 / 아이보리색 홈>디지털/가전>계절가전>선풍기>휴대용선풍기;(#M)홈>디지털/가전>계절가전>선풍기>탁상형선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 탁상형'</li></ul> |
| 63.0 | <ul><li>'게이밍 게임 스탠딩 마이크 배그 디스코드 컴퓨터 JTUM400 실버 단품 실버단품 (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li><li>'컴소닉 CM-7010 USB 프리미엄 스탠드마이크 게임 방송 디코 디스코드 필라마이크 CM-7010 USB Premium (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li><li>'앱코 MP3300 USB 콘덴서 스트리밍 스탠드 마이크 (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li></ul> |
| 181.0 | <ul><li>'휴렉 음식물 처리기 히어로 HD-9000SD (건조형) 히어로 필터 필터 추가(3개) (#M)디지털/가전>주방가전>음식물처리기 Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li><li>'스마트카라 PCS-400 가정용 음식물처리기 PCS-400 화이트+필터2세트 (#M)디지털/가전>주방가전>음식물처리기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li><li>'락앤락 음식물 쓰레기 냉장고 3L 화이트/그레이 (EJT116) 화이트 (#M)디지털/가전>주방가전>음식물처리기 Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li></ul> |
| 223.0 | <ul><li>'니콘 어댑터 링 SY-1-52 52mm (#M)카메라/주변기기>렌즈용품>렌즈용품 기타 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 렌즈용품 > 렌즈용품 기타'</li><li>'스퀘어후드 후지필름 XF33 / XF23mm f1.4 R LM WR / XF16-50mm 렌즈 후드 (#M)디지털/가전>카메라/캠코더용품>렌즈용품>렌즈후드 GFK > traverse > Naverstore > 디지털 > 카메라 > 렌즈용품 > 렌즈후드'</li><li>'WEB CMOS CMS-V52S 산와 서플라이 카메라 회의용 와이드 렌즈 광각(수평 (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>캠코더 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > 캠코더'</li></ul> |
| 197.0 | <ul><li>'테팔 컴팩트 커피메이커 원두커피 커피 CM3218 (#M)홈>디지털/가전>주방가전>커피메이커 Naverstore > 가전 > 주방가전 > 커피용품 > 커피메이커'</li><li>'브레빌 커피 그라인더 도즈 컨트롤 프로 BCG600 (#M)디지털/가전>주방가전>커피메이커 Naverstore > 가전 > 주방가전 > 커피용품 > 커피메이커'</li><li>'[리빙가전] 테팔 커피메이커 비보 CM222B (#M)가전·컴퓨터>주방가전>전기주전자>무선포트 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기주전자 > 무선포트'</li></ul> |
| 163.0 | <ul><li>'키친아트 다지기 KM-28FM 스테인레스 6리터 대용량 키친아트 다지기 KM-28F (#M)주방가전>믹서기/핸드블렌더>다지기/분쇄기 GFK > 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 다지기/분쇄기'</li><li>'7초 만능 다지기 김장 대용량 마늘 박피기 다지는기계 마늘 까는기계 만능다지기 2.5L(마늘박피기포함) (#M)홈>디지털/가전>주방가전>분쇄기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 분쇄기'</li><li>'한일전기 3.2L 대용량 스텐믹서 SHMF-3250S (#M)디지털/가전>주방가전>분쇄기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 분쇄기'</li></ul> |
| 165.0 | <ul><li>'[세트할인] 단미 1구 와플메이커 샌드위치메이커 SAN01+플레이트 세트 (붕어빵 or 도넛) SAN01 핑크 + 붕어빵 플레이트 (#M)디지털/가전>주방가전>샌드위치제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 샌드위치'</li><li>'키친아트 샌드위치 메이커 (#M)가전·컴퓨터>주방가전>토스트·제빵·간식>홈베이킹·간식메이커 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 토스트·제빵·간식 > 홈베이킹·간식메이커'</li><li>'[6%쿠폰] 키친아트 샌드위치 메이커 토스트기 토스터기 아이들-아빠 간식메이커 PK-2168JT(샌드위치) (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li></ul> |
| 164.0 | <ul><li>'키친아트 5L 자동 전기 빙수기 KIC-2311WS (#M)디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li><li>'키친아트 빙수기/전기빙수기/슬러시 KAIM-P2791NK (#M)홈>디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li><li>'보국전자 눈꽃 얼음 빙수기 BKK-1140S 팥빙수 우유빙수 설빙빙수 (#M)디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li></ul> |
| 76.0 | <ul><li>'테팔 클래식 논스틱 코팅열판 건식 다리미 (#M)11st>생활가전>다리미>스팀다리미 11st > 가전/디지털 > 생활가전 > 다리미 > 스팀다리미'</li><li>'태팔건식 가벼운다리미 클래식 논스틱 코팅 열판 경량 다리미 (#M)생활가전>다리미>건식다리미 GFK > 11st > 가전/디지털 > 생활가전 > 다리미 > 건식다리미'</li><li>'스팀다리미 스마트 프로텍트 플러스 FV6872/다리미/테팔/테팔(가전) (#M)홈>디지털/가전>생활가전>다리미>건식다리미 Naverstore > 가전 > 생활가전 > 다리미 > 건식'</li></ul> |
| 31.0 | <ul><li>'[HDC아이파크몰] 벤타 오리지널에어워셔 LW-45B 블랙기화식 가습기 공기청정기 LW-45W(화이트) (#M)홈>디지털/가전>계절가전>공기정화기>에어워셔 Naverstore > 가전 > 계절가전 > 공기청정기 > 에어워셔'</li><li>'[LG 공식판매점] 퓨리케어 에어워셔 HW500DAS 5L 자연기화식 가습기 35㎡ 홈>계절가전>에어워셔;홈>퓨리케어 공기청정기;홈>에어케어>에어워셔(가습기);(#M)홈>계절가전>에어워셔(가습기) Naverstore > 가전 > 계절가전 > 공기청정기 > 에어워셔'</li><li>'LG전자 퓨리케어 공기청정기 AS301DNPA .. LG전자 퓨리케어 공기청정기 AS301DNPA 무료배송 .. (#M)가전·컴퓨터>계절가전>에어워셔·공기청정>에어워셔·공기청정 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 에어워셔·공기청정'</li></ul> |
| 72.0 | <ul><li>'Ekeepment 하이라이저 높이조절 아이맥 메탈 모니터 받침대 스탠드 선반 Silver (#M)디지털/가전>모니터주변기기>모니터받침대 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li><li>'알파플랜 높은 알루미늄 아이맥 모니터 받침대 스탠드 선반 560mm_스페이스그레이(SG) (#M)디지털/가전>모니터주변기기>모니터받침대 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li><li>'높은 모니터받침대 듀얼 모니터 받침대 스탠드 받침 선반 (#M)디지털/가전>모니터주변기기>모니터받침대 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li></ul> |
| 36.0 | <ul><li>'[일월] 22년형 프리미엄 온수 매트 듀얼하트 온수매트 퀸 홈>온수카페트;(#M)홈>온수매트 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li><li>'일월 듀얼하트 온수매트 플러스싱글 2023년 최신형 05.초슬림 온수매트_싱글100x200 홈>매트_커버>온수매트;(#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li><li>'비나잇 프리미엄 온수매트 세탁 워셔블 스몰 싱글 침대용 퀸(1500x1900)_단일난방(침대용) (#M)디지털/가전>계절가전>온수매트 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li></ul> |
| 22.0 | <ul><li>'[르젠] 선풍기 리모컨 (기타) '</li><li>'[스멜스탑 본사몰] 화장실 환풍기 댐퍼 배관용품 & 주방렌지후드 음식냄새 역류방지 아파트 담배냄새차단 (2타입) '</li><li>'베셀S자드라이버2PC셋 코너 ㄱ자 양용 직각 기억자 특수 십자 공구 (#M)주방가전>정수기>부속품 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 정수기'</li></ul> |
| 230.0 | <ul><li>'중고갤럭시S21/S21+/울트라/Z폴드/Z플립 프리미엄 중고 공기계 자급제 노트20울트라 256GB_3사공용 화이트_특S급 쇼킹딜 홈>디지털>휴대폰/액세서리>가입상품/공기계;11st>휴대폰>공기계/언락폰>삼성;11st>휴대폰>자급제/공기계>삼성;쇼킹딜 홈>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>휴대폰>중고폰>중고폰;11st > 디지털/가전/컴퓨터 > 휴대폰 > 공기계/언락폰;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li><li>'[정품 리퍼폰]노트20,10/노트10플러스,갤럭시10 5G/20/20+ 공기계/알뜰폰/리퍼폰/새배터리/새액정 갤럭시S20플러스 256GB_리퍼폰(새액정+새배터리+테두리 교체)_3사공용-아우라블루 11st>휴대폰>중고폰>중고폰;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li><li>'[프리미엄리퍼폰/중고폰]갤럭시S22/S21/S20/S10/노트20/노트10/Z플립2,3/21울트라/알뜰폰/공기계 갤럭시S21플러스 256GB_리퍼폰(새액정+새배터리+테두리 교체)_3사공용-팬텀 바이올렛 11st>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>휴대폰>중고폰>중고폰;11st Hour Event > 디지털/가전 > 디지털 > 리퍼/중고/렌탈 > 리퍼/중고/렌탈;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li></ul> |
| 186.0 | <ul><li>'키친아트 샤브샤브 전기 냄비 2단 멀티쿠커 전골냄비 (#M)홈>디지털/가전>주방가전>전기쿠커>전기냄비 Naverstore > 가전 > 주방가전 > 전기쿠커 > 전기냄비'</li><li>'Bear 7구 올스텐 미니 고구마 계란찜기 달걀삶는 기계 타이머 Bear 다용도 계란찜기 (#M)디지털/가전>주방가전>전기쿠커>전기찜기 Naverstore > 가전 > 주방가전 > 전기쿠커 > 전기찜기'</li><li>'[키친아트] 허브 자취용 만능 멀티쿠커 찜기 냄비 KTP-MS1218 (#M)11st>주방가전>전기포트>무선포트/주전자 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li></ul> |
| 102.0 | <ul><li>'로봇 진공 청소기 Hepa 필터 샤오미 Roborock S5 Max S6 MaxV 액세서리 예비 부품 로봇 진공 청소기 Hepa 필터 사*미 Roborock S5 Max S6 MaxV 액_세트 J (#M)가전·컴퓨터>생활가전>청소기>로봇청소기 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 청소기 > 로봇청소기'</li><li>'[호환] 라이드스토 R1 S1 필터 소모품 로봇청소기 부품 교체 사이드 브러쉬 2EA (#M)홈>디지털/가전>생활가전>청소기>청소기액세서리 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 청소기액세서리'</li><li>'MD글로벌 다이슨 거치대 V10 V8 V7 V6 전기종 호환 6.프리미엄 다이슨 전용 거치대 - 화이트 (#M)홈>디지털/가전>생활가전>청소기>청소기액세서리 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 청소기액세서리'</li></ul> |
| 23.0 | <ul><li>'LG 냉난방기 스탠드 인버터 냉온풍기 업소용 사무실 15형 PW0603R2SF 설치비별도 특가\t15형\t3등급\tPW0603R2SF 홈>추천★냉난방기모음;홈>추천★냉난방기;(#M)홈>냉난방기>LG전자>스탠드 냉난방기 Naverstore > 가전 > 계절가전 > 에어컨 > 냉온풍기'</li><li>'삼성전자 스탠드 냉난방기 40평형 인버터 냉온풍기 업소용 AP145RAPDHH1S 홈>냉난방기>삼성>스탠드;(#M)홈>🔥냉난방기>삼성>스탠드 Naverstore > 가전 > 계절가전 > 에어컨 > 냉온풍기'</li><li>'[캐리어대리점] 23년 신형 초절전 인버터 6평형 벽걸이 에어컨 OARC-0061WAWSD (실외기포함/전국 /기본설치무료) (#M)디지털/가전>계절가전>에어컨>벽걸이형에어컨 Naverstore > 가전 > 계절가전 > 에어컨 > 벽걸이형'</li></ul> |
| 141.0 | <ul><li>'수동 코털 깎기 제거기 수동코털제거기 콧털가위 코털정리기 수동콧털제거기 콧털제거기 코털깍기 홈 > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기'</li><li>'필립스 NT 3600 코털제거기 방수 2헤드 코털정리기 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'나비 전기 코털정리기 코털제거기 코털 잔털제거기 잔털 눈섭정리기 NV151-ENT7 블랙 홈 > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기'</li></ul> |
| 90.0 | <ul><li>'아이스티머 런던 스팀다리미+아이클리너+거치대+레더박스 색상:리얼그린 (#M)가전·컴퓨터>생활가전>다리미·미싱·기타>스팀다리미 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 다리미·미싱·기타 > 스팀다리미'</li><li>'아웃핏터 프로 프리미엄A(상의+바지) 홈드라이 의류케어 자동스팀다리미판 스탠드 핸드형 와이셔츠다림질 프리미엄C(상의+바지+커버+모자신발+롱) (#M)홈>디지털/가전>생활가전>다리미>스팀다리미 Naverstore > 가전 > 생활가전 > 다리미 > 스팀'</li><li>'[얀스토어(yarn store)]독일 프림 휴대용 미니스팀다리미 (PYRM MINI STEAM IRON) 611916-KB 11st>홈패브릭/수예>주방패브릭>앞치마;(#M)11st>생활가전>다리미>스팀다리미 11st > 가전/디지털 > 생활가전 > 다리미 > 스팀다리미'</li></ul> |
| 64.0 | <ul><li>'Companion 50 컴퓨터 겸용 멀티미디어 스피커 GS (#M)홈>디지털/가전>멀티미디어장비>PC스피커>2.1채널 Naverstore > 컴퓨터 > 주변기기 > 사운드 > 스피커'</li><li>'[브랜드위크 14만] 삼성공식파트너 JBL PULSE4 펄스4 감성 무드등 블루투스 스피커 LED 360도 조명 블랙 쇼킹딜 홈>가전>음향/프로젝터>스피커/사운드바;11st>가전>음향/프로젝터>스피커/사운드바;11st Hour Event > 오늘오픈;(#M)11st>음향가전>스피커>블루투스 스피커 11st > 가전/디지털 > 음향가전 > 스피커 > 블루투스 스피커'</li><li>'앱코 SP400 2채널 멀티미디어 PC스피커 (블랙) (#M)홈>디지털/가전>멀티미디어장비>PC스피커>2채널 Naverstore > 컴퓨터 > 주변기기 > 사운드 > 스피커'</li></ul> |
| 207.0 | <ul><li>'로지텍 무선 무소음 손목 편한 마우스 m331 레드 (#M)디지털/가전>주변기기>마우스>무선마우스 GFK > traverse > Naverstore > 컴퓨터 > 키보드/마우스 > 마우스 > 저소음마우스'</li><li>'클로 넥앤프로 목 어깨 마사지기 안마기 승모근 마사지 기계 지압기 무선 넥앤프로 (베이지)_넥앤프로 (퍼플) (#M)생활/건강>안마용품>안마기 GFK > traverse > Naverstore > 건강/의료용품 > 안마용품'</li><li>'유선 게이밍 광마우스 Hacker A660 3325 센서 핑크 (#M)컴퓨터 주변기기>게이밍 주변기기>게이밍 마우스 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 게이밍 주변기기 > 게이밍 마우스'</li></ul> |
| 13.0 | <ul><li>'[PS5] 플레이스테이션5 디스크 에디션 (#M)디지털/가전>게임기/타이틀>가정용게임기 Naverstore > 디지털 > 게이밍 > 플레이스테이션 > 본체'</li><li>'(new) 노리박스 TV연결형 오락실게임기 가정용 오락기 레트로 게임기 신형FX팩(5152게임/1080P/총게임지원) (#M)디지털/가전>게임기/타이틀>가정용게임기 Naverstore > 디지털 > 게이밍 > 레트로게임기'</li><li>'[플레이스테이션] 엑스박스 본체 정품 악세사리 모음 07.마이크로소프트 엑스박스 XBOX Series X (#M)가전·컴퓨터>게임·소프트웨어>게임기>소니∙XBOX Tmon > 가전·디지털 > 가전·컴퓨터 > 게임·소프트웨어 > 게임기 > 소니∙XBOX'</li></ul> |
| 220.0 | <ul><li>'인스탁스 미니필름 40매 (#M)SSG.COM>카메라/캠코더>즉석/필름카메라>즉석카메라 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 즉석/필름카메라 > 즉석카메라'</li><li>'인스탁스 디자인 미니필름 모던 5종 세트 (#M)SSG.COM>카메라/캠코더>즉석/필름카메라>즉석카메라 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 즉석/필름카메라 > 즉석카메라'</li><li>'[한국후지필름] 인스탁스X위글위글 콜라보 미니12 즉석카메라 올인원 선물세트 (#M)카메라/주변기기>즉석카메라>일회용카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 즉석카메라 > 일회용카메라'</li></ul> |
| 196.0 | <ul><li>'훼마 샤워 스크린 E98 UP E61 / 라심발리 M27 M23 UP 훼마 샤워스크린 - 60mm (#M)디지털/가전>주방가전>커피머신>커피머신부속품 GFK > Naverstore > 가전 > 주방가전 > 커피용품 > 액세서리'</li><li>'시티즈앤밀크 D123 (화이트, 블랙)시티즈앤밀크 시티즈앤밀크 화이트 (#M)11st>주방가전>커피머신/메이커>캡슐커피머신 11st > 가전/디지털 > 주방가전 > 커피머신/메이커 > 캡슐커피머신'</li><li>'정품 훼마 E98 UP E61 가스켓 FAEMA 페마 커피머신 샤워스크린 - 60mm (#M)디지털/가전>주방가전>커피머신>커피머신부속품 GFK > Naverstore > 가전 > 주방가전 > 커피용품 > 액세서리'</li></ul> |
| 92.0 | <ul><li>'린클 음식물처리기(RC02) 색상:노블네이비 (#M)홈>디지털/가전>생활가전>건조기/탈수기>신발건조기 Naverstore > 가전 > 세탁기/건조기 > 신발건조기'</li><li>'스테인리스 장화세척대 발판 세척기 신발 부츠 공장 800x410x550mm 장화세척대 (#M)세탁기/건조기>건조기>신발건조기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 건조기'</li><li>'린클 음식물처리기(RC02) 색상:스페이스블랙 (#M)홈>디지털/가전>생활가전>건조기/탈수기>신발건조기 Naverstore > 가전 > 세탁기/건조기 > 신발건조기'</li></ul> |
| 200.0 | <ul><li>'키친아트 오븐 토스터기 KAO-700NK 홈>디지털/가전>주방가전>오븐>전기오븐;홈>디지털/가전>주방가전>토스터기>오븐토스터기;(#M)홈>주방가전>토스터기 Naverstore > 가전 > 주방가전 > 토스터기 > 오븐토스터기'</li><li>'정품 ㅁ 테팔 노베오 토스트기 LT-251870 (#M)11st>주방가전>토스터기>일반토스터기 11st > 가전/디지털 > 주방가전 > 토스터기 > 일반토스터기'</li><li>'테팔 토스터기 TT132DKR 토스트 자동전원차단 테팔 토스터기 TT132DKR 토스트 자동전원차단 (#M)가전·컴퓨터>주방가전>토스트·제빵·간식>홈베이킹·간식메이커 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 토스트·제빵·간식 > 홈베이킹·간식메이커'</li></ul> |
| 199.0 | <ul><li>'탄산수제조기 소다스트림 정품 탄산실린더 구매(스페어실린더) 충전 N타입 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li><li>'딜라이트소다 셰프 탄산수제조기 1. 화이트 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li><li>'[ 점] 딜라이트소다 바리스타 탄산수제조기 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li></ul> |
| 130.0 | <ul><li>'동국제약 센텔리안24 마데카프라임 뷰티디바이스 1개 + 글루타치온 부스팅 앰플 30ml 1종 멜라캡처앰플10ml x 4개 샤 마데카프라임+콜라겐앰플+사은품 [C178] 홈 > 뷰티 > 뷰티기기/소품 > 헤어스타일러 > 고데기/매직기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 헤어스타일러 > 고데기/매직기'</li><li>'[문가영 Pick] 보다나 글램웨이브 봉고데기 프리볼트 핑크 40mm 보다나 글램웨이브 봉고데기 프리볼트 핑크 40mm 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>고데기;홈>헤어케어>헤어기기>탈모/두피기기;(#M)홈>헤어케어>헤어기기>탈모/두피기기/헤어롤 OLIVEYOUNG > 헤어케어 > 헤어기기 > 탈모/두피기기/헤어롤'</li><li>'[문가영 Pick] 보다나 트리플 플로우 물결고데기 25mm (히피펌) [문가영PICK]보다나 트리플 플로우 물결고데기 25mm (히피펌) 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>고데기;홈>헤어케어>헤어기기>탈모/두피기기;(#M)홈>헤어케어>헤어기기>탈모/두피기기/헤어롤 OLIVEYOUNG > 헤어케어 > 헤어기기 > 탈모/두피기기/헤어롤'</li></ul> |
| 39.0 | <ul><li>'곰표 한일 전기장판 거실용 전기매트 침대 EMF 탄소매트 소형 싱글 EMF탄소매트(진그레이)_싱글(중형)(105x180cm) (#M)디지털/가전>계절가전>전기매트/장판>전기장판 GFK > Naverstore > 가전 > 겨울가전 > 전기매트/장판'</li><li>'일월 텐셀 원적외선 탄소 카본매트(보관가방 포함) 모달 싱글 11st > timedeal;(#M)11st>계절가전>전기매트/장판>전기매트 11st > 가전/디지털 > 계절가전 > 전기매트/장판 > 전기매트'</li><li>'힐로빔 조인트빔 무릎 마사지기 찜질기 온열 어깨 더블팩(1+1/보조배터리 무료증정) (#M)생활/건강>안마용품>안마기 GFK > traverse > Naverstore > 건강/의료용품 > 안마용품 > 안마기'</li></ul> |
| 88.0 | <ul><li>'K9 PRO 유선형 K9PRO 유선+무선형 본품@배터리 2입증정) (#M)디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li><li>'샤오미 미지아 센서형 자동 거품 손 세정기 리필 세정액 전용 손세정제 (3개입) 아미노산+향균(6개) (#M)홈>디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li><li>'[청결양행] 분무형 자동 손소독기 BIO-001 기본형 (#M)디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li></ul> |
| 161.0 | <ul><li>'두유제조기 두유기 콩국물 죽제조 600ml Amazom베스트 Mokkom 그린 (#M)디지털/가전>주방가전>두부두유제조기 Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 두부,두유'</li><li>'[연속매진 사전예약] 오쿠아침앤 콩불림없는 두유제조기 BM600 목넘김이 부드러운 6중날 민트그린 (#M)디지털/가전>주방가전>두부두유제조기 Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 두부,두유'</li><li>'조영 두유 제조기 콩물 만드는 기계 메이커 조영 두유 제조기(DJ12G-D545) (#M)디지털/가전>주방가전>두부두유제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li></ul> |
| 75.0 | <ul><li>'대한민국 DC 12V 전원 어댑터 모니터 CCTV 공유기 전자악기 3구접지 12V0.5A 전원일체형 F(ST) KC인증 Skyplus 10) 12V3A 전원일체 F(ST) 홈>디지털/가전>모니터주변기기>모니터어댑터;(#M)홈>12V 어댑터 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 어댑터'</li><li>'LG 엘지 모니터 어댑터 DC 12V / 19V 전원 19V1.3A 대한민국 KC인증품 6) 19V2.1A 전원일체형 (#M)디지털/가전>모니터주변기기>모니터어댑터 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 어댑터'</li><li>'DC 12V 어댑터 전원 모니터 CCTV LED 12V 0.5A (500mA) 벽걸이형 12V 5A_(22) 3구 접지형 (#M)디지털/가전>모니터주변기기>모니터어댑터 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li></ul> |
| 148.0 | <ul><li>'원터치형 SSD 외장케이스 / M.2 NVMe / 8TB 10Gps (#M)PC부품>PC케이스>파워포함케이스 GFK > traverse > 11st > 가전/디지털 > PC부품 > PC케이스 > 파워포함케이스'</li><li>'타무즈 GKM330 M.2 2280 SATA (512GB)/SSD/정품판매점/무상3년/ngff//R (#M)저장장치>SSD>500GB이상 GFK > traverse > 11st > 가전/디지털 > 저장장치 > SSD > 500GB이상'</li><li>'공식 판매점 WD BLACK SN850X NVMe SSD 4TB AS 5년 PS5 호환 (#M)디지털/가전>저장장치>SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > SSD'</li></ul> |
| 69.0 | <ul><li>'삼성전자 하만카돈 오라 스튜디오4 (Aura studio 4) '</li><li>'브리츠 BR-ST202 '</li><li>'스타벅스 서머 우드 스피커,2021 스타벅스 여름 md 2차 홈>21 MD>21 서머 2차;(#M)홈>시즌 MD Naverstore > 디지털 > 음향기기 > 스피커 > 미니/휴대용'</li></ul> |
| 119.0 | <ul><li>'휴대용 레트로 라디오 fm am 단파 라디오 어르신 효도 라디오 블루투스 1702 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li><li>'트로트 등산용 MP3 어르신 미니 라디오 휴대용 소형 추천템 멀티 효도 라디오 H-868 (#M)음향가전>라디오>라디오 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 라디오 > 라디오'</li><li>'수동식 크랭크 라디오 비상 랜턴 태양광 충전 다기능 비상 크랭크라디오 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li></ul> |
| 212.0 | <ul><li>'지클릭커 오피스프로 WMK70 사일런스L 무소음 인체공학 무선 키보드 마우스 세트 블랙 화이트 (#M)디지털/가전>주변기기>키보드/마우스세트 GFK > traverse > Naverstore > 컴퓨터 > 키보드/마우스 > 키보드 > 키보드+마우스'</li><li>'지클릭커 오피스프로 WMK70 사일런스L 무선 키보드 마우스 세트 (화이트) (#M)컴퓨터 주변기기>마우스/키보드 세트>마우스/키보드 세트 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 마우스/키보드 세트 > 마우스/키보드 세트'</li><li>'마이크로소프트 에고노믹 무선 블루투스 5.0 마우스 택배 병행 블랙 당일출고 (#M)디지털/가전>주변기기>마우스>무선마우스 GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 키보드/마우스 > 마우스'</li></ul> |
| 215.0 | <ul><li>'샌디스크 마이크로 SD카드 익스트림 프로 블랙박스 액션캠 닌텐도 메모리 2TB (#M)디지털/가전>카메라/캠코더용품>메모리카드>MicroSD메모리 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라'</li><li>'삼성전자 마이크로 SD카드 512GB 메모리카드 EVO PLUS 외장 스마트폰 메모리 512기가 신형EVO PLUS 512G 케이스+리더기 (#M)디지털/가전>카메라/캠코더용품>메모리카드>MicroSD메모리 GFK > short_clip > Naverstore > Short Clip > 테크 > 20241031'</li><li>'카드 DJI Care Refresh 2년판(DJI Osmo Pocket 3) (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>액션캠 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > 액션캠'</li></ul> |
| 56.0 | <ul><li>'[바로가기 ON 15% 중.복.쿠.폰] IPTIME BT50 블루투스 V5.0 USB 동글 화이트 (#M)컴퓨터 주변기기>블루투스동글>블루투스동글 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 블루투스동글 > 블루투스동글'</li><li>'아이피타임 ipTiME BT50XR 블루투스 5.0 USB 동글 블랙 (#M)홈>디지털/가전>네트워크장비>블루투스동글 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 블루투스동글'</li><li>'Logitech G903 G403 G900 G703 G603 G PRO 무선 마우스 어댑터용 Usb 동글 신호 수신기 어댑터 Logitech G903 G403 G900 G703 G603 G PRO 무선 마우스 어댑터용_G603 (#M)가전·컴퓨터>PC부품·주변기기>키보드>키보드·마우스세트 Tmon > 가전·디지털 > 가전·컴퓨터 > PC부품·주변기기 > 키보드 > 키보드·마우스세트'</li></ul> |
| 168.0 | <ul><li>'[전용세제 ]DWA90C7B00CE 트리플케어 식기세척기 빌트인 (8가지색상) 07.블루라구나(블루) 홈>프리미엄관>(14인용)트리플케어 식기세척기>90C 모델;(#M)홈>식기세척기>(8인이상) 와이드형>트리플케어 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li><li>'[체감가152만원대] DWA90R6B00SL 트리플케어 식기세척기 빌트인 (8가지색상) 02.토르토라(그레이쉬 아이보리) (#M)홈>식기세척기>(8인이상) 와이드형>트리플케어 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li><li>'SK매직 DWA-7303D (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li></ul> |
| 123.0 | <ul><li>'네임뮤조2 스피커 스탠드 최고급 원형 실버 거치대 받침대 네임뮤조2 실버 스탠드 (#M)디지털/가전>음향가전>스피커>스피커액세서리 GFK > traverse > Naverstore > 디지털 > 음향기기 > 스피커 > 액세서리'</li><li>'소니 무선 넥밴드 스피커 HT-AN7 BRAVIA Theatre U HT-AN7 BRAVIA Theatre U (#M)디지털/가전>음향가전>스피커>블루투스스피커 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 스피커'</li><li>'ADAM AUDIO A5X 아담 오디오 5인치 모니터 스피커 스튜디오 고음질 홈레코딩 홈>브랜드>A-B>Adam Audio;(#M)홈>브랜드>A>Adam Audio Naverstore > 디지털/가전 > 음향가전 > 스피커 > 스피커단품'</li></ul> |
| 203.0 | <ul><li>'키친아트 2구 하이브리드 인덕션+하이라이트 하이브리드 전기레인지 2050 하이브리드렌지 홈>디지털/가전>주방가전>하이브리드;홈>전체상품;홈>🧡주방가전🧡;(#M)홈>주방가전💛 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li><li>'SK매직 ERA-FH20D ERAFH20D00DS(인덕션1구+하이1구) (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li><li>'3구 플렉스 하이브리드 인덕션레인지 빌트인 (2인덕션+1하이라이트) ERAHBTS3 (#M)디지털/가전>주방가전>하이브리드 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li></ul> |
| 135.0 | <ul><li>'두꺼운 발톱깍기 발톱관리 깎이 손톱정리도구 정리 손톱깎이 안튀는손톱깎이 네일 휴대용손톱깎이 홈 > 뷰티 > 네일 > 네일관리기기 > 전동네일관리기 T200 > traverse > LotteOn > 뷰티 > 네일 > 네일관리기기 > 전동네일관리기'</li><li>'라운드 메이커 올인원 네일 케어 기기 Coupang > 가전디지털 > 이미용가전 > 눈썹/네일관리 > 전동네일관리기;쿠팡 홈>가전디지털>이미용가전>눈썹/네일관리>전동네일관리기;쿠팡 홈>가전디지털>뷰티/헤어가전>눈썹/네일관리>전동네일관리기;Coupang > 뷰티 > 네일 > 네일케어도구 > 파일/버퍼/스틱 > 파일/버퍼;Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 전동네일관리기;(#M)쿠팡 홈>뷰티>네일>네일케어도구>파일/버퍼/스틱>파일/버퍼 Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 전동네일관리기'</li><li>'다이아미 핀큐어 젤네일 LED 램프 혼합색상 × 1개 Coupang > 가전디지털 > 이미용가전 > 눈썹/네일관리;쿠팡 홈>가전디지털>이미용가전>눈썹/네일관리>젤네일 램프;Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 젤네일 램프;쿠팡 홈>가전디지털>뷰티/헤어가전>눈썹/네일관리>젤네일 램프;(#M)쿠팡 홈>뷰티>네일>네일아트소품/도구>네일드라이어/램프>젤네일 램프 Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 젤네일 램프'</li></ul> |
| 74.0 | <ul><li>'[포토상품평] 카멜 CA3 싱글암 패브릭 모니터거치대 모니터암 화이트 (#M)모니터>모니터 주변기기>모니터주변기기 기타 GFK > 11st > 가전/디지털 > 모니터 > 모니터 주변기기 > 모니터주변기기 기타'</li><li>'카멜 모니터암 CA2D 듀얼 모니터거치대 이지밸런스 그레이 (#M)디지털/가전>모니터주변기기>모니터암 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li><li>'[카멜인터내셔널] 클램프형 암, CMA-2P, 블랙 [32형] (#M)디지털/가전>모니터주변기기>모니터암 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 모니터암'</li></ul> |
| 84.0 | <ul><li>'보풀컷 보풀제거기 세탁소 업소용 니트 코트 옷 로즈골드 5중날 올블랙(6중날) (#M)홈>디지털/가전>생활가전>보풀제거기 Naverstore > 가전 > 생활가전 > 보풀제거기'</li><li>'필립스 GC-026 블루 (#M)디지털/가전>생활가전>보풀제거기 Naverstore > 가전 > 생활가전 > 보풀제거기'</li><li>'유닉스 정품 충전식 무선 보풀제거기 추천 휴대용 세탁소 니트 보풀 제거 UNL-9302 UNL-9302 (+사은품 마스크 1매) (#M)디지털/가전>생활가전>보풀제거기 GFK > Naverstore > 가전 > 생활가전 > 보풀제거기'</li></ul> |
| 49.0 | <ul><li>'CAT6A 랜 커플러 키스톤 잭 모듈러 랜선 STP RJ45 CAT6 1번_6A STP 랜커플러 키스톤잭_XB265 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li><li>'스위치봇 - 허브 미니 원격제어 스마트홈 허브 만능리모컨 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li><li>'[3-5일 배송] 구글 네스트 온도조절기 자동 스마트러닝 3세대 스테인리스 스틸 스테인리스 스틸 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li></ul> |
| 19.0 | <ul><li>'[들꽃잠]멀티형 배 찜질팩 생리통 복부 팥 허리 냉온 (#M)생활/건강>냉온/찜질용품>찜질팩 GFK > Naverstore > 건강/의료용품 > 냉온/찜질용품'</li><li>'볼케이노가습기 무중력 가열식 기화식 가습기 샤오미 새로운 스마트 워치 울트라 8 NFC GPS 트랙 49mm 남성 여성 Smartwatch 시리즈 8 온도계 BluetoothCal 볼케이노가습기 무중력 가열식 기화식 가습기 샤오미 새로운 스_블랙 추가 3 스트랩 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'가습기 가열식가습기 원룸 사무실 기석사 원룸 무선 4 살균충전버전 유스파우더 (#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li></ul> |
| 122.0 | <ul><li>'브리츠 Realfit5 오픈형 블루투스 이어폰 V5.4 무선충전 초경량 귀걸이형 운동 자전거 오토바이 라이딩 아이보리 (#M)음향가전>이어폰>무선 이어폰 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 이어폰 > 무선 이어폰'</li><li>'브리츠 BT4000 ANC 노이즈캔슬링 무선 블루투스 헤드셋 헤드폰(블랙, 아이보리, 화이트) BT4000 아이보리 (#M)디지털/가전>음향가전>블루투스셋>블루투스헤드폰/헤드셋 GFK > traverse > Naverstore > 디지털 > 블루투스'</li><li>'Sony WH-1000XM5 노이즈캔슬링 블루투스 헤드폰 화이트 (#M)컴퓨터 주변기기>헤드셋>블루투스헤드셋 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 헤드셋 > 블루투스헤드셋'</li></ul> |
| 154.0 | <ul><li>'주방 ntec후드필터 엔텍 파세코 한샘 가스렌지 후드필터 환풍기 닥트 청소 엔텍일반형 340x230 (#M)홈>디지털/가전>주방가전>가스레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li><li>'SK매직 프론트형 600 레인지후드 RHD304L 전동댐퍼추가가능 배송만(자가설치)_전동댐퍼추가 홈>전체상품;(#M)홈>레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li><li>'하츠 허리케인 도어 HDH-90S 씽크대 렌지 후드 교체 후황 도어없는상품_설치미접수 (배송만) (#M)홈>레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li></ul> |
| 115.0 | <ul><li>'윤씨네 4:3 유압식 포터블 빔스크린 PM-SV 매트원단 롤러블스크린 203cm(80), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li><li>'윤씨네 16:9 삼각대 족자봉 빔스크린 세트 YJH 캠핑용 휴대용 가정용 203cm(80), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li><li>'윤씨네 4:3 C-SV 수동 체인 빔스크린 업무용 학원용 187.5cm(60), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li></ul> |
| 185.0 | <ul><li>'쿠쿠 게임부록 청소기/밥솥/인덕션 BEST 모델 기획전 06. 쿠쿠 6인용 IH전기압력밥솥 CRP-DHP0610FD (#M)가전·컴퓨터>주방가전>전기밥솥>압력밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 압력밥솥'</li><li>'쿠첸 brain 듀얼프레셔 IH전기압력밥솥 6인용/10인용 풀스텐 스텐내솥 04. [다운로드쿠폰] 쿠첸 brain 풀스텐 듀얼프레셔 10인용 IH전기압력밥솥 CRH-TWS1011E 베이지/스텐내솥 (#M)가전·컴퓨터>주방가전>전기밥솥>압력밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 압력밥솥'</li><li>'1인용밥솥 2인용밥솥 미니전기밥솥 키친아트 자취생밥솥 KC-202MY_피치 (#M)가전·컴퓨터>주방가전>전기밥솥>일반밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 일반밥솥'</li></ul> |
| 208.0 | <ul><li>'신도리코 A3흑백복합기 N621 정식판매처 [무료설치] [당일출고] 홈>전체상품;(#M)홈>A3흑백복합기 Naverstore > 컴퓨터 > 복합기/프린터 > 흑백레이저복합기'</li><li>'삼성전자 SL-C2470FR 컬러 레이저 복합기 인쇄 복사 스캔 팩스 학교 관공서 (#M)디지털/가전>주변기기>복합기>컬러레이저복합기 GFK > Naverstore > 컴퓨터 > 복합기/프린터 > 컬러레이저복합기'</li><li>'삼정 국내제조 책상 공부 독서 LED스탠드 SL-660 블랙 (#M)디지털/가전>생활가전>스탠드>LED스탠드 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전'</li></ul> |
| 80.0 | <ul><li>'대여 창문 로봇 청소기 아파트 유리창 청소 닦이 일상 렌탈 일상 창문로봇청소기_✨설 연휴 세트✨_1/23일 (목)발송 → 1/30 (목)까지 (#M)디지털/가전>청소기>창문청소기 GFK > traverse > Naverstore > 가전 > 청소기 > 로봇청소기'</li><li>'삼성 로봇 청소기 AI 비스포크 제트 봇 진공 미니 소형 원룸 자취방 펫캠 삼성 청소기 페블 그레이 (#M)디지털/가전>생활가전>청소기>로봇청소기 GFK > Naverstore > 가전 > 청소기 > 로봇청소기'</li><li>'삼성 비스포크 제트봇 VR50B9563AE 로봇청소기 AI SE 자율주행 청정스테이션 페블 그레이 (#M)디지털/가전>생활가전>청소기>로봇청소기 GFK > Naverstore > 가전 > 청소기 > 로봇청소기'</li></ul> |
| 81.0 | <ul><li>'Dreame 충전기 V11 V9 교체용 예비 부품 어댑터 유럽 플러그 진공 청소기 액세서리 01 Adapter (#M)생활가전>청소기부품>액세서리 기타 GFK > traverse > 11st > 가전/디지털 > 생활가전 > 청소기부품'</li><li>'[팅크웨어] 아이나비 차량용 무선휴대용 스마트 에어건 EPI-A218 휴대용 충전식 청소기 홈>전체상품;홈>자동차ㆍ공구ㆍ안전>차량용디지털>차량용 전자용품;(#M)홈>자동차ㆍ공구ㆍ안전>자동차 관련용품 Naverstore > 가전 > 청소기 > 차량용'</li><li>'[히트상품] [다이슨] 청소기/에어랩/고데기/공기청정기2 06. 다이슨 슬림 플러피 오리진 (#M)가전·컴퓨터>TV·냉장고·세탁기>냉장고>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 냉장고 > 그외 브랜드'</li></ul> |
| 152.0 | <ul><li>'SK하이닉스 Tube T31 Stick 외장SSD 512GB [D램탑재+스틱형] (#M)디지털/가전>저장장치>외장SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장SSD'</li><li>'아이팟 클래식 7세대(A1238) SSD 32GB A/S 180일 스페이스 그레이_SD 512gb+1950mAh대용량 배터리 (#M)디지털/가전>음향가전>MP3 GFK > traverse > Naverstore > 디지털 > 음향기기 > 라디오/MP3'</li><li>'SSD 외장케이스 USB C 타입 2.5 SATA HDD 외장SSD (#M)디지털/가전>저장장치>외장SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장SSD'</li></ul> |
| 142.0 | <ul><li>'[단독] 스킨 라이트 테라피Ⅱ LotteOn > 뷰티 > 뷰티기기 > 피부케어기 LotteOn > 뷰티 > 뷰티기기 > 피부케어기'</li><li>'동국제약 센텔리안24 마데카프라임 피부관리기 뷰티디바이스 2개 + 멜라캡처앰플PRO 10ml x 8개 + 앰플 샤쉐 6종 2개 + 마데카프라임 2개 + 사은품 [C41] 홈 > 뷰티 > 뷰티기기/소품 > 피부케어기 > 피부케어기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 피부케어기 > 피부케어기'</li><li>'[LIVE] [연말 ] 글로우엠 부스터 소닉 (젤 세럼 ) 부스터소닉 1개 + 젤 2개 + 팩 20매 (#M)디지털/가전>이미용가전>피부케어기기 LO > live > Naverstore > Shop Live > 뷰티 > 20240813 > 19:30 ~ 21:30'</li></ul> |
| 209.0 | <ul><li>'Bambu Lab A1 mini 3D 프린터 (#M)디지털/가전>주변기기>프린터>3D프린터 GFK > traverse > Naverstore > 컴퓨터 > 복합기/프린터 > 3D프린터/3D펜 > 3D프린터'</li><li>'HP 정품 CE314A 드럼 Color LJ CP1025,M175,M176, M177 / LJ pro M275nw Imaging Unit (Imaging Drum) (#M)프린터/복합기>토너>정품 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 토너 > 정품'</li><li>'[호환] 필터바바 3+1 삼성 에어드레서 필터 미세먼지 교체 프리미엄 H13 3벌용 3벌용 (프리미엄 H13등급) (#M)디지털/가전>생활가전>세탁/건조기>액세서리 GFK > naver_plus_traverse > Naverstore > 가전 > 세탁기/건조기 > 드럼세탁기'</li></ul> |
| 16.0 | <ul><li>'닌텐도 정품 조이콘 (R) 스위치 컨트롤러 조이스틱 오른쪽+스트랩 포함 확인하였습니다_에어캡포장(박스없음)_3.(R)네온옐로 단품 (#M)디지털/가전>게임기/타이틀>게임기주변기기>조이스틱/컨트롤러 GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 게이밍 > 주변용품'</li><li>'닌텐도 스위치 배터리개선판 본체 네온+링피트 어드벤처 세트+OLED공용 조이콘커버악세사리 had네온+링피트+OLED공용 조이콘커버 홈>디지털/가전>게임기/타이틀>게임타이틀;(#M)홈>디지털/가전>게임기/타이틀>휴대용게임기 Naverstore > 디지털 > 게이밍 > 닌텐도 > 본체'</li><li>'젤다의 전설 티어스 오브 더 킹덤 에디션 정품 팩케이스 세트 닌텐도 스위치 OLED 본체 닌텐도스위치 OLED 젤다의 전설 에디션_+ 인기 게임패키지 (젤다의전설 왕국의눈물) 홈>「 Game 」;홈>「 예약판매/신규출시 」;(#M)홈>「 Game 」>Nintendo Naverstore > 디지털 > 게이밍 > 닌텐도 > 본체'</li></ul> |
| 77.0 | <ul><li>'웍스 무선 충전식 고압세척기 WG630E.2 브러시리스 (#M)홈>디지털/가전>생활가전>청소기>고압세척기 Naverstore > 가전 > 청소기 > 고압세척기'</li><li>'RL30고압건 고압세척기부품 스팀건 RL30 (#M)홈>고압건 숏건 건set Naverstore > 디지털/가전 > 생활가전 > 청소기 > 고압세척기'</li><li>'웍스 창문닦이 WA4050 (#M)홈>전체상품 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 고압세척기'</li></ul> |
| 160.0 | <ul><li>'쿠잉 냉동고 /쾌속형/서랍식/FR-191SS/소형/미니/164L 쿠잉 냉동고 /쾌속형/서랍식/FR-191SS/소형/미니/16 (#M)11st>냉장고>냉동고>냉동고 11st > 가전/디지털 > 냉장고 > 냉동고 > 냉동고'</li><li>'삼성전자 비스포크 RZ34C7805AP 냉동고 1도어 키친핏 오토오픈도어 좌흰지(좌개퍠)_새틴베이지 (#M)홈>전체상품 Naverstore > 가전 > 냉장고 > 냉동고'</li><li>'삼성전자 비스포크 RZ34C7805AP 냉동고 1도어 키친핏 오토오픈도어 우흰지(우개폐)_새틴세이지그린 (#M)홈>전체상품 Naverstore > 가전 > 냉장고 > 냉동고'</li></ul> |
| 110.0 | <ul><li>'(1년 구독) 파인리더 PDF 16 스탠다드 - ABBYY FineReader PDF 16 Standard (1Year) 이메일로 수령 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 사무/회계'</li><li>'마이크로소프트 오피스 M365 Personal PKC (1년 구독) 엑셀/파워포인트/아웃룩/워드/팀즈/패밀리세이프티 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li><li>'상품 재고관리 프로그램(거래처/제품별 재고관리, 매입/매출/환입/환출, 거래처원장, 재고현황 및 수익금액, 재고자산회전율/회전일수) 상품 재고관리 프로그램 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 127.0 | <ul><li>'LG전자 스탠바이미 스피커 XT7S 디지털샵 (#M)음향가전>턴테이블>턴테이블 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 턴테이블 > 턴테이블'</li><li>'인켈 (셔우드) PM-9970U 벨트드라이브 프리미엄 USB 턴테이블 블랙 24년 신형 '</li><li>'크로슬리 Voyager CR8017A '</li></ul> |
| 231.0 | <ul><li>'에어팟 4세대 케이스 홀로그램 실버 왕리본 키링 세트 (#M)디지털/가전>음향가전>이어폰/헤드폰액세서리>케이스/파우치 GFK > short_clip > Naverstore > Short Clip > 테크 > 20250116'</li><li>'[블루/실버 +상품권5만][Z플립6 512GB 체감가 119만][쿠폰15%+카드5%] 갤럭시 자급제 SM-F741N Z플립6 512GB 자급제 + 버즈3 패키지_자급제 블루 + 버즈3 화이트 [LBEKOO] (#M)휴대폰>자급제폰>삼성>5G GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 자급제폰 > 삼성'</li><li>'[Z폴드6 512GB 가 2,033,000원 쿠폰10%+카드5%] 갤럭시 자급제 SM-F956N Z폴드6 512GB 자급제 + 버즈3 패키지_자급제 실버 쉐도우 + 버즈3 실버 [ZSEKOO] (#M)휴대폰>자급제폰>삼성>5G GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 자급제폰 > 삼성'</li></ul> |
| 86.0 | <ul><li>'휴앤봇 3kg 소형 미니세탁기 아기옷 HS-MW3150G 헹굼 속옷 양말 1인용 원룸 (#M)디지털/가전>생활가전>세탁기>미니세탁기 Naverstore > 가전 > 세탁기/건조기 > 미니세탁기'</li><li>'휴앤봇 미니 세탁기 HS-MW25G 아기옷 속옷 수건 운동화 2.5kg 3.5kg 1) 미니세탁기 HS-MW25G(2.5kg) (#M)디지털/가전>생활가전>세탁기>미니세탁기 Naverstore > 가전 > 세탁기/건조기 > 미니세탁기'</li><li>'[호환] 대우 위니아 통돌이 세탁기 먼지 거름망 필터 03. 대우소[DZ-03] (#M)디지털/가전>생활가전>세탁기>세탁기부품 GFK > Naverstore > 가전 > 세탁기/건조기 > 액세서리 > 필터'</li></ul> |
| 205.0 | <ul><li>'[최신모델]무선도깨비방망이 노블 CHB2300 로즈펄 (#M)홈>디지털/가전>주방가전>핸드블렌더 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 핸드블렌더'</li><li>'도깨비방망이 PHB2200 (대용량 2200ml 컵 포함) 블랙 (#M)11st>주방가전>믹서기/핸드블렌더>미니믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 미니믹서기'</li><li>'신일 키친아트 핸드블랜더 다기능 모음 SMX-HB600S (#M)가전·컴퓨터>주방가전>믹서·원액·블렌더>핸드블렌더 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 믹서·원액·블렌더 > 핸드블렌더'</li></ul> |
| 184.0 | <ul><li>'[키친아트] 허브 와이드 전기그릴 KNG-P771NK (#M)가전·컴퓨터>주방가전>전기그릴·찜기>전기그릴 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기그릴·찜기'</li><li>'[세트상품] 테팔 전기그릴 컴팩트 그릴 TG300 +아이스포스 고기가위 + 인지니오 미니 스테인리스 다용도 집게 (#M)홈>주방가전>전기그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기그릴'</li><li>'벨로닉스 레트로 멀티쿠커 전기그릴 SHMC-020 다크그레이_그릴세트(기본구성+그릴플레이트) (#M)디지털/가전>주방가전>전기그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기그릴'</li></ul> |
| 85.0 | <ul><li>'[린나이]노비타 라인핏 방수 비데 BD-AFM51N (무상설치) (#M)11st>뷰티소품>피부관리기>피부관리기 11st > 뷰티 > 뷰티소품 > 피부관리기'</li><li>'이누스 방수비데 IS-520 - 360° 모든 방향 완벽 파워방수 IPX5 / 스마트 터치식 2. IS-510 온풍건조X_2. 설치후 2만원 결재 (#M)11st>생활가전>비데>전자식비데 11st > 가전/디지털 > 생활가전 > 비데 > 전자식비데'</li><li>'[롯데백화점]보보 [롯데잠실]VOVO 보보 시트비데 무선리모컨 쾌변기능 VB-6000 무상설치 (#M)11st>생활가전>비데>기계식비데 11st > 가전/디지털 > 생활가전 > 비데 > 기계식비데'</li></ul> |
| 157.0 | <ul><li>'닭탈모기 닭털뽑는기계 은행탈피기 LIM-30A(소/중/대/특대형) 기본 30개 (#M)홈>디지털/가전>주방가전>기타주방가전 Naverstore > 디지털/가전 > 주방가전 > 기타주방가전'</li><li>'테팔 비어텐더 생맥주 디스펜서 맥주기계 VB310EVB310EKR (#M)가전·컴퓨터>주방가전>전기쿠커·튀김기>기타용품 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기쿠커·튀김기 > 기타용품'</li><li>'LG전자렌지 교체용 유리회전접시 회전판 A타입 24.5cm (#M)홈>디지털/가전>주방가전>기타주방가전 Naverstore > 디지털/가전 > 주방가전 > 기타주방가전'</li></ul> |
| 189.0 | <ul><li>'유니크대성 업소용 사리 육수냉장고 냉면육수통 선택11. 스텐-2말쌍통1라인 (#M)11st>냉장고>일반형>일반형 11st > 가전/디지털 > 냉장고 > 일반형 > 일반형'</li><li>'케민 22L 미니 기숙사 이유식 1인 냉장고 듀얼 스마트 MinSellAmount (#M)주방가전>냉장고/냉동고>화장품냉장고 Gmarket > 가전 > 주방가전 > 냉장고/냉동고 > 화장품냉장고'</li><li>'Celler Cool CX2200 와인셀러 냉각 시스템 전면 전원 코드 Rear Power Cord (#M)냉장고>전용냉장고>와인냉장고 GFK > 11st > 가전/디지털 > 냉장고 > 전용냉장고'</li></ul> |
| 108.0 | <ul><li>'Arobas Music Guitar Pro 8 아로바스 뮤직 기타프로 8 타브 악보 제작 Guitar Pro 8 (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 그래픽/멀티미디어'</li><li>'다빈치 리졸브 스튜디오 다빈치 리졸브 스튜디오 (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 그래픽/멀티미디어'</li><li>'어도비 마스터컬렉션 CC [포토샵 일러스트레이터 프리미어프로 에프터이펙트 라이트룸 인디자인 아크로벳 미디어인코더 등 포함 1년 플랜] (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > traverse > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 174.0 | <ul><li>'수저 살균 소독기 식기살균건조기 수저통 식당 업소용 대신 열소독 건식 살균기 6구 '</li><li>'하임셰프 업소용 열풍 식기살균 자외선 건조기 '</li><li>'한일 식기건조기 UV 살균 2단 그릇 건조대 대형 살균기 주방 컵 정리대 식기 건조기 '</li></ul> |
| 227.0 | <ul><li>'와콤 신티크16 DTK-1660 액정타블렛 공식판매점 홍대입구점 / 필수악세서리 이벤트 / 필름부착서비스 신티크16+AG필름부착발송 홈>Wacom>신티크;홈>전체상품;(#M)홈>와콤>신티크 Naverstore > 컴퓨터 > 키보드/마우스 > 타블렛 > 본체'</li><li>'삼성전자 갤럭시탭 S9 플러스 256GB 슈퍼아몰레드2X 방수/방진 256G x Wi-Fi_그라파이트 SM-X810NZAAKOO_단품+힐링쉴드필름+65W충전기 (#M)디지털/가전>태블릿PC Naverstore > 컴퓨터 > 노트북 > 태블릿PC'</li><li>'[신제품 이벤트] 와콤 신티크프로 27 터치 DTH-271 액정타블렛 신티크프로27+와콤스탠드 세트 (#M)11st>컴퓨터주변기기>태블릿/디지털 펜>태블릿/디지털 펜 11st > 가전/디지털 > 컴퓨터 주변기기 > 태블릿/디지털 펜 > 태블릿/디지털 펜'</li></ul> |
| 182.0 | <ul><li>'린나이 포터블 인덕션 1구렌지 RPI-Y10 (#M)홈>디지털/가전>주방가전>인덕션 Naverstore > 가전 > 주방가전 > 전기레인지 > 인덕션'</li><li>'[택배/전문기사방문, ]린나이 미드나잇컬러인덕션 3구 전기레인지RBI-G3000N 전문기사설치 (#M)주방가전>전기레인지>인덕션>빌트인 GFK > 11st > 가전/디지털 > 주방가전 > 전기레인지 > 인덕션'</li><li>'냄비2종 전국무료설치 3구 올파워 화이트 인덕션 전기레인지 IHRB32A3 화이트_무료설치_배송후 SK설치기사방문 홈>전체상품;(#M)홈>전기레인지>인덕션 Naverstore > 가전 > 주방가전 > 전기레인지 > 인덕션'</li></ul> |
| 204.0 | <ul><li>'핫플레이트 인덕션 버너 가열판 열전도판 전달 열플레이트 L 홈>전체상품;(#M)홈>디지털/가전>주방가전>핫플레이트 Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li><li>'키친아트 KG-02TH 1구 세라믹 핫플레이트 /HB (#M)디지털/가전>주방가전>핫플레이트 GFK > Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li><li>'키친아트 세라믹 핫플레이트 1구 전기레인지 KG-02TH 미니 전기곤로 온도조절 전기버너 (#M)디지털/가전>주방가전>핫플레이트 GFK > Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li></ul> |
| 5.0 | <ul><li>'다크플래쉬 DK110 컴퓨터케이스 PC케이스 (#M)디지털/가전>PC부품>PC케이스 GFK > Naverstore > 컴퓨터 > 부품 > 케이스/파워'</li><li>'앱코 NCORE G30 트루포스 미들타워 PC케이스 (블랙) (#M)PC부품>PC케이스>미들케이스 GFK > 11st > 가전/디지털 > PC부품 > PC케이스'</li><li>'마이크로닉스 EM2 STEREO 미들 타워 PC 케이스 블랙 (#M)디지털/가전>PC부품>PC케이스 Naverstore > 컴퓨터 > 부품 > 케이스/파워'</li></ul> |
| 40.0 | <ul><li>'한일 캠핑 전기요 프리볼트 장판 싱글 1인용 전기장판 전기매트 2인용 도형 랜덤 디자인 랜덤_소 (#M)디지털/가전>계절가전>전기요/담요/방석>전기요 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기요'</li><li>'[미니 출시] 보국 에어셀 인체감지 전기요 카모플라쥬 BKB-9511S 2) 싱글 BKB-9511S (#M)디지털/가전>계절가전>전기요/담요/방석>전기요 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기요'</li><li>'2023년형 일월 전기방석 온열방석 쇼파용 1인 2인 3인 전기매트 장판 일월 50W 미니싱글 (장판소재/무늬랜덤) 홈>디지털/가전>계절가전>전기장판/담요/방석>전기방석;(#M)홈>디지털/가전>계절가전>전기요/담요/방석>전기방석 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기방석'</li></ul> |
| 133.0 | <ul><li>'LG프라엘 메디헤어 HGN2V LG전자 탈모치료기 의료기기 LG프라엘 메디헤어 (P700) (#M)생활/건강/취미>건강/안마용품>의료/구강용품>기타 관리용품 CJmall > 뷰티 > 헤어/바디/미용기기 > 피부/바디기기 > 피부 마사지기'</li><li>'신광 실리콘 전동두피 머리마사지 마사지 실리콘마사지 실리콘케어 전동마사지 전동두피마사지 두피케어 (#M)이미용가전>기타 미용가전>전동두피마사지기 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 기타 미용가전 > 전동두피마사지기'</li><li>'[LG전자] 프라엘 메디헤어 탈모 케어기기 HGN1 (#M)11st>헤어케어>샴푸>한방 11st > 뷰티 > 헤어케어 > 샴푸 > 한방'</li></ul> |
| 213.0 | <ul><li>'인스탁스 스퀘어필름 20매(10매X2) (영등포점) (#M)디지털/가전>주변기기>프린터>포토프린터 GFK > traverse > Naverstore > 디지털 > 카메라 > 즉석카메라/용품 > 필름'</li><li>'폴라로이드 즉석 카메라 사진기 후지필름 인스탁스 스퀘어 필름 화이트 엣지 인화지 SQ10 SQ40 SQ20 공유 S 20 Sheets (#M)카메라/주변기기>즉석카메라>일회용카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 즉석카메라 > 일회용카메라'</li><li>'전동 손톱깍이 자동 휴대용 네일케어 손톱정리 국내발송 전동손톱깍이(CD-300) (#M)디지털/가전>이미용가전>손발톱정리기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 손발케어'</li></ul> |
| 219.0 | <ul><li>'소니 사이버샷 DSC-RX100 '</li><li>'리코 GR3X HDF (#M)디지털/가전>카메라/캠코더용품>일반디카 GFK > traverse > Naverstore > 디지털 > 1인방송/촬영 > 카메라 > 일반디카'</li><li>'리코 PENTAX WG-1000 아웃도어 방수카메라 올리브_S0002167 (#M)디지털/가전>카메라/캠코더용품>일반디카 GFK > traverse > Naverstore > 디지털 > 1인방송/촬영 > 카메라 > 일반디카'</li></ul> |
| 120.0 | <ul><li>'FiiO BTR17 디코더 앰프 블루투스 오디오 리시버 스마트폰용 DAC 헤드폰 앰프 블랙 (#M)디지털/가전>음향가전>리시버/앰프 GFK > traverse > Naverstore > 디지털 > 음향기기 > 리시버/앰프'</li><li>'[런칭할인] Bluesound 블루사운드 NODE NANO 네트워크 플레이어 (#M)디지털/가전>음향가전>리시버/앰프 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 리시버/앰프'</li><li>'MARANTZ(마란츠) M-CR612 네트워크 올인원 인티앰프 (#M)디지털/가전>음향가전>리시버/앰프 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 리시버/앰프'</li></ul> |
| 192.0 | <ul><li>'위즈웰 가정용 제빵기 식빵 기계 대용량 예약기능 발효기 반죽 도우 WSB8000 WSB8000 + Npay 20000 적립 (#M)디지털/가전>주방가전>제빵기 GFK > Naverstore > 가전 > 주방가전 > 오븐/제빵'</li><li>'매직쉐프 스타일리쉬 홈베이킹 제빵기 MEBM-X900 제빵기화이트 (#M)디지털/가전>주방가전>제빵기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 제빵기'</li><li>'JCP 브레드가든 BM2401 (#M)디지털/가전>주방가전>제빵기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 제빵기'</li></ul> |
| 162.0 | <ul><li>'테팔 믹서기 초고속 블렌더 퍼펙트믹스 플러스 트라이탄 BL82AD (#M)11st>주방가전>믹서기/핸드블렌더>일반믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 일반믹서기'</li><li>'해피콜 초고속 블렌더 믹서기 브리즈탭 해피콜 블렌더 브리즈탭(차콜그레이) (#M)디지털/가전>주방가전>믹서기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 믹서기'</li><li>'[공식] 테팔 초고속블렌더 퍼펙트믹스 플러스 트라이탄 BL82AD (#M)11st>주방가전>믹서기/핸드블렌더>초고속믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 초고속믹서기'</li></ul> |
| 195.0 | <ul><li>'가정용 진공포장기 12 대형롤28cmX3M 3개 (#M)디지털/가전>주방가전>진공포장기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 진공포장기'</li><li>'미소랩 가정용 자동 무선 진공포장기 진공탭 ML-210 진공포장기 1개 (#M)디지털/가전>주방가전>진공포장기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 진공포장기'</li><li>'키친아트 진공포장기 KJP-3800WS 밀봉가능 비닐팩포함 (#M)11st>주방가전>기타 주방가전>주방가전 기타 11st > 가전/디지털 > 주방가전 > 기타 주방가전 > 주방가전 기타'</li></ul> |
| 178.0 | <ul><li>'테팔 에퀴녹스 9L 전기 오븐 그릴 토스터기 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li><li>"[25'설선물대첩] 발뮤다 더 레인지 다크그레이 K09B 다크그레이_레이에 서버 집게 (#M)디지털/가전>주방가전>오븐>복합형오븐 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 주방가전 > 오븐/제빵"</li><li>'위즈웰 디지털 컨벡션 오븐 전기 제과 제빵 빵 만들기 홈베이킹 가정용 GL-42A/B 디지털오븐(GL-42A/B)+15000 N적립 (#M)디지털/가전>주방가전>오븐>전기오븐 GFK > traverse > Naverstore > 가전 > 주방가전 > 오븐/제빵 > 전기오븐'</li></ul> |
| 30.0 | <ul><li>'캐리어 50평,80평 업소용 대형 냉난방기 실외기 포함 '</li><li>'캐리어 냉난방기 40평형 인버터 스탠드 냉온풍기 실외기포함 DMQE401LAWWSX '</li><li>'앞치마소독기 열풍건조 위생복살균기 앞치마15장 업소용 MVHAA815 (#M)주방가전>식기세척/건조기>칼도마살균건조기 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 식기세척/건조기 > 칼도마살균건조기'</li></ul> |
| 139.0 | <ul><li>'[이오시카] 뷰티유튜버 PICK IPL 제모의료기기 SIPL-2000 PLUS(100만회)+시카젤+선글라스 (#M)디지털/가전>이미용가전>제모기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 제모기'</li><li>'쉬크 인튜이션 미니언즈에디션 버라이어티 기획 2종 택 1 (기+날4입) 핑크(쉐어버터) (#M)홈>바디케어>제모용품>면도기/제모의료기기 OLIVEYOUNG > 바디케어 > 제모용품'</li><li>'필립스 모근제거기 BRE255/매끈한 피부 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li></ul> |
| 35.0 | <ul><li>'린나이 전기온수기 15리터 저장식 교체 까페 대용량 직접설치 직접설치(택배발송)_15리터(벽걸이형) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li><li>'경동나비엔 30리터 전기온수기 EW-30RN-U [NEW] ESW350-30U(상향식) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li><li>'린나이 전기온수기 15리터 저장식 교체 까페 대용량 직접설치 직접설치(택배발송)_15리터(바닥형) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li></ul> |
| 38.0 | <ul><li>'순수편백나무 격자무늬 자연기화식 가습기 증발식 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li><li>'순수편백나무 자연기화식 바스켓가습기 소 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li><li>'자연기화가습기 사무실 공부 수험생 건조 디퓨저 무드 하얀 풍경 에센셜 오일 3병 700ml (#M)11st>계절가전>가습기>복합식가습기 11st > 가전/디지털 > 계절가전 > 가습기 > 복합식가습기'</li></ul> |
| 94.0 | <ul><li>'비솝연수기 구 에코렉스연수기 살균 염소제거 '</li><li>'현대 연수기 렌탈 업소용 식당 가정용 잔류염소케어 약정4년 HQ-S2010 '</li><li>'연수기 듀벨 F15 간편 본품 녹물제거필터 모둠 리필필터 리필필터_F15_고급형_3개 홈>전체상품;(#M)홈>듀벨 연수기>수도애>본품 Naverstore > 가전 > 욕실가전 > 연수기'</li></ul> |
| 83.0 | <ul><li>'엑타코 스프레이 무선물걸레청소기 E7 (건조대 + 극세사 총 6장 + 일회용청소포 20매 + 인스톨패드 2장 / 포토 상품평 이벤트) S85_엑타코 E7 (스타터 세트/배터리1개) (#M)홈>디지털/가전>생활가전>청소기>물걸레청소기 Naverstore > 가전 > 청소기 > 물걸레청소기'</li><li>'[10분어택] 세비즈 원터치 물분사 LED 트리플 고주파 회전 무선 물걸레청소기 MOP1 (#M)가전·컴퓨터>생활가전>청소기>물걸레청소기 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 청소기 > 물걸레청소기'</li><li>'코맘스 소형 물걸레청소기 PC9005G 1. 그레이 (PC9005G) (#M)홈>생활가전>청소기 Naverstore > 가전 > 청소기 > 물걸레청소기'</li></ul> |
| 150.0 | <ul><li>'[기타]Seagate 외장하드 Backup Plus Portable 4TB '</li><li>'[기타]외장 하드 케이스 하드디스크 케이스 C타입 USB3.0 '</li><li>'[기타]3.5형 SATA HDD 외장하드 케이스 보관함 데이터 백업 '</li></ul> |
| 140.0 | <ul><li>'LG전자 프라엘 워시팝 초음파 진동클렌저 코코넛 화이트_BCP2 (#M)홈>화장품/미용>뷰티소품>메이크업브러시>브러시세트 Naverstore > 화장품/미용 > 뷰티소품 > 메이크업브러시 > 브러시세트'</li><li>'슬룸 허리편한케어 허리마사지기 마사지베개 스트레칭 온열 진동 안마기 1개 [48% 할인] 허리편한케어 + 크림 (#M)생활/건강>안마용품>안마기 GFK > Naverstore > 건강/의료용품 > 안마용품 > 쿠션안마기'</li><li>'엘지 프라엘 바디스파 SSP1 (#M)홈>화장품/미용>바디케어>바디케어세트 Naverstore > 화장품/미용 > 바디케어 > 바디케어세트'</li></ul> |
| 47.0 | <ul><li>'HDMI+USB 통합 KVM 케이블 (1.5M, 2M, 3M, 5M) '</li><li>'시스라인 CBD-600H 6m, 1개 '</li><li>'강원전자 넷메이트 KVM USB Stereo 케이블 '</li></ul> |
| 159.0 | <ul><li>'[위니아]클라쎄 컨버터블 김치냉장고 120리터 KAE112SSM4MSV(AK) (#M)냉장고>김치 냉장고>뚜껑형 GFK > traverse > 11st > 가전/디지털 > 냉장고 > 김치 냉장고 > 뚜껑형'</li><li>'비스포크 키친핏 김치냉장고 3도어 RQ33C74B1W6 (313L, 새틴 화이트, 1등급) (#M)냉장고>김치 냉장고>스탠드형>3도어 GFK > 11st > 가전/디지털 > 냉장고 > 김치 냉장고 > 스탠드형'</li><li>'삼성전자 RQ33C74C3AP 비스포크 김치플러스 키친핏 새틴 베이지+그레이 3도어 냉장고 국민전자 (#M)냉장고>김치 냉장고>스탠드형>3도어 GFK > traverse > 11st > 가전/디지털 > 냉장고 > 김치 냉장고'</li></ul> |
| 93.0 | <ul><li>'삼성전자 15L 대형 대용량 업소용 공업용 산업용 영업용 유선 청소기 강력한 흡입력 홈>생활 가전>청소기;(#M)홈>전체상품 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li><li>'백마스터 연동 청소기 VQ1530SFDC VQ1220PF 프레레 집진기 EVC-20P 이엑스파워 선택2. 연동형 20L VQ1220PFC (#M)홈>청소기>유선청소기 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li><li>'디월트 청소기 건습식 송풍기능 23L,45L,61L 모음 DXV23P,45P,61P 호스 선택02. DXV45P(45L) (#M)홈>전동공구>디월트 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li></ul> |
| 201.0 | <ul><li>'키친아트 허브 올인원 전기튀김기 3리터 KF-P4144NK (#M)가전·컴퓨터>주방가전>기타 주방가전>정수기 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 기타 주방가전 > 정수기'</li><li>'테팔 튀김기 컴팩트 프로 전기튀김기 FR3220 FR3220KR (#M)11st>주방가전>업소용 주방가전>튀김기 11st > 가전/디지털 > 주방가전 > 업소용 주방가전 > 튀김기'</li><li>'키친아트/라팔/프리미엄/분리형/바스켓/전기 튀김기 KA-P730 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 에어프라이어/전기오븐/찜기 > 전기 튀김기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 에어프라이어/전기오븐/찜기 > 전기 튀김기'</li></ul> |
| 70.0 | <ul><li>'(현대Hmall)LG 27UL550 UHD HDR 피벗 높이조절 27인치 화이트 모니터 (#M)위메프 > 가전·디지털·컴퓨터 > 모니터/프린터 > 모니터 > 일반 모니터 위메프 > 가전·디지털·컴퓨터 > 모니터/프린터 > 모니터 > 일반 모니터'</li><li>'LG전자 그램 뷰 View+ 16MQ70 포터블 모니터 새제품 진열제품(C급 액정기스 일부) (#M)11st>모니터>일반 모니터>58cm이하(~23인치) 11st > 가전/디지털 > 모니터 > 일반 모니터 > 58cm이하(~23인치)'</li><li>'알파스캔 에이건 AGON 323QCX2 QHD 155 프리싱크 HDR 게이밍 모니터 (#M)11st>모니터>게이밍 모니터>144Hz 이상 11st > 가전/디지털 > 모니터 > 게이밍 모니터 > 144Hz 이상'</li></ul> |
| 20.0 | <ul><li>'위닉스 H13등급 필터 제로/2.0/S/플러스/WACU300/WACU150 모음전 호환용필터 선택05 - 타워Q_프리미엄형 쇼킹딜 홈>가전>계절가전>가습/제습/청정기;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li><li>'정품 위닉스공기청정기필터 타워Q CAF-D0S5 D필터 (#M)11st>생활가전>청소기부품>액세서리 기타 11st > 가전/디지털 > 생활가전 > 청소기부품 > 액세서리 기타'</li><li>'[행사] 위닉스 공기청정기 필터 교환 세트 전기종 호환 1. 위닉스 타워Q 호환 (CAF-D0S5)_헤파플러스 (헤파단일) 쇼킹딜 홈>가전>계절가전>가습/제습/청정기;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li></ul> |
| 177.0 | <ul><li>'풀무원 글라스쿡 글라스 유리바스켓 에어프라이어 3리터 (#M)디지털/가전>주방가전>에어프라이어 Naverstore > 가전 > 주방가전 > 에어프라이어 > 바스켓형'</li><li>'테팔 3.5L 에어프라이어 이지프라이 에센셜 EY-1308KR (#M)가전·컴퓨터>TV·냉장고·세탁기>세탁기·건조기>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 세탁기·건조기 > 그외 브랜드'</li><li>'쿠쿠전자 쿠쿠 CAF-G0610TB (#M)디지털/가전>주방가전>에어프라이어 Naverstore > 가전 > 주방가전 > 에어프라이어 > 바스켓형'</li></ul> |
| 188.0 | <ul><li>'키친아트 제로 304 무선 전기주전자 1.2리터 (#M)홈>디지털/가전>주방가전>전기포트>무선포트 Naverstore > 가전 > 주방가전 > 전기포트 > 분유포트'</li><li>'키친아트 무선 유리 스텐 전기 커피 주전자 포트 모음 급속가열 360도 회전받침대 SEP-C1700KP (#M)11st>주방가전>전기포트>무선포트/주전자 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li><li>'신일 무선 티포트 전기주전자 45. 키친아트 KK-551MH (#M)가전·컴퓨터>주방가전>전기주전자>무선포트 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기주전자 > 무선포트'</li></ul> |
| 57.0 | <ul><li>'(EFM) IPTIME POE4002 4포트 기가비트 스위칭허브 +1 UP링크 (SFP COMBO 포트) (#M)디지털/가전>네트워크장비>스위칭허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li><li>'IPTIME H6008-IGMP 스위칭 허브 스위치 8포트 (#M)홈>허브(HUB)>스위칭 허브>기가 스위칭 허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li><li>'EFM네트웍스 아이피타임 H6008 8포트 기가비트 스위칭허브 홈>스위칭 허브;(#M)홈>스위칭 허브>1GHz 스위칭허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li></ul> |
| 190.0 | <ul><li>'SK매진 전자식 전자레인 20L MWO-20EC2 (#M)11st>주방가전>전자레인지>전자레인지 11st > 가전/디지털 > 주방가전 > 전자레인지 > 전자레인지'</li><li>'LG전자 MW23BD (#M)디지털/가전>주방가전>전자레인지 Naverstore > 가전 > 주방가전 > 전자레인지'</li><li>'SK매직 MWO-M8A02 (#M)11st>주방가전>전자레인지>전자레인지 11st > 가전/디지털 > 주방가전 > 전자레인지 > 전자레인지'</li></ul> |
| 45.0 | <ul><li>'[악세사리]스킨세이버R2 홈>악세사리, 소모품;홈>디지털/가전>계절가전>히터>연탄/화목난로;홈>마이스토브;홈>캠핑화목난로>마이스토브;(#M)홈>악세사리, 소모품>설치 악세사리 Naverstore > 가전 > 계절가전 > 난방가전 > 연탄/화목난로'</li><li>'[국내생산] 포시즌 전기발난로 발찜질기 발온열기 풋워머 발히터 보온 실내화 슬리퍼 사무실 옵6) 땡땡이_멀티B형 (#M)가전·컴퓨터>계절가전>전기히터>전기히터 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 전기히터'</li><li>'21센추리 사무실 전기 발난로 히팅패드 파티션 히터 10cm 더 넓게 195W 21센추리 파티션히터+담요(색상랜덤)+보관가방 (#M)디지털/가전>계절가전>히터>전기히터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 전기히터'</li></ul> |
| 9.0 | <ul><li>'장우컴퍼니 JW-HTKM01 메모리 방열판 (블랙) (#M)디지털/가전>PC부품>쿨러>방열판 GFK > Naverstore > 컴퓨터 > 부품 > 쿨러 > 방열판'</li><li>'JONSBO M.2 방열판 NVME PS5 SSD 방열판 M2-3 (그레이,레드,블랙) 존스보 M2-3_(블랙) (#M)11st>PC부품>쿨러>기타 11st > 가전/디지털 > PC부품 > 쿨러 > 기타'</li><li>'PC 컴퓨터 케이스 120MM RGB LED 쿨러 파워 전원 인텔 타워형 CPU쿨러 교환 튜닝 냉각 쿨링팬 (#M)11st>PC부품>쿨러>케이스용 11st > 가전/디지털 > PC부품 > 쿨러 > 케이스용'</li></ul> |
| 105.0 | <ul><li>'모스큐 가정용 모기퇴치기 벌레 날파리 포충기 무선 포충등 한정수량 55%이벤트 모기퇴치기 (#M)홈>디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li><li>'Thermacell 써마셀 백패커 모기퇴치기 훈증기 향매트 2세대 모기퇴치기2.0+파우치+4시간용 리필매트 4개 홈>전체상품;홈>생활/건강>생활용품>해충퇴치용품>리퀴드;(#M)홈>디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li><li>'[끈끈이13장+8종+2개이상구입시 개당5천] 스카이에프 모기 파리 해충퇴치기 포충기 스카이에프플러스(끈끈이13장+8종+복수할인) (#M)디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li></ul> |
| 54.0 | <ul><li>'HDMI 리피터 EXTENDER 랜선 UTP 연장기 150M 송수신기세트 '</li><li>'HDMI 리피터 UTP 거리연장기 익스텐더 송수신기 세트 150M '</li><li>'넥시 HDMI 무선 송수신기 30M NX-WHR30 NX1076 '</li></ul> |
| 10.0 | <ul><li>'스위치 접착식 하부 흡음재(120pcs) (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li><li>'SW 빈티지 기계식 키보드 스위치 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li><li>'스테빌 철심 패드 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li></ul> |
| 52.0 | <ul><li>'LDW931 LTE 라우터 와이파이 동글 유심 카파이 5채널 제품 (#M)디지털/가전>네트워크장비>라우터 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 라우터'</li><li>'갤럭시 5G 라우터 모바일 포켓 와이파이 심프리 SCR01 화이트 (#M)디지털/가전>네트워크장비>라우터 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 라우터'</li><li>'갤럭시 5G 모바일 라우터 화이트 SCR01 Galaxy 5G 와이파이 SIM 프리 (#M)디지털/가전>네트워크장비>라우터 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li></ul> |
| 172.0 | <ul><li>'라셀르 업소용냉장고 45박스 간냉식 올냉장 LS-1025R (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li><li>'뷔페 셀프바 반찬 냉장고 샐러드 김밥 보관통 업소용 D1 (뚜껑 포함) (#M)11st>냉장고>4도어 냉장고>4도어 냉장고 11st > 가전/디지털 > 냉장고 > 4도어 냉장고 > 4도어 냉장고'</li><li>'유니크대성 냉장냉동고 테이블냉장고 업소용작업대 냉장-선택19 메탈1500-아날로그 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li></ul> |
| 42.0 | <ul><li>'무중력가습기 무선 가습기 디퓨저, 1000ml, 아로마 테라피, 4000mAh 배터리, 충전식 에센셜 오일 무중력가습기 무선 가습기 디퓨저, 1000ml, 아로마 테라피, 4000mAh 배터리, 충전식 에센셜 오일_04 FF (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li><li>'거치대 차량용 송풍구 초음파 충전식 무선 가습기 통풍구 NEO2M 958차량용가습기 (#M)홈>생활/건강>자동차용품>편의용품>차량용가습기 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 차량용 가습기'</li><li>'가습기 불멍 물멍 대용량 미니 청소쉬운 차량용 컬러풀 D 초음파 에센셜 오일 아로마 디퓨저 3L, 더블 가습기 불멍 물멍 대용량 미니 청소쉬운 차량용 컬러풀 D 초음파 에센셜 오일 아로마 디퓨저 3L, 더블_01 WHITE (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li></ul> |
| 113.0 | <ul><li>'삼성전자 Crystal UHD KU55UD7000FXKR 스탠드형 R3 (#M)TV>138~175cm (55~69인치)>138~175cm (55~69인치) GFK > traverse > 11st > 가전/디지털 > TV > 138~175cm (55~69인치) > 138~175cm (55~69인치)'</li><li>'삼성전자 2024 QLED 4K KQ65QD83AFXKR 스탠드형 (사운드바포함) (#M)디지털/가전>영상가전>TV>QLEDTV GFK > naver_plus_traverse > Naverstore > 가전 > TV > QLEDTV'</li><li>'2022년형 신제품 더함 50인치 퀀텀닷 안드로이드 OS11 스마트TV UA501QLED 기본스탠드(TV다리) 기사방문설치_UA501QLED 홈>[NEW]우버 AMG 안드로이드TV;홈>[NEW]안드로이드 스마트 TV;(#M)홈>인치별>50인치TV Naverstore > 가전 > TV > QLEDTV'</li></ul> |
| 137.0 | <ul><li>'하이맥스 CL-9700K 바리깡 / 클리퍼 / 전문가용 이발기 / 신형 (#M)디지털/가전>이미용가전>이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li><li>'하이맥스 CL-300 장미 토끼 바리깡 미용실 전문가용 남자 이발기 히다치 가정용 CL-300 화이트 (#M)홈>디지털/가전>이미용가전>이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li><li>'아지아 전문가용 미용실 바리깡 스마트오토 JP-700 홈>전문가용이발기;(#M)홈>전문가용 이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li></ul> |
| 221.0 | <ul><li>'고프로 히어로 배터리 13 12 11 10 9 8 7 6 5 4 고프로13 전용 엔듀로배터리 정품 (#M)디지털/가전>카메라/캠코더용품>충전기/배터리>전용정품배터리 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 충전기/배터리'</li><li>'큐라덴 큐라프록스 하이드로소닉 Easy 3단 음파전동칫솔 (핸들 1+리필모 1+충전기+케이스) (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > live > Naverstore > Shop Live > 테크 > 20250121 > 19:30 ~ 21:30'</li><li>'카메라 DSC-W300 충전기 NP BG1 배터리 1800mAh 04 2batterycharger_01 CHINA (#M)카메라/주변기기>배터리/충전기>전용배터리 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 배터리/충전기 > 전용배터리'</li></ul> |
| 32.0 | <ul><li>'21센추리 업소용 에어커튼 EKOVIM-G1-09 날벌레차단 출입문 먼지차단 자가설치가능 CYA-A090 출입문용 (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li><li>'21센추리 업소용 에어커튼 EKOVIM-G1-09 날벌레차단 출입문 먼지차단 자가설치가능 EKOVIM-G1-09 일반용 (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li><li>'신일 에어커튼 업소용 산업용 날벌레차단 냉기차단 현관 출입문 900mm 원모터 900mm(원모터) (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li></ul> |
| 33.0 | <ul><li>'AK몰_21센추리 창문형에어컨 CINT-8100R 초절전인버터 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 에어컨 > 벽걸이 에어컨 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 에어컨 > 벽걸이 에어컨'</li><li>'삼성전자 삼성 Q9000 AF17B6474GZRS 멀티형에어컨 전국 기본설치비포함 1.일반배관 (#M)디지털/가전>계절가전>에어컨>멀티형에어컨 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 멀티형'</li><li>'삼성 비스포크 창문형에어컨 윈도우핏 AW05B5171BWA 17㎡ 새틴 블루 창문매립형 본사설치[X] 11st > 가전/디지털 > 계절가전 > 에어컨 > 창문형;(#M)11st>계절가전>에어컨>창문형 11st > 가전/디지털 > 계절가전 > 에어컨 > 창문형'</li></ul> |
| 202.0 | <ul><li>'키친아트 신제품 1구 하이라이트 전기 레인지 가정용 원룸 휴대용 소형 1인용 캠핑 미니 인덕션 모델명 : KP-8011 (#M)홈>디지털/가전>주방가전>하이라이트 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이라이트'</li><li>'SK매직 빌트인 매립형 스탠드형 프리스탠딩 3구 하이라이트 전기레인지 / ERABT300M 스탠드타입(높이8CM) (#M)11st>주방가전>전기레인지>하이라이트 11st > 가전/디지털 > 주방가전 > 전기레인지 > 하이라이트'</li><li>'보랄 DUO 2구 하이라이트 BR-TH5800FY 인덕션 전기렌지 주방용품 집들이선물 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이라이트'</li></ul> |
| 111.0 | <ul><li>'텐바이텐 정품 MS 윈도우 10 프로 한글 FPP 처음사용자용 설치USB 병행 (#M)위메프 > 가전·디지털·컴퓨터 > PC부품/주변기기/저장장치 > PC주변기기 > 케이블/젠더 위메프 > 가전·디지털·컴퓨터 > PC부품/주변기기/저장장치 > PC주변기기 > 케이블/젠더'</li><li>'마이크로소프트 윈도우11홈 FPP 처음사용자용 한글 (USB) 온라인 공식 판매 인증점 (#M)컴퓨터 주변기기>소프트웨어>운영체제(OS) GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 소프트웨어 > 운영체제(OS)'</li><li>'5천원 쿠폰💖 [마이크로소프트] Windows 10 Pro 처음사용자용 패키지(FPP) [한글/USB타입] (#M)디지털/가전>소프트웨어>운영체제 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 12.0 | <ul><li>'(24시 상품발송) PC/스팀 한글판 Raft 래프트 레프트 NA 래프트 NA (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li><li>'(스팀코드 24시간 자동발송) Victoria 3 빅토리아 3 AA 모든계정에 등록가능 1.빅토리아 3 AA (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li><li>'(10초발송 스팀 스팀게임) 라스트 에폭 NA Last Epoch 라스트에폭 AA모든 (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li></ul> |
| 131.0 | <ul><li>'포레오 진동클렌저 루나 4 고 에버그린 1개 루나 4 고 (에버그린)+선물박스 (소) (#M)디지털/가전>이미용가전>기타이미용가전 LO > window_fashion_town > Naverstore > FashionTown > 뷰티 > CATEGORY > 뷰티 디바이스 > 기타'</li><li>'글로비 다크리스 색소침착 마사지기 다크서클 홈케어 다크써클 본품1개(1월20일 소량입고) (#M)디지털/가전>이미용가전>피부케어기기 GFK > traverse > Naverstore > 가전 > 이미용가전'</li><li>'포레오 진동클렌저 루나 4 (민감성 피부) 1개 루나 4 (민감성 피부)+선물박스 (대) (#M)디지털/가전>이미용가전>기타이미용가전 LO > window_fashion_town > Naverstore > FashionTown > 뷰티 > CATEGORY > 뷰티 디바이스 > 기타'</li></ul> |
| 166.0 | <ul><li>'에버홈 EV-RG3000 투명창 듀얼 필터 생선구이기. (#M)주방가전>전기그릴/전기팬>전기그릴 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 전기그릴/전기팬 > 전기그릴'</li><li>'쿠쿠 양면 멀티 그릴 CFR-331R (#M)디지털/가전>주방가전>생선그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 생선그릴'</li><li>'[에버홈] 생선구이기 점보 (#M)주방가전>전기포트>무선포트/주전자 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li></ul> |
| 71.0 | <ul><li>'샤오미 미지아모니터조명 LED MJGJD02YL 2세대 (#M)디지털/가전>모니터주변기기>기타모니터주변기기 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 기타'</li><li>'[카멜인터내셔널] 베사 확장 브라켓, VC-1 [200X200mm 변환] (#M)디지털/가전>모니터주변기기>기타모니터주변기기 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 기타'</li><li>'스톤힐 MS-01 모니터 받침대 듀얼 스탠드 다용도 선반 MS-01 400(400mm)_블랙(업그레이드-높이8cm) (#M)디지털/가전>모니터주변기기>기타모니터주변기기 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li></ul> |
| 224.0 | <ul><li>'DJI Osmo 마그네틱 볼 조인트 어댑터 마운트 (#M)디지털/가전>카메라/캠코더용품>액션캠 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li><li>'인스타360 ACE PRO2 에이스 프로2 다이브 번들 정품 액션캠 포인트 포함 256GB로 변경 (#M)디지털/가전>카메라/캠코더용품>액션캠 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li><li>'포토토 빈티지 캠코더 레트로 Y2K 미니 비디오 카메라 핑크 (#M)디지털/가전>카메라/캠코더용품>캠코더 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li></ul> |
| 48.0 | <ul><li>'ipTIME A2004SE 기가비트 와이파이 공유기 유무선 아이피타임 라이트 메시 무선 인터넷 WIFI (#M)컴퓨터 주변기기>공유기>유무선공유기 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 공유기 > 유무선공유기'</li><li>'EFM네트웍스 아이피타임 N704EPlus (#M)홈>디지털/가전>네트워크장비>공유기>유무선공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li><li>'아이피타임 ipTIME A3008-MU WIFI 유무선 공유기 YBS (#M)홈>디지털/가전>네트워크장비>공유기>유무선공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li></ul> |
| 121.0 | <ul><li>'무선 핀마이크 유튜브 휴대용 방송용 강의용 마이크 스마트폰 블루투스 마이크 보이스원 프로 M-70RW-PRO (#M)디지털/가전>음향가전>마이크>무선마이크 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 1인방송/촬영 > 스마트폰용품'</li><li>'듀얼 무선 블루투스 마이크 무선스피커 버스킹마이크 노래방 앰프 가정용 앰프마이크 블루투스스피커MP3 본품+NV179-저속충전기 (#M)음향가전>마이크>무선마이크 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 마이크 > 무선마이크'</li><li>'마이크론 Crucial T500 히트싱크 M.2 NVMe 대원씨티에스 (2TB) (#M)저장장치>SSD>1TB이상 GFK > traverse > 11st > 가전/디지털 > 저장장치 > SSD > 1TB이상'</li></ul> |
| 151.0 | <ul><li>'삼성전자 외장하드 Y3 SLIM 2TB 파우치 패키지 HX-MK20Y 01.Y3+파우치 증정_2TB_스모키 그레이 (25년형) + 파우치 (#M)디지털/가전>저장장치>외장HDD GFK > traverse > Naverstore > 컴퓨터 > 저장장치 > 외장하드'</li><li>'씨게이트 외장하드 4TB 4테라 외장HDD 스페이스그레이 [데이터복구+파우치] One Touch HDD 5TB 데이터복구_실버+전용파우치 (#M)디지털/가전>저장장치>외장HDD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장하드'</li><li>'삼성전자 삼성 외장하드 J3 Portable USB3.0 2TB 외장 HDD [공식인증점] 도착보장 상품 (주문즉시 발송진행)_2TB 블랙 (#M)디지털/가전>저장장치>외장HDD GFK > naver_plus_traverse > Naverstore > PC/주변기기 > 저장장치 > 외장하드'</li></ul> |
| 0.0 | <ul><li>'HP일체형PC 올인원 게이밍컴퓨터 RTX3050 인텔13세대 가정용 기업용 화상회의 파워팩(총32G업+윈11홈정품/개봉설치)_NVMe 1TB 교체(개봉장착) (#M)홈>🖥데스크탑 Naverstore > 컴퓨터 > 데스크탑 > 브랜드PC > HP'</li><li>'[✨삼성슈퍼위크 72만+메모리 무상UP] 삼성전자 삼성 DM500TFA-A38A 데스크탑 인텔 13세대 i3 가성비 인강용 사무용 PC 1. 참여(한컴오피스 동봉)_1. 참여(완료 시 DROP 키보드)_삼성 메모리 8GB(개봉장착) (#M)디지털/가전>PC>브랜드PC GFK > Naverstore > 컴퓨터 > 데스크탑'</li><li>'삼성 데스크탑 DM500TEA-A78A 고사양 사무용 인텔 12세대 i7 컴퓨터 삼성PC 1. 참여(한컴오피스 동봉)_2.NEW✨DM500TFA-A78A(13세대) 홈>전체상품;홈>데스크탑>12세대 CPU;(#M)홈>삼성데스크탑>12세대 CPU Naverstore > 컴퓨터 > 데스크탑 > 브랜드PC > 삼성전자'</li></ul> |
| 136.0 | <ul><li>'비달사순 에어스타일러 VSAS80PIK 비달사순 에어스타일러 VSAS80PIK 홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li><li>'포뷰트 엠스타일러 포뷰트 엠스타일러 홈>남성>헤어케어>헤어 기기;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>스타일링>왁스/젤/무스;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 남성 > 헤어케어 > 염색/다운펌/기기'</li><li>'청담스타일 뿌리펌 브러쉬 청담스타일 뿌리펌 브러쉬 (그레이) (#M)홈>청담스타일 고데기 Naverstore > 가전 > 이미용가전 > 헤어스타일러 > 에어브러시'</li></ul> |
| 59.0 | <ul><li>'솔텍 SFC200-SCS 싱글모드 100Mbps 광컨버터 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li><li>'넥시 AV 아날로그 3RCA to HDMI 변환 컨버터 NX648 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li><li>'랜스타 LS-AV2HD AV컨버터 3RCA to HDMI 1080P 지원 양방향 불가 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li></ul> |
| 58.0 | <ul><li>'ipTIME(아이피타임) A1004 기가비트 유무선공유기 Wi-fi 안테나 3개 5GHz, 2.4GHz 듀얼밴드 홈>전체상품;(#M)홈>브랜드관>ipTime(공유기,랜카드)>유무선 공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li><li>'COMS 무선 안테나 암,수 Wi-Fi Antennas 2.4Ghz 5dbi-RP-SMA 5dbi-RP-SMA (암) (#M)디지털/가전>네트워크장비>안테나 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 안테나'</li><li>'포켓 라디오 소리큰 비상용 라디오 재난용 초미니 라디오 안테나 mp3플레이어 라디오 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li></ul> |
| 7.0 | <ul><li>'[INTEL] Arc A770 Limited Edition D6 16GB (#M)디지털/가전>PC부품>그래픽카드>기타계열 Naverstore > 컴퓨터 > 부품 > 그래픽카드 > 기타계열'</li><li>'GIGABYTE 지포스 RTX 4060 Ti EAGLE D6 8GB 피씨디렉트 (#M)홈>디지털/가전>PC부품>그래픽카드>NVIDIA계열 Naverstore > 컴퓨터 > 부품 > 그래픽카드 > NVIDIA계열'</li><li>'갤럭시 GALAX RTX 3080 EX 게이머 WHITE OC 10GB 24년 8월~10월 무상as 남음 풀박스제품 3팬 화이트 (#M)디지털/가전>PC부품>그래픽카드>NVIDIA계열 GFK > Naverstore > 컴퓨터 > 부품 > 그래픽카드 > NVIDIA계열'</li></ul> |
| 155.0 | <ul><li>'네스프레소 에어로치노4 NESPRESSO 유럽 직배송 (#M)홈>전체상품 Naverstore > 디지털/가전 > 주방가전 > 거품/반죽기'</li><li>'네스프레소 에어로치노4 (#M)디지털/가전>주방가전>거품/반죽기 Naverstore > 가전 > 주방가전 > 커피용품 > 우유거품기'</li><li>'오펠 스탠드믹서 1100W 거품기 반죽기 휘핑기 OFM-1504 레트로베이지 (#M)디지털/가전>주방가전>거품/반죽기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 거품/반죽기'</li></ul> |
| 46.0 | <ul><li>'이지넷유비쿼터스 NEXT-7602KVM-4K 2포트 HDMI KVM스위치 화이트 (#M)디지털/가전>네트워크장비>KVM스위치 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li><li>'ATEN KL1516AIN 19인치 Cat5 LCD KVM 스위치 듀얼레일 Over IP LCD콘솔 (#M)디지털/가전>네트워크장비>KVM스위치 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li><li>'이지넷유비쿼터스 NEXT-7102KVM-4K 2x1 HDMI USB UHD 4K KVM 스위치 (#M)디지털/가전>네트워크장비>KVM스위치 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li></ul> |
| 229.0 | <ul><li>'샤오미 이북리더기 E북리더기 전자책리더기 mi reader 미리더 '</li><li>'밀리의서재 E북리더기 + 밀리의서재 12개월 구독권 '</li><li>'[ 설 선물대첩 ] 이노스페이스원 루나 6인치 이북리더기 범용기 루나X+퍼플스킨 (#M)디지털/가전>학습기기>전자책 GFK > traverse > Naverstore > 디지털 > 태블릿PC > 전자책 > 본체'</li></ul> |
| 34.0 | <ul><li>'천장형 시스템 에어컨 바람막이 윈드 플렉스 가림막 윈드플렉스 투명 1개 (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li><li>'천장형 시스템에어컨 실링팬 화이트 올트팬 바람막이 순환프로펠러 윈드바이저 에어컨 바람개비 천정형 에어컨 실링팬 화이트 (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li><li>'천장형 시스템 에어컨바람막이 LG 삼성 공용(4way 1세트) (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li></ul> |
| 78.0 | <ul><li>'오랄비 iO9 전동칫솔 블랙 오닉스 (핸들1+리필모4+충전기+충전케이스)+( )치간칫솔 10개입 치간칫솔 10개입 [GW344]_iO9 블랙 오닉스[Q034]_얼티밋화이트4입[Q039] (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > Naverstore > oralbkr브랜드스토어 > 전동칫솔 > iO Series'</li><li>'2080 소닉클론 음파진동 기획팩 (본품1+리필3) 2080 소닉클론 음파진동 기획팩 (본품1+리필3) 홈>건강/위생용품>덴탈케어>전동칫솔/세정기;홈>건강/위생용품>구강용품>전동칫솔/세정기;(#M)홈>구강/건강용품>구강용품>전동칫솔/세정기 OLIVEYOUNG > 베스트 > 구강/건강용품'</li><li>'식스비 3단 유아 음파 전동칫솔 전용 칫솔모 3단유아_옐로우칫솔모(2EA) (#M)디지털/가전>생활가전>구강청정기>전동칫솔모 Naverstore > 가전 > 욕실가전 > 전동칫솔모'</li></ul> |
| 44.0 | <ul><li>'힘펠 터보팬 JV-102 환풍기 욕실 저소음 정풍량 고성능 역류방지 전동댐퍼 자가설치(직접설치) (#M)디지털/가전>계절가전>공기정화기>환풍기 GFK > traverse > Naverstore > 가전 > 계절가전 > 공기청정기'</li><li>'한일 화장실 환풍기 욕실 환풍기 환기팬 셔터형 35cm (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li><li>'힘펠 욕실/화장실 환풍기 플렉스 C2-100LF 역류방지 냄새차단 자가설치 중정압/저소음 1.제로크(No.11~22)_12-1.제로크HV3-80X(MD) F그릴_자가설치 (#M)디지털/가전>계절가전>공기정화기>환풍기 Naverstore > 가전 > 계절가전 > 공기청정기 > 환풍기'</li></ul> |
| 1.0 | <ul><li>'레노버 씽크스테이션 P360 Ultra-30G1S01N00 i7-12700 16G 512G ( 11월 입고) (#M)홈>전체상품 Naverstore > 컴퓨터 > 데스크탑 > 서버/워크스테이션'</li><li>'[Dell] PowerEdge T350 E-2378G 8GB 480GB SSD 600W(1+1) H755 '</li><li>'워크스테이션 DELL T7910 24코어 48스레드 128G 홈>디지털/가전>PC>서버/워크스테이션;(#M)홈>디지털가전 Naverstore > 컴퓨터 > 데스크탑 > 서버/워크스테이션'</li></ul> |
| 129.0 | <ul><li>'삼성전자 삼성 HW-Q990D '</li><li>'벽걸이 타공형 슬림 사운드바거치대 심플 사운드바 브라켓 셀프인테리어 캣 벽걸이 선반 켓 사운드바 사운드바 브라켓 (#M)음향가전>홈시어터>홈시어터 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 홈시어터 > 홈시어터'</li><li>'브리츠 BZ-T3600 '</li></ul> |
| 222.0 | <ul><li>'NEXI 넥시 USB3.0 Type-C A 카드리더기 NX1479 [0001](NEXI) 넥시 USB3.0 Type-C A 카드리 (#M)휴대폰>선불폰/기타>선불유심 GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 선불폰/기타 > 선불유심'</li><li>'POS 신용카드 리더기 MSR-1000 USB 마그네틱리더기 '</li><li>'무인정산기 주차장 자판기 키오스크 단말기 신용카드리더기 TL3500BP '</li></ul> |
| 206.0 | <ul><li>'대용량약탕기 가정용약탕기 홍삼 중탕기 제조기 6L (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 홍삼제조기'</li><li>'[티울림 건강포트] 약탕기 티포트 중탕기 전기 가정용 차탕기 홍삼제조기 뉴베이지 (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li><li>'오쿠 도자기 단지 패킹 / 전 도자기 사용 가능 (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li></ul> |
| 15.0 | <ul><li>'XBOX 오버쿡드 + 오버쿡드2 (코드전송) 한국 계정은 등록법 참조 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > XBOX > 게임타이틀'</li><li>'닌텐도 링피트 어드벤처 링콘 세트 스위치 스포츠 게임 팩 링핏 다이어트 운동 ☆신작☆ 링피트어드벤처 + 저스트댄스 2023 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > 닌텐도 > 게임타이틀'</li><li>'닌텐도 스위치 슈퍼 마리오 RPG 특전 칩케이스 마리오RPG + 버섯 칩케이스 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > 닌텐도 > 게임타이틀'</li></ul> |
| 226.0 | <ul><li>'코닥 골드 필름 200 36컷 + 코닥 컬러플러스 필름 200 36컷 1세트 단품 '</li><li>'스몰리그 X FILM RIOT 10 in 1 접이식 멀티툴 키트 레드 4813 '</li><li>'코닥 필름카메라 필름 컬러플러스 200/36 '</li></ul> |
| 50.0 | <ul><li>'40Gb/s QSFP+ 광모듈 트랜시버 NEXT-QSFP40G-SR4 '</li><li>'이지넷유비쿼터스 넥스트유 SFP10G-LR-H '</li><li>'ipTIME SFP-UTP1G RJ45 모듈 기가비트 100M 거리 지원 '</li></ul> |
| 17.0 | <ul><li>'미니 가습기 휴대용 USB 초음파 아로마 에센셜 오일 디퓨저 220ml 가정용 자동차 미스 미니 가습기 휴대용 USB 초음파 아로마 에센셜 오일 디퓨저 220ml 가정용 자동차 미스_03 green (#M)가전·컴퓨터>계절가전>USB·스틱가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > USB·스틱가습기'</li><li>'USB가습기 밤Bomb USB 가습기 간편세척 청소쉬운 가정용 미니 탁삭용 거실 휴대용 비염 하늘 (#M)홈>디지털/가전>계절가전>가습기>가습기필터 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li><li>'아로마 불꽃 가습기USB 충전식 디퓨저 미스트 분무기, 사무실 차량 공기 청정기 장식, 침실 장식품 아로마 불꽃 가습기USB 충전식 디퓨저 미스트 분무기, 사무실 차량 공기 청정기 장식, 침실 장식품_03 분홍색 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li></ul> |
| 218.0 | <ul><li>'카드 DJI 케어 리프레쉬 2년 플랜 (오즈모 액션 4) (#M)SSG.COM>카메라/캠코더>촬영용 드론 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 촬영용 드론'</li><li>'유프로 프리미엄2 액션캠 브이로그카메라 유튜브카메라 블랙 본품 '</li><li>'팅크웨어 아이나비 모빌리티 액션캠 MC-1 '</li></ul> |
| 214.0 | <ul><li>'입문용카메라 초보자 디지털 카메라 가성비 dslr 4K '</li><li>'캐논정품 EOS 90D바디만(미개봉 새상품)/R '</li><li>'(Hidden) 정품 소니 알파 A350 '</li></ul> |
| 144.0 | <ul><li>'스타롤 충전식 열헤어롤 블랙 스타롤 충전식 열헤어롤 블랙 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li><li>'전기 헤어롤 여행용 비달사순 헤어 세팅기 롤 구르프 셋팅롤 VSHS10BK(N) (#M)홈>게릴라특가 Naverstore > 가전 > 이미용가전 > 헤어스타일러 > 헤어롤/롤셋'</li><li>'스타롤 빅스타롤 충전식 열헤어롤 민트 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li></ul> |
| 89.0 | <ul><li>'키스뉴욕 마그네틱 원 큐 램프 큐어 핀큐어 휴대용 젤램프 스탠드 거치대 포함+선물선택 고급 오일펜 (#M)디지털/가전>이미용가전>손발톱정리기 GFK > naver_plus_traverse > Naverstore > 가전 > 이미용가전 > 손발케어'</li><li>'파파 와이드 스탠드 500S 책상 조명 스텐드 LED등 독서등 공부조명 '</li><li>'파나소닉 LED스탠드 5W USB-C 충전방식 접이식 무선스탠드 휴대용스탠드 침대독서등 '</li></ul> |
| 126.0 | <ul><li>'아날로그 휴대용 카세트 플레이어 테이프 MP3변환 레트로 감성 '</li><li>'Byron Statics 휴대용 카세트 플레이어 '</li><li>'롯데알미늄 블루투스 CD플레이어 핑키-500 라디오 (#M)디지털/가전>음향가전>CD플레이어 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 플레이어'</li></ul> |
| 91.0 | <ul><li>'[베스토/BESTO] 핸디형 스팀청소기 BSC-900 홈>청소&세척;(#M)홈>수공구 Naverstore > 가전 > 청소기 > 핸디청소기'</li><li>'샤오미 디어마 스팀청소기 핸디형 살균스팀청소기 ZQ610/600 청소기+리필세트 홈>전체상품;(#M)홈>디지털/가전>생활가전>청소기>스팀청소기 Naverstore > 가전 > 청소기 > 핸디청소기'</li><li>'[대여] 카처SC4 스팀청소기 새걸레 제공 전용 브러쉬 6종 동의 합니다._1/20~24일 수령 후 31일 수거 (#M)디지털/가전>청소기>스팀청소기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 청소기'</li></ul> |
| 228.0 | <ul><li>'초소형녹음기 소형 장시간 휴대용 보이스레코드 32G 32G (#M)디지털/가전>학습기기>보이스레코더 GFK > traverse > Naverstore > 디지털 > 음향기기 > 녹음기'</li><li>'이지렉 초소형 블루투스 보이스레코더 32GB '</li><li>'자체제작 16기가 C타입 초소형 동전크기 대용량 장시간 휴대용 보이스레코더 녹음기 '</li></ul> |
| 147.0 | <ul><li>'엠비에프 USB 3.0 / C타입 외장 ODD DVD-RW '</li><li>'멀티허브 3.0 C타입 레코더기기 외장ODD DVD룸 외장드라이브 레코더 DVD롬 외장 USB ED02 CD A ODD 7IN1 (#M)저장장치>ODD>CD-ROM/RW GFK > traverse > 11st > 가전/디지털 > 저장장치 > ODD'</li><li>'노트북 외장CD롬 ODD 플레이어 DVD콤보 리더기 '</li></ul> |
| 116.0 | <ul><li>'플레오맥스 CD 플레이어 블루투스 라디오 스피커 휴대용 '</li><li>'일우 투명 CD플레이어 IW-ET07 휴대용 충전식 레트로 감성 '</li><li>'아이리버 올인원 CD 플레이어 턴테이블 디자인 라디오 블루투스 스피커 IAB40 '</li></ul> |
| 125.0 | <ul><li>'인이어이어폰 게이밍이어폰 커널형 마이크 유선 이어폰 탕주 상관완아 탕주 상관완아 블랙_MIC (#M)디지털/가전>음향가전>이어폰 GFK > traverse > Naverstore > 디지털 > 게이밍 > 이어폰/헤드셋'</li><li>'KOSS 코스 포르타 프로 한정판 온이어 유선 헤드폰 Koss Porta Pro 정품 미국발송 (#M)디지털/가전>음향가전>헤드폰 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 헤드폰'</li><li>'인이어이어폰 탕주 상관완아 SE 스튜디오 에디션 커널형 유선 이어폰 탕주 상관완아 SE 화이트 (#M)디지털/가전>음향가전>이어폰 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 이어폰'</li></ul> |
| 149.0 | <ul><li>'USB C TO HDMI 케이블 C타입hdmi 4K 미러링 복제 확장 1M 실버 3M-실버 (#M)디지털/가전>PC부품>PC케이블>변환 젠더/케이블 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li><li>'샌디스크 울트라 듀얼드라이브 고 USB Type C USB 메모리 256GB 묵인하다 (#M)디지털/가전>저장장치>USB메모리 GFK > traverse > Naverstore > 컴퓨터 > 저장장치 > USB메모리'</li><li>'Bliksem TYPE C 플래시 드라이브 OTG 32GB 고속 USB2.0, 컴퓨터 휴대폰용, 3 인 1 미니 펜 01 64GB (#M)저장장치>USB 메모리>카드/주얼리형 GFK > traverse > 11st > 가전/디지털 > 저장장치 > USB 메모리 > 카드/주얼리형'</li></ul> |
| 51.0 | <ul><li>'랜스타 LS-NF8209 랜 케이블 멀티 테스터기 탐지/길이/POE 지원 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li><li>'랜 테스터기 468W 랜선 테스터 UTP 단선체크 RJ45 RJ11 02 랜테스터기 468W 블랙 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li><li>'LS 랜테스터기 UTP RJ45 랜케이블 퀵테스터기 LS-LAN-TQ LS-LAN-TA 분리형타입 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li></ul> |
| 169.0 | <ul><li>'키친아트 와이드 6단 트레이 식품건조기 KKW-KG7000 음식 야채 과일 간식 고기 건조기 KKW-KG7000 (#M)홈>디지털/가전>주방가전>식품건조기 Naverstore > 가전 > 주방가전 > 식품건조기'</li><li>'[6%쿠폰] 키친아트 식품건조기 타이머가능 과일 야채 고추 건조기 GN-232D-타이머기능 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전;(#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 식품건조기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 식품건조기'</li><li>'명품 농산물 다목적 고추건조기 소형 7채반 가정용전기사용 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 식품건조기'</li></ul> |
| 118.0 | <ul><li>'금영 태진 가정용 노래방기계 이동식 세트 '</li><li>'AV-1000 AV1000 휴대용 노래방 가정용 노래방기기 캠핑 차박 (#M)디지털/가전>음향가전>노래반주기 Naverstore > 디지털 > 음향기기 > 노래반주기'</li><li>'코인노래방 기계 풀세트 가정용 노래방 방음부스 태진반주기 '</li></ul> |
| 14.0 | <ul><li>'아싸라봉 닌텐도 스위치 OLED 찐패키지 악세사리 7종 젤리 세트 닌텐도 OLED용-찐(젤리)7종패키지 블랙 (#M)디지털/가전>게임기/타이틀>게임기주변기기>가방/케이스 GFK > Naverstore > 디지털 > 게이밍 > 주변용품'</li><li>'휴대용 게임 콘솔 보관 가방 보호 케이스, 충격 방지 하드 파우치, Asus ROG Ally 액세서리 01 Red (#M)11st>노트북>삼성전자>코어 i5 11st > 가전/디지털 > 노트북 > 삼성전자 > 코어 i5'</li><li>'XBOX 마이크로소프트 엑스박스 무선 컨트롤러 4세대 (로봇화이트) 로봇화이트 (#M)위메프 > 가전·디지털·컴퓨터 > 게임기/게임타이틀 > 게임 주변기기 > XBOX 주변기기 위메프 > 가전·디지털·컴퓨터 > 게임기/게임타이틀 > 게임 주변기기 > XBOX 주변기기'</li></ul> |
| 82.0 | <ul><li>'나노N / 나노팬더 / 나노펭귄 무전기 이어폰 경호용 이어마이크 리시버 인이어 핸드마이크 옵션2(귀걸이형이어마이크) (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기 > 액세서리'</li><li>'무전기 이어마이크 / 인이어 / 리시버 / 리필 이어튜브 / 투명 / 블랙 투명튜브 (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기'</li><li>'무전기이어폰 JM-8000T 스탠다드 이어마이크 외 다른타입 경호용 인이어 리시버 국산 ③ 스탠다드 (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기 > 액세서리'</li></ul> |
| 104.0 | <ul><li>'한일 미니 짤순이 음식물 탈수기 야채 빨래 만능 다용도 NW-Y2020(신모델) (#M)디지털/가전>생활가전>세탁/건조기>탈수기 GFK > traverse > Naverstore > 가전 > 세탁/건조기 > 탈수기'</li><li>'휴앤봇 스텐 가정용 업소용 세탁 빨래 탈수기 짤순이 DL560 (#M)홈>디지털/가전>생활가전>건조기/탈수기>탈수기 Naverstore > 가전 > 세탁기/건조기 > 탈수기'</li><li>'[25년형] 신일 빨래탈수기 스텐 소형 대용량 수영장 의류 세탁 업소용 7kg '</li></ul> |
| 103.0 | <ul><li>'무선UV침구 청소기 빽가 미우새 진드기 빈대 충전식 이불 침구 진드기 침구청소기 자동청소기 무선UV침구청소기-화이트 (#M)생활가전>청소기>스팀청소기>핸디/스틱형 GFK > traverse > 11st > 가전/디지털 > 생활가전 > 청소기 > 스팀청소기'</li><li>'[텐바이텐][Sanrio] 헬로키티 밥솥 홈>텐바이텐 X Sanrio;(#M)홈>전체상품 Naverstore > 가전 > 청소기 > 침구청소기'</li><li>'[텐바이텐][모던하우스] 2중 전기포트 (#M)홈>전체상품 Naverstore > 가전 > 청소기 > 침구청소기'</li></ul> |
| 65.0 | <ul><li>'헤드셋거치대 에어팟맥스 소니 게이밍 헤드폰 걸이 스탠드 (#M)디지털/가전>음향가전>이어폰/헤드폰액세서리>거치대 GFK > traverse > Naverstore > 디지털 > 음향기기 > 이어폰/헤드폰액세서리 > 케이스/거치대'</li><li>'[스냅케이스]프리미엄 가죽 헤드폰 헤드셋 파우치 케이스 수납 가방 휴대용 보관 크림화이트(HP06) (#M)음향가전>이어폰>무선 이어폰 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 이어폰 > 무선 이어폰'</li><li>'[호환] 앱코 해커 B510 이어패드 게이밍 헤드셋 B510U 7.1 커버 H030 (#M)홈>헤드폰 이어패드 Naverstore > 디지털 > 음향기기 > 이어폰/헤드폰액세서리 > 캡/솜/팁'</li></ul> |
| 107.0 | <ul><li>'1초발송 브이엠웨어 워크스테이션 프로 17 개인용 상업용 정품 영구 라이선스 리딤코드 VMware Workstation Pro 워크스테이션 프로 17 개인용 윈도우용 (#M)디지털/가전>소프트웨어>개발툴 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 개발툴'</li><li>'MS SQL Server 2022 Standard Edition CSP 라이선스 (#M)디지털/가전>소프트웨어>운영체제 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 운영체제'</li><li>'비주얼스튜디오 프로 VisualStudio 2022 Pro 영구 라이선스 (#M)디지털/가전>소프트웨어>개발툴 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 개발툴'</li></ul> |
| 191.0 | <ul><li>'렌탈[공식인증]SK매직정수기렌탈 WPU-8230C 의무사용기간 36개월 초기비용면제 09.스스로 직수 냉정수기 2022_의무기간 해피콜 상담 시 결정_60 11st>가전>이미용/생활가전>생활가전;(#M)11st>렌털/가입상품>가전렌털>정수기 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li><li>'렌탈SK매직 미니 직수 정수기 렌탈 단하루 역대급 최대혜택보장 에코미니 정수기_해피콜 상담시 확인 및 결정(1644-5279)하겠습니다._72 11st>렌털/가입상품>가전렌털>정수기;11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li><li>'렌탈[SK매직] 렌탈/라이브방송 기념 상품권 오늘 하루만 35만원 지급/얼음정수기/직수정수기/렌탈료 7천원 할인 01.올인원플러스 직수얼음 정수기(WPUIAC302)_6년약정_72 11st>렌털/가입상품>가전렌털>정수기;11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li></ul> |
| 61.0 | <ul><li>'JWC CCTV 녹화기 500만화소 JDO-8005 8채널 DVR '</li><li>'이지피스 DVR CCTV 녹화기 AHVR-2204L 265 4채널 '</li><li>'다후아 500만화소 4채널 CCTV녹화기 DVR 본체 XVR5104HS-I3 '</li></ul> |
| 124.0 | <ul><li>'인켈 IK-A360CD '</li><li>'사운드디퓨저 음향판 음향디퓨저 (벌집Type) (#M)디지털/가전>음향가전>오디오>오디오액세서리 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 오디오'</li><li>'제네바 제네바스피커 L + 스탠드 '</li></ul> |
| 24.0 | <ul><li>'제크롤 날개없어 안전한 에어쿨러 리모컨 냉풍기 JK-CF3000R 선풍기 기화냉각방식 (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li><li>'[캐리어]공식인증점 캐리어 창문형 에어컨 AWC06FYHS 18.7㎡ (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li><li>'신일전자 기화냉각방식 에어쿨러 이동식 냉풍기 SIF-D700SJ 7L 선풍기 SIF-D700SJ (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li></ul> |
| 128.0 | <ul><li>'삼성전자 AKG N9 HYBRID '</li><li>'브리츠 BT4000 ANC '</li><li>'Apple 에어팟 맥스 '</li></ul> |
| 145.0 | <ul><li>'2.5인치 HDD 하드 500GB 데스크탑 노트북 하드디스크 500기가 (#M)디지털/가전>저장장치>HDD GFK > naver_plus_traverse > Naverstore > PC/주변기기 > 저장장치 > HDD'</li><li>'유니콘 USB3.1 유무선 HDD케이스 HDD외장하드케이스 노트북하드케이스 외장하드케이스 슬라이드 3.5인치 (#M)저장장치>외장HDD>500G~1TB미만 GFK > traverse > 11st > 가전/디지털 > 저장장치 > 외장HDD > 500G~1TB미만'</li><li>'WD Ultrastar HC560 20TB 1PACK SATA3 총판점 무상3년 보증 (#M)디지털/가전>저장장치>HDD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > HDD'</li></ul> |
| 217.0 | <ul><li>'썬포토 슬릭 삼각대 SLIK GX-S 7500 스마트폰 카메라 겸용 삼각대 (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 삼각대/헤드'</li><li>'[공식인증]인스타360 플로팅 핸드그립 (#M)SSG.COM>카메라/캠코더>삼각대/케이스>삼각대/헤드/플레이트 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 삼각대/케이스 > 삼각대/헤드/플레이트'</li><li>'고프로 히어로 쇼티 삼각대 셀카봉 미니 익스텐션폴 (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 삼각대/헤드'</li></ul> |
| 114.0 | <ul><li>'did모니터 광고용모니터 32인치 전자메뉴판 디지털 '</li><li>'삼성 43인치 4K UHD 광고 DID 모니터 디지털 사이니지 LH43QETELGCXKR '</li><li>'삼성 디지털 사이니지 55인치 LH55QBCEBGCXKR 광고 모니터 DID (#M)디지털/가전>영상가전>TV>LEDTV GFK > Naverstore > 가전 > TV > 화면크기별 > 50인치대'</li></ul> |
| 146.0 | <ul><li>'아이피타임 개인용 나스 NAS 서버 2베이 NAS 2dual '</li><li>'EFM네트웍스 아이피타임 NAS2 Dual '</li><li>'개인서버 가정용NAS J1900 타오나스 가정용 헤놀로지 서버 '</li></ul> |
| 193.0 | <ul><li>'[25년형 NEW] 한경희 건강식 마스터 데이필 두유 죽제조기 HFM-7000 '</li><li>'신일 두유제조기 1L 대용량 가정용 콩물 죽 메이커 만드는기계 '</li><li>'오쿠 OCC-BM1300 '</li></ul> |
| 109.0 | <ul><li>'[기업용] 터보백신 윈도우 서버 1년(통합 보안 악성코드 바이러스 검사/치료) '</li><li>'V3 365 클리닉 '</li><li>'[즉시발송] 카스퍼스키 플러스 1PC 신규형 카스퍼스키 플러스 1년 사용권 (#M)디지털/가전>소프트웨어>보안/백신 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 132.0 | <ul><li>'3종세트 눈썹정리 (숱가위+눈썹칼+트위저+가죽케이스) 퍼플 (#M)이미용가전>눈썹정리기>눈썹정리기 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 눈썹정리기'</li><li>'[Y존,겨드랑이]쉬크 인튜이션 5중날 제모기(핸들1개+날2입)+특별 (쉬크 눈썹칼 프리미엄 4입) 바디트리머 1개+눈썹칼 4입 (#M)디지털/가전>이미용가전>제모기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 제모기/이발기'</li><li>'눈썹고데기 눈썹올리기 마스카라 열 뷰러 진케어 아이컬 아이컬(마스카라형) 핑크 (#M)홈>디지털/가전>이미용가전>눈썹정리기 Naverstore > 가전 > 이미용가전 > 눈썹관리기 > 속눈썹고데기'</li></ul> |
| 134.0 | <ul><li>'필립스 S7000 S5000 교체용 헤드 면도날 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'필립스 면도기 무선 클렌징 팟 세척카트리지 6개입/면도기세정액 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'필립스 RQ11 교체용 전기면도기날망 면도기날망 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li></ul> |
| 198.0 | <ul><li>'믹스커피 자판기 커피 미니 기계 머신 식당 기계 업소용 상품 '</li><li>'VEN502 (기기+재료포함) 동구전자 믹스커피자판기 미니자판기 커피머신 전국설치 '</li><li>'두산로보틱스 무인카페 바리스타로봇 닥터프레소 '</li></ul> |
| 73.0 | <ul><li>'천장형 TV브라켓 천정형 티비거치대 모니터브라켓 벽걸이브라켓 cml6 '</li><li>'이젤형 티비 거치대 191cm 호환 TV 스탠드 거치대 크롬 크롬스탠드(1/13 입고) (#M)디지털/가전>영상가전>영상가전액세서리>스탠드 GFK > traverse > Naverstore > 가전 > TV > TV 액세서리 > 스탠드/브라켓'</li><li>'24년식 삼탠바이미 호환 사운드바거치대 무빙스탠드 기둥지름 50mm이하 (#M)디지털/가전>영상가전>영상가전액세서리>브라켓 GFK > naver_plus_traverse_extension > Naverstore > 가전 > TV > 스탠드/거치대'</li></ul> |
| 176.0 | <ul><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 320x490 엘앤피 파세코 에코 웰텍 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 322x382 린나이 ROR-F30 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 325x490 엘앤피 파세코 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li></ul> |
| 175.0 | <ul><li>'리브레 업소용식기세척기sk매직호환 CDW-R152E 세제 2개월분포함 식당 영업용 식세기 '</li><li>'아트원 업소용 식기세척기 도어타입 온수용 카페 식당 영업용 대용량 무료배송 '</li><li>'제스트 업소용식기세척기 온수형 영업용 식당용 교회 회사 구내식당 식기세척기 전국 무료배송 '</li></ul> |
| 55.0 | <ul><li>'에그무제한 포켓파이 LG 신규 기기 대여 1개월 (LTE 데이터 2배 제공) 신규 기기 대여_1개월 (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li><li>'에그 무제한 20GB KT LTE 데이터 신규 기기대여 도마우스 1개월 기존 기기 연장_도마우스 20GB_1개월 (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li><li>'에그 무제한 20GB KT LTE 데이터 신규 기기대여 도마우스 1개월 기존 기기 연장_하트여왕 MAX_1개월 (10%+ 할인) (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li></ul> |
| 62.0 | <ul><li>'유니콘 안드로이드셋탑박스 UHD 4K 60Hz 디빅스플레이어 DV-X70 '</li><li>'유니콘 AV-M7 2세대 디빅스플레이어 UHD 4K지원 미디어플레이어 (#M)디지털/가전>멀티미디어장비>Divx플레이어 Naverstore > 가전 > 영상가전 > 플레이어 > Dvix'</li><li>'서진네트웍스 유니콘 AV-M4 AV-M4본체 (#M)디지털/가전>멀티미디어장비>Divx플레이어 Naverstore > 가전 > 영상가전 > 플레이어 > Dvix'</li></ul> |
| 66.0 | <ul><li>'엠비에프 MBF-USB71C 사운드카드 '</li><li>'리버네트워크 넥시 NX-U20STC USB 사운드카드 (NX614) '</li><li>'[MBF] USB Virtual7.1 Channel 사운드카드 [MBF-USB71C] '</li></ul> |
| 28.0 | <ul><li>'바른산소 고체산소 가정용 사무실 휴대용 독서실 산소발생기 '</li><li>'클린숨 가정용 산소발생기 휴대용 산소생성기 독서실 고체 하루 산소 '</li><li>'세이버 오투나라 KSO-1205H 가정용 상업용 업소용 산소발생기 '</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.9082 |
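The accuracy above was computed on a held-out evaluation split. As a rough sketch of how such a number can be reproduced (assuming `scikit-learn` is installed; `eval_texts` and `eval_labels` below are placeholders for your own labeled evaluation data, not files shipped with this model):
```python
from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")

# Placeholder evaluation data: product titles and their numeric category labels.
eval_texts = ["placeholder product title A", "placeholder product title B"]
eval_labels = [0.0, 1.0]

# Predict a label for every example and compare against the ground truth.
preds = [float(p) for p in model.predict(eval_texts)]
print("accuracy:", accuracy_score(eval_labels, preds))
```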
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference:
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")
# Run inference
preds = model("해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이 (#M)디지털/가전>주방가전>믹서기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 믹서기")
```
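The call above returns the predicted numeric label, one of the 232 category ids listed in the label tables of this card. For batch inference, or to inspect class probabilities, the `predict` and `predict_proba` methods can be used; the snippet below is a sketch assuming the SetFit 1.x API, reusing the `model` loaded above and example titles taken from the label table.
```python
# Batch prediction over several product titles at once.
titles = [
    "해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이 (#M)디지털/가전>주방가전>믹서기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 믹서기",
    "샤오미 이북리더기 E북리더기 전자책리더기 mi reader 미리더",
]
labels = model.predict(titles)        # one numeric label per title
probas = model.predict_proba(titles)  # per-class probability matrix

for title, label in zip(titles, labels):
    print(float(label), title)
```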
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 5 | 21.2994 | 91 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 16 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
| 6.0 | 50 |
| 7.0 | 50 |
| 8.0 | 50 |
| 9.0 | 50 |
| 10.0 | 50 |
| 11.0 | 50 |
| 12.0 | 50 |
| 13.0 | 50 |
| 14.0 | 50 |
| 15.0 | 50 |
| 16.0 | 50 |
| 17.0 | 50 |
| 18.0 | 50 |
| 19.0 | 50 |
| 20.0 | 50 |
| 21.0 | 50 |
| 22.0 | 14 |
| 23.0 | 50 |
| 24.0 | 10 |
| 25.0 | 50 |
| 26.0 | 50 |
| 27.0 | 50 |
| 28.0 | 14 |
| 29.0 | 50 |
| 30.0 | 12 |
| 31.0 | 45 |
| 32.0 | 14 |
| 33.0 | 50 |
| 34.0 | 42 |
| 35.0 | 41 |
| 36.0 | 50 |
| 37.0 | 50 |
| 38.0 | 50 |
| 39.0 | 50 |
| 40.0 | 50 |
| 41.0 | 50 |
| 42.0 | 50 |
| 43.0 | 50 |
| 44.0 | 50 |
| 45.0 | 50 |
| 46.0 | 39 |
| 47.0 | 12 |
| 48.0 | 50 |
| 49.0 | 50 |
| 50.0 | 11 |
| 51.0 | 12 |
| 52.0 | 18 |
| 53.0 | 50 |
| 54.0 | 11 |
| 55.0 | 17 |
| 56.0 | 50 |
| 57.0 | 50 |
| 58.0 | 3 |
| 59.0 | 35 |
| 60.0 | 50 |
| 61.0 | 15 |
| 62.0 | 16 |
| 63.0 | 50 |
| 64.0 | 50 |
| 65.0 | 50 |
| 66.0 | 11 |
| 67.0 | 13 |
| 68.0 | 50 |
| 69.0 | 13 |
| 70.0 | 50 |
| 71.0 | 40 |
| 72.0 | 50 |
| 73.0 | 19 |
| 74.0 | 50 |
| 75.0 | 50 |
| 76.0 | 50 |
| 77.0 | 41 |
| 78.0 | 50 |
| 79.0 | 42 |
| 80.0 | 50 |
| 81.0 | 50 |
| 82.0 | 14 |
| 83.0 | 50 |
| 84.0 | 50 |
| 85.0 | 50 |
| 86.0 | 50 |
| 87.0 | 50 |
| 88.0 | 50 |
| 89.0 | 16 |
| 90.0 | 50 |
| 91.0 | 38 |
| 92.0 | 38 |
| 93.0 | 18 |
| 94.0 | 19 |
| 95.0 | 33 |
| 96.0 | 50 |
| 97.0 | 50 |
| 98.0 | 25 |
| 99.0 | 50 |
| 100.0 | 39 |
| 101.0 | 11 |
| 102.0 | 50 |
| 103.0 | 23 |
| 104.0 | 18 |
| 105.0 | 50 |
| 106.0 | 41 |
| 107.0 | 15 |
| 108.0 | 50 |
| 109.0 | 18 |
| 110.0 | 50 |
| 111.0 | 50 |
| 112.0 | 50 |
| 113.0 | 50 |
| 114.0 | 12 |
| 115.0 | 13 |
| 116.0 | 15 |
| 117.0 | 15 |
| 118.0 | 12 |
| 119.0 | 18 |
| 120.0 | 22 |
| 121.0 | 21 |
| 122.0 | 50 |
| 123.0 | 50 |
| 124.0 | 17 |
| 125.0 | 12 |
| 126.0 | 17 |
| 127.0 | 12 |
| 128.0 | 11 |
| 129.0 | 18 |
| 130.0 | 50 |
| 131.0 | 26 |
| 132.0 | 15 |
| 133.0 | 50 |
| 134.0 | 14 |
| 135.0 | 29 |
| 136.0 | 49 |
| 137.0 | 50 |
| 138.0 | 50 |
| 139.0 | 50 |
| 140.0 | 50 |
| 141.0 | 35 |
| 142.0 | 50 |
| 143.0 | 50 |
| 144.0 | 17 |
| 145.0 | 10 |
| 146.0 | 12 |
| 147.0 | 14 |
| 148.0 | 50 |
| 149.0 | 33 |
| 150.0 | 18 |
| 151.0 | 50 |
| 152.0 | 20 |
| 153.0 | 50 |
| 154.0 | 50 |
| 155.0 | 50 |
| 156.0 | 14 |
| 157.0 | 50 |
| 158.0 | 50 |
| 159.0 | 50 |
| 160.0 | 50 |
| 161.0 | 41 |
| 162.0 | 50 |
| 163.0 | 50 |
| 164.0 | 26 |
| 165.0 | 20 |
| 166.0 | 13 |
| 167.0 | 50 |
| 168.0 | 50 |
| 169.0 | 50 |
| 170.0 | 16 |
| 171.0 | 50 |
| 172.0 | 50 |
| 173.0 | 11 |
| 174.0 | 11 |
| 175.0 | 18 |
| 176.0 | 10 |
| 177.0 | 50 |
| 178.0 | 50 |
| 179.0 | 50 |
| 180.0 | 50 |
| 181.0 | 50 |
| 182.0 | 50 |
| 183.0 | 50 |
| 184.0 | 50 |
| 185.0 | 50 |
| 186.0 | 50 |
| 187.0 | 43 |
| 188.0 | 50 |
| 189.0 | 50 |
| 190.0 | 50 |
| 191.0 | 50 |
| 192.0 | 24 |
| 193.0 | 13 |
| 194.0 | 50 |
| 195.0 | 50 |
| 196.0 | 50 |
| 197.0 | 50 |
| 198.0 | 14 |
| 199.0 | 33 |
| 200.0 | 50 |
| 201.0 | 50 |
| 202.0 | 50 |
| 203.0 | 50 |
| 204.0 | 50 |
| 205.0 | 50 |
| 206.0 | 16 |
| 207.0 | 50 |
| 208.0 | 45 |
| 209.0 | 50 |
| 210.0 | 50 |
| 211.0 | 50 |
| 212.0 | 22 |
| 213.0 | 18 |
| 214.0 | 15 |
| 215.0 | 18 |
| 216.0 | 27 |
| 217.0 | 10 |
| 218.0 | 12 |
| 219.0 | 15 |
| 220.0 | 10 |
| 221.0 | 14 |
| 222.0 | 14 |
| 223.0 | 50 |
| 224.0 | 13 |
| 225.0 | 48 |
| 226.0 | 18 |
| 227.0 | 50 |
| 228.0 | 11 |
| 229.0 | 16 |
| 230.0 | 50 |
| 231.0 | 22 |
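The word-count and per-label statistics above are descriptive summaries of the training split. A minimal sketch of how such figures can be derived (assuming whitespace tokenization; `train_texts` and `train_labels` are placeholders, since the raw training data is not bundled with this card):
```python
from collections import Counter
from statistics import median

# Placeholder training split: product titles and their numeric labels.
train_texts = ["placeholder product title A", "placeholder product title B"]
train_labels = [0.0, 1.0]

# Word counts per example (split on whitespace).
word_counts = [len(text.split()) for text in train_texts]
print("min/median/max word count:", min(word_counts), median(word_counts), max(word_counts))

# Training samples per label.
for label, count in sorted(Counter(train_labels).items()):
    print(label, count)
```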
### Training Hyperparameters
- batch_size: (64, 64)
- num_epochs: (30, 30)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 100
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
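These values map one-to-one onto the fields of SetFit's `TrainingArguments`. The snippet below is a rough sketch of a comparable training configuration, assuming the SetFit 1.x `Trainer`/`TrainingArguments` API; the base Sentence Transformer checkpoint and the training dataset are placeholders, not the exact ones used for this model. `distance_metric` (cosine distance) and `margin` (0.25) are left out because they already match the library defaults.
```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder training data with the column names SetFit expects.
train_dataset = Dataset.from_dict({
    "text": ["placeholder product title A", "placeholder product title B"],
    "label": [0.0, 1.0],
})

# Mirror the hyperparameters listed above.
args = TrainingArguments(
    batch_size=(64, 64),
    num_epochs=(30, 30),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

# Replace the placeholder with the base Sentence Transformer named at the top of this card.
model = SetFitModel.from_pretrained("base-sentence-transformer-checkpoint")
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```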
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:------:|:-------------:|:---------------:|
| 0.0001 | 1 | 0.4775 | - |
| 0.0037 | 50 | 0.4398 | - |
| 0.0075 | 100 | 0.4346 | - |
| 0.0112 | 150 | 0.4312 | - |
| 0.0149 | 200 | 0.4414 | - |
| 0.0187 | 250 | 0.4317 | - |
| 0.0224 | 300 | 0.4304 | - |
| 0.0261 | 350 | 0.4107 | - |
| 0.0299 | 400 | 0.3971 | - |
| 0.0336 | 450 | 0.3888 | - |
| 0.0373 | 500 | 0.3775 | - |
| 0.0411 | 550 | 0.3672 | - |
| 0.0448 | 600 | 0.3485 | - |
| 0.0485 | 650 | 0.311 | - |
| 0.0523 | 700 | 0.2665 | - |
| 0.0560 | 750 | 0.2369 | - |
| 0.0597 | 800 | 0.22 | - |
| 0.0635 | 850 | 0.1967 | - |
| 0.0672 | 900 | 0.1982 | - |
| 0.0709 | 950 | 0.183 | - |
| 0.0747 | 1000 | 0.1649 | - |
| 0.0784 | 1050 | 0.1569 | - |
| 0.0821 | 1100 | 0.1353 | - |
| 0.0859 | 1150 | 0.1388 | - |
| 0.0896 | 1200 | 0.1259 | - |
| 0.0933 | 1250 | 0.1216 | - |
| 0.0971 | 1300 | 0.1101 | - |
| 0.1008 | 1350 | 0.1026 | - |
| 0.1045 | 1400 | 0.0987 | - |
| 0.1083 | 1450 | 0.0936 | - |
| 0.1120 | 1500 | 0.0877 | - |
| 0.1157 | 1550 | 0.0835 | - |
| 0.1195 | 1600 | 0.0818 | - |
| 0.1232 | 1650 | 0.0762 | - |
| 0.1270 | 1700 | 0.0789 | - |
| 0.1307 | 1750 | 0.074 | - |
| 0.1344 | 1800 | 0.0736 | - |
| 0.1382 | 1850 | 0.0712 | - |
| 0.1419 | 1900 | 0.0706 | - |
| 0.1456 | 1950 | 0.0685 | - |
| 0.1494 | 2000 | 0.0647 | - |
| 0.1531 | 2050 | 0.0667 | - |
| 0.1568 | 2100 | 0.0604 | - |
| 0.1606 | 2150 | 0.066 | - |
| 0.1643 | 2200 | 0.0588 | - |
| 0.1680 | 2250 | 0.0616 | - |
| 0.1718 | 2300 | 0.0579 | - |
| 0.1755 | 2350 | 0.057 | - |
| 0.1792 | 2400 | 0.0557 | - |
| 0.1830 | 2450 | 0.057 | - |
| 0.1867 | 2500 | 0.0523 | - |
| 0.1904 | 2550 | 0.0569 | - |
| 0.1942 | 2600 | 0.055 | - |
| 0.1979 | 2650 | 0.0533 | - |
| 0.2016 | 2700 | 0.0509 | - |
| 0.2054 | 2750 | 0.0489 | - |
| 0.2091 | 2800 | 0.0498 | - |
| 0.2128 | 2850 | 0.0508 | - |
| 0.2166 | 2900 | 0.049 | - |
| 0.2203 | 2950 | 0.0492 | - |
| 0.2240 | 3000 | 0.0475 | - |
| 0.2278 | 3050 | 0.0467 | - |
| 0.2315 | 3100 | 0.0469 | - |
| 0.2352 | 3150 | 0.0475 | - |
| 0.2390 | 3200 | 0.0448 | - |
| 0.2427 | 3250 | 0.0441 | - |
| 0.2464 | 3300 | 0.0438 | - |
| 0.2502 | 3350 | 0.0435 | - |
| 0.2539 | 3400 | 0.0447 | - |
| 0.2576 | 3450 | 0.0435 | - |
| 0.2614 | 3500 | 0.0433 | - |
| 0.2651 | 3550 | 0.0441 | - |
| 0.2688 | 3600 | 0.0395 | - |
| 0.2726 | 3650 | 0.0425 | - |
| 0.2763 | 3700 | 0.0404 | - |
| 0.2800 | 3750 | 0.0357 | - |
| 0.2838 | 3800 | 0.0378 | - |
| 0.2875 | 3850 | 0.038 | - |
| 0.2912 | 3900 | 0.037 | - |
| 0.2950 | 3950 | 0.038 | - |
| 0.2987 | 4000 | 0.0374 | - |
| 0.3024 | 4050 | 0.0356 | - |
| 0.3062 | 4100 | 0.0373 | - |
| 0.3099 | 4150 | 0.0357 | - |
| 0.3136 | 4200 | 0.0342 | - |
| 0.3174 | 4250 | 0.0349 | - |
| 0.3211 | 4300 | 0.0332 | - |
| 0.3248 | 4350 | 0.0325 | - |
| 0.3286 | 4400 | 0.0342 | - |
| 0.3323 | 4450 | 0.0325 | - |
| 0.3360 | 4500 | 0.0333 | - |
| 0.3398 | 4550 | 0.0337 | - |
| 0.3435 | 4600 | 0.0293 | - |
| 0.3472 | 4650 | 0.0316 | - |
| 0.3510 | 4700 | 0.03 | - |
| 0.3547 | 4750 | 0.03 | - |
| 0.3584 | 4800 | 0.0319 | - |
| 0.3622 | 4850 | 0.0317 | - |
| 0.3659 | 4900 | 0.0317 | - |
| 0.3697 | 4950 | 0.0309 | - |
| 0.3734 | 5000 | 0.03 | - |
| 0.3771 | 5050 | 0.0279 | - |
| 0.3809 | 5100 | 0.0258 | - |
| 0.3846 | 5150 | 0.0292 | - |
| 0.3883 | 5200 | 0.0278 | - |
| 0.3921 | 5250 | 0.028 | - |
| 0.3958 | 5300 | 0.0269 | - |
| 0.3995 | 5350 | 0.0282 | - |
| 0.4033 | 5400 | 0.0246 | - |
| 0.4070 | 5450 | 0.027 | - |
| 0.4107 | 5500 | 0.0284 | - |
| 0.4145 | 5550 | 0.0277 | - |
| 0.4182 | 5600 | 0.0252 | - |
| 0.4219 | 5650 | 0.026 | - |
| 0.4257 | 5700 | 0.0256 | - |
| 0.4294 | 5750 | 0.0239 | - |
| 0.4331 | 5800 | 0.0236 | - |
| 0.4369 | 5850 | 0.0249 | - |
| 0.4406 | 5900 | 0.0239 | - |
| 0.4443 | 5950 | 0.0224 | - |
| 0.4481 | 6000 | 0.0233 | - |
| 0.4518 | 6050 | 0.024 | - |
| 0.4555 | 6100 | 0.023 | - |
| 0.4593 | 6150 | 0.0234 | - |
| 0.4630 | 6200 | 0.0202 | - |
| 0.4667 | 6250 | 0.0209 | - |
| 0.4705 | 6300 | 0.023 | - |
| 0.4742 | 6350 | 0.0212 | - |
| 0.4779 | 6400 | 0.022 | - |
| 0.4817 | 6450 | 0.0224 | - |
| 0.4854 | 6500 | 0.021 | - |
| 0.4891 | 6550 | 0.0225 | - |
| 0.4929 | 6600 | 0.0226 | - |
| 0.4966 | 6650 | 0.0211 | - |
| 0.5003 | 6700 | 0.021 | - |
| 0.5041 | 6750 | 0.0192 | - |
| 0.5078 | 6800 | 0.0204 | - |
| 0.5115 | 6850 | 0.0201 | - |
| 0.5153 | 6900 | 0.0194 | - |
| 0.5190 | 6950 | 0.0198 | - |
| 0.5227 | 7000 | 0.0182 | - |
| 0.5265 | 7050 | 0.0184 | - |
| 0.5302 | 7100 | 0.0175 | - |
| 0.5339 | 7150 | 0.0192 | - |
| 0.5377 | 7200 | 0.0172 | - |
| 0.5414 | 7250 | 0.0178 | - |
| 0.5451 | 7300 | 0.0174 | - |
| 0.5489 | 7350 | 0.0189 | - |
| 0.5526 | 7400 | 0.0176 | - |
| 0.5563 | 7450 | 0.0195 | - |
| 0.5601 | 7500 | 0.017 | - |
| 0.5638 | 7550 | 0.0179 | - |
| 0.5675 | 7600 | 0.0149 | - |
| 0.5713 | 7650 | 0.0156 | - |
| 0.5750 | 7700 | 0.0166 | - |
| 0.5787 | 7750 | 0.0156 | - |
| 0.5825 | 7800 | 0.0177 | - |
| 0.5862 | 7850 | 0.0179 | - |
| 0.5899 | 7900 | 0.0143 | - |
| 0.5937 | 7950 | 0.015 | - |
| 0.5974 | 8000 | 0.0153 | - |
| 0.6012 | 8050 | 0.0158 | - |
| 0.6049 | 8100 | 0.0157 | - |
| 0.6086 | 8150 | 0.0143 | - |
| 0.6124 | 8200 | 0.0162 | - |
| 0.6161 | 8250 | 0.0153 | - |
| 0.6198 | 8300 | 0.0155 | - |
| 0.6236 | 8350 | 0.0145 | - |
| 0.6273 | 8400 | 0.0133 | - |
| 0.6310 | 8450 | 0.0145 | - |
| 0.6348 | 8500 | 0.0138 | - |
| 0.6385 | 8550 | 0.0142 | - |
| 0.6422 | 8600 | 0.0144 | - |
| 0.6460 | 8650 | 0.014 | - |
| 0.6497 | 8700 | 0.014 | - |
| 0.6534 | 8750 | 0.0149 | - |
| 0.6572 | 8800 | 0.012 | - |
| 0.6609 | 8850 | 0.0129 | - |
| 0.6646 | 8900 | 0.0119 | - |
| 0.6684 | 8950 | 0.0128 | - |
| 0.6721 | 9000 | 0.0134 | - |
| 0.6758 | 9050 | 0.0129 | - |
| 0.6796 | 9100 | 0.0124 | - |
| 0.6833 | 9150 | 0.0147 | - |
| 0.6870 | 9200 | 0.0127 | - |
| 0.6908 | 9250 | 0.0132 | - |
| 0.6945 | 9300 | 0.0118 | - |
| 0.6982 | 9350 | 0.0144 | - |
| 0.7020 | 9400 | 0.0117 | - |
| 0.7057 | 9450 | 0.01 | - |
| 0.7094 | 9500 | 0.011 | - |
| 0.7132 | 9550 | 0.0111 | - |
| 0.7169 | 9600 | 0.0122 | - |
| 0.7206 | 9650 | 0.0092 | - |
| 0.7244 | 9700 | 0.011 | - |
| 0.7281 | 9750 | 0.0109 | - |
| 0.7318 | 9800 | 0.0114 | - |
| 0.7356 | 9850 | 0.0101 | - |
| 0.7393 | 9900 | 0.0104 | - |
| 0.7430 | 9950 | 0.0127 | - |
| 0.7468 | 10000 | 0.0091 | - |
| 0.7505 | 10050 | 0.0092 | - |
| 0.7542 | 10100 | 0.0109 | - |
| 0.7580 | 10150 | 0.0113 | - |
| 0.7617 | 10200 | 0.0101 | - |
| 0.7654 | 10250 | 0.0096 | - |
| 0.7692 | 10300 | 0.0104 | - |
| 0.7729 | 10350 | 0.0107 | - |
| 0.7766 | 10400 | 0.0113 | - |
| 0.7804 | 10450 | 0.0102 | - |
| 0.7841 | 10500 | 0.0103 | - |
| 0.7878 | 10550 | 0.0092 | - |
| 0.7916 | 10600 | 0.008 | - |
| 0.7953 | 10650 | 0.0102 | - |
| 0.7990 | 10700 | 0.0093 | - |
| 0.8028 | 10750 | 0.0085 | - |
| 0.8065 | 10800 | 0.009 | - |
| 0.8102 | 10850 | 0.0072 | - |
| 0.8140 | 10900 | 0.0078 | - |
| 0.8177 | 10950 | 0.011 | - |
| 0.8214 | 11000 | 0.0087 | - |
| 0.8252 | 11050 | 0.0098 | - |
| 0.8289 | 11100 | 0.0087 | - |
| 0.8326 | 11150 | 0.0094 | - |
| 0.8364 | 11200 | 0.0077 | - |
| 0.8401 | 11250 | 0.0084 | - |
| 0.8439 | 11300 | 0.0082 | - |
| 0.8476 | 11350 | 0.0087 | - |
| 0.8513 | 11400 | 0.0084 | - |
| 0.8551 | 11450 | 0.0106 | - |
| 0.8588 | 11500 | 0.0095 | - |
| 0.8625 | 11550 | 0.0086 | - |
| 0.8663 | 11600 | 0.0077 | - |
| 0.8700 | 11650 | 0.0071 | - |
| 0.8737 | 11700 | 0.0077 | - |
| 0.8775 | 11750 | 0.008 | - |
| 0.8812 | 11800 | 0.0083 | - |
| 0.8849 | 11850 | 0.0082 | - |
| 0.8887 | 11900 | 0.0081 | - |
| 0.8924 | 11950 | 0.0074 | - |
| 0.8961 | 12000 | 0.0086 | - |
| 0.8999 | 12050 | 0.0082 | - |
| 0.9036 | 12100 | 0.0086 | - |
| 0.9073 | 12150 | 0.0083 | - |
| 0.9111 | 12200 | 0.008 | - |
| 0.9148 | 12250 | 0.0079 | - |
| 0.9185 | 12300 | 0.0082 | - |
| 0.9223 | 12350 | 0.0066 | - |
| 0.9260 | 12400 | 0.0064 | - |
| 0.9297 | 12450 | 0.0075 | - |
| 0.9335 | 12500 | 0.0088 | - |
| 0.9372 | 12550 | 0.0075 | - |
| 0.9409 | 12600 | 0.0074 | - |
| 0.9447 | 12650 | 0.008 | - |
| 0.9484 | 12700 | 0.0067 | - |
| 0.9521 | 12750 | 0.0074 | - |
| 0.9559 | 12800 | 0.0075 | - |
| 0.9596 | 12850 | 0.0059 | - |
| 0.9633 | 12900 | 0.0091 | - |
| 0.9671 | 12950 | 0.008 | - |
| 0.9708 | 13000 | 0.0093 | - |
| 0.9745 | 13050 | 0.0067 | - |
| 0.9783 | 13100 | 0.0084 | - |
| 0.9820 | 13150 | 0.0066 | - |
| 0.9857 | 13200 | 0.0069 | - |
| 0.9895 | 13250 | 0.0063 | - |
| 0.9932 | 13300 | 0.007 | - |
| 0.9969 | 13350 | 0.0074 | - |
| 1.0007 | 13400 | 0.0076 | - |
| 1.0044 | 13450 | 0.0067 | - |
| 1.0081 | 13500 | 0.0062 | - |
| 1.0119 | 13550 | 0.0083 | - |
| 1.0156 | 13600 | 0.0058 | - |
| 1.0193 | 13650 | 0.0047 | - |
| 1.0231 | 13700 | 0.007 | - |
| 1.0268 | 13750 | 0.0082 | - |
| 1.0305 | 13800 | 0.0069 | - |
| 1.0343 | 13850 | 0.0055 | - |
| 1.0380 | 13900 | 0.0066 | - |
| 1.0417 | 13950 | 0.0069 | - |
| 1.0455 | 14000 | 0.0067 | - |
| 1.0492 | 14050 | 0.0061 | - |
| 1.0529 | 14100 | 0.0063 | - |
| 1.0567 | 14150 | 0.0053 | - |
| 1.0604 | 14200 | 0.0065 | - |
| 1.0641 | 14250 | 0.0059 | - |
| 1.0679 | 14300 | 0.0078 | - |
| 1.0716 | 14350 | 0.0057 | - |
| 1.0753 | 14400 | 0.0062 | - |
| 1.0791 | 14450 | 0.0061 | - |
| 1.0828 | 14500 | 0.0063 | - |
| 1.0866 | 14550 | 0.0067 | - |
| 1.0903 | 14600 | 0.0062 | - |
| 1.0940 | 14650 | 0.0065 | - |
| 1.0978 | 14700 | 0.0048 | - |
| 1.1015 | 14750 | 0.0049 | - |
| 1.1052 | 14800 | 0.0059 | - |
| 1.1090 | 14850 | 0.0062 | - |
| 1.1127 | 14900 | 0.005 | - |
| 1.1164 | 14950 | 0.0059 | - |
| 1.1202 | 15000 | 0.0049 | - |
| 1.1239 | 15050 | 0.0048 | - |
| 1.1276 | 15100 | 0.0058 | - |
| 1.1314 | 15150 | 0.0059 | - |
| 1.1351 | 15200 | 0.0069 | - |
| 1.1388 | 15250 | 0.0071 | - |
| 1.1426 | 15300 | 0.0063 | - |
| 1.1463 | 15350 | 0.0049 | - |
| 1.1500 | 15400 | 0.0048 | - |
| 1.1538 | 15450 | 0.0057 | - |
| 1.1575 | 15500 | 0.006 | - |
| 1.1612 | 15550 | 0.0049 | - |
| 1.1650 | 15600 | 0.0051 | - |
| 1.1687 | 15650 | 0.0057 | - |
| 1.1724 | 15700 | 0.0057 | - |
| 1.1762 | 15750 | 0.0054 | - |
| 1.1799 | 15800 | 0.0054 | - |
| 1.1836 | 15850 | 0.0051 | - |
| 1.1874 | 15900 | 0.0051 | - |
| 1.1911 | 15950 | 0.005 | - |
| 1.1948 | 16000 | 0.0053 | - |
| 1.1986 | 16050 | 0.005 | - |
| 1.2023 | 16100 | 0.0055 | - |
| 1.2060 | 16150 | 0.0052 | - |
| 1.2098 | 16200 | 0.0063 | - |
| 1.2135 | 16250 | 0.0059 | - |
| 1.2172 | 16300 | 0.0058 | - |
| 1.2210 | 16350 | 0.0055 | - |
| 1.2247 | 16400 | 0.0051 | - |
| 1.2284 | 16450 | 0.0049 | - |
| 1.2322 | 16500 | 0.0049 | - |
| 1.2359 | 16550 | 0.0051 | - |
| 1.2396 | 16600 | 0.0048 | - |
| 1.2434 | 16650 | 0.0053 | - |
| 1.2471 | 16700 | 0.0054 | - |
| 1.2508 | 16750 | 0.0044 | - |
| 1.2546 | 16800 | 0.0054 | - |
| 1.2583 | 16850 | 0.0048 | - |
| 1.2620 | 16900 | 0.0061 | - |
| 1.2658 | 16950 | 0.0048 | - |
| 1.2695 | 17000 | 0.0039 | - |
| 1.2732 | 17050 | 0.0044 | - |
| 1.2770 | 17100 | 0.0065 | - |
| 1.2807 | 17150 | 0.0052 | - |
| 1.2844 | 17200 | 0.0045 | - |
| 1.2882 | 17250 | 0.005 | - |
| 1.2919 | 17300 | 0.0031 | - |
| 1.2956 | 17350 | 0.0041 | - |
| 1.2994 | 17400 | 0.0051 | - |
| 1.3031 | 17450 | 0.0049 | - |
| 1.3068 | 17500 | 0.006 | - |
| 1.3106 | 17550 | 0.0051 | - |
| 1.3143 | 17600 | 0.0044 | - |
| 1.3180 | 17650 | 0.0054 | - |
| 1.3218 | 17700 | 0.0054 | - |
| 1.3255 | 17750 | 0.0047 | - |
| 1.3293 | 17800 | 0.0046 | - |
| 1.3330 | 17850 | 0.004 | - |
| 1.3367 | 17900 | 0.0044 | - |
| 1.3405 | 17950 | 0.0047 | - |
| 1.3442 | 18000 | 0.0054 | - |
| 1.3479 | 18050 | 0.0041 | - |
| 1.3517 | 18100 | 0.0046 | - |
| 1.3554 | 18150 | 0.0059 | - |
| 1.3591 | 18200 | 0.005 | - |
| 1.3629 | 18250 | 0.0042 | - |
| 1.3666 | 18300 | 0.0047 | - |
| 1.3703 | 18350 | 0.0041 | - |
| 1.3741 | 18400 | 0.0048 | - |
| 1.3778 | 18450 | 0.0032 | - |
| 1.3815 | 18500 | 0.0044 | - |
| 1.3853 | 18550 | 0.0038 | - |
| 1.3890 | 18600 | 0.0033 | - |
| 1.3927 | 18650 | 0.0033 | - |
| 1.3965 | 18700 | 0.0053 | - |
| 1.4002 | 18750 | 0.0042 | - |
| 1.4039 | 18800 | 0.0036 | - |
| 1.4077 | 18850 | 0.0044 | - |
| 1.4114 | 18900 | 0.0044 | - |
| 1.4151 | 18950 | 0.0026 | - |
| 1.4189 | 19000 | 0.0042 | - |
| 1.4226 | 19050 | 0.0041 | - |
| 1.4263 | 19100 | 0.0034 | - |
| 1.4301 | 19150 | 0.0042 | - |
| 1.4338 | 19200 | 0.0049 | - |
| 1.4375 | 19250 | 0.0039 | - |
| 1.4413 | 19300 | 0.0036 | - |
| 1.4450 | 19350 | 0.005 | - |
| 1.4487 | 19400 | 0.0044 | - |
| 1.4525 | 19450 | 0.0058 | - |
| 1.4562 | 19500 | 0.0037 | - |
| 1.4599 | 19550 | 0.0043 | - |
| 1.4637 | 19600 | 0.0038 | - |
| 1.4674 | 19650 | 0.0032 | - |
| 1.4711 | 19700 | 0.0032 | - |
| 1.4749 | 19750 | 0.0052 | - |
| 1.4786 | 19800 | 0.0034 | - |
| 1.4823 | 19850 | 0.004 | - |
| 1.4861 | 19900 | 0.004 | - |
| 1.4898 | 19950 | 0.0049 | - |
| 1.4935 | 20000 | 0.0037 | - |
| 1.4973 | 20050 | 0.0038 | - |
| 1.5010 | 20100 | 0.0045 | - |
| 1.5047 | 20150 | 0.0043 | - |
| 1.5085 | 20200 | 0.0038 | - |
| 1.5122 | 20250 | 0.0028 | - |
| 1.5159 | 20300 | 0.0036 | - |
| 1.5197 | 20350 | 0.0035 | - |
| 1.5234 | 20400 | 0.0037 | - |
| 1.5271 | 20450 | 0.0044 | - |
| 1.5309 | 20500 | 0.0031 | - |
| 1.5346 | 20550 | 0.0038 | - |
| 1.5383 | 20600 | 0.0036 | - |
| 1.5421 | 20650 | 0.0038 | - |
| 1.5458 | 20700 | 0.0027 | - |
| 1.5495 | 20750 | 0.003 | - |
| 1.5533 | 20800 | 0.0026 | - |
| 1.5570 | 20850 | 0.0036 | - |
| 1.5607 | 20900 | 0.0038 | - |
| 1.5645 | 20950 | 0.0034 | - |
| 1.5682 | 21000 | 0.0036 | - |
| 1.5720 | 21050 | 0.0046 | - |
| 1.5757 | 21100 | 0.0039 | - |
| 1.5794 | 21150 | 0.0033 | - |
| 1.5832 | 21200 | 0.0028 | - |
| 1.5869 | 21250 | 0.0035 | - |
| 1.5906 | 21300 | 0.003 | - |
| 1.5944 | 21350 | 0.0034 | - |
| 1.5981 | 21400 | 0.0032 | - |
| 1.6018 | 21450 | 0.0031 | - |
| 1.6056 | 21500 | 0.0024 | - |
| 1.6093 | 21550 | 0.0031 | - |
| 1.6130 | 21600 | 0.0035 | - |
| 1.6168 | 21650 | 0.0038 | - |
| 1.6205 | 21700 | 0.0033 | - |
| 1.6242 | 21750 | 0.0038 | - |
| 1.6280 | 21800 | 0.0033 | - |
| 1.6317 | 21850 | 0.0047 | - |
| 1.6354 | 21900 | 0.0034 | - |
| 1.6392 | 21950 | 0.0046 | - |
| 1.6429 | 22000 | 0.0039 | - |
| 1.6466 | 22050 | 0.0035 | - |
| 1.6504 | 22100 | 0.003 | - |
| 1.6541 | 22150 | 0.0034 | - |
| 1.6578 | 22200 | 0.004 | - |
| 1.6616 | 22250 | 0.0015 | - |
| 1.6653 | 22300 | 0.0036 | - |
| 1.6690 | 22350 | 0.0023 | - |
| 1.6728 | 22400 | 0.0031 | - |
| 1.6765 | 22450 | 0.0032 | - |
| 1.6802 | 22500 | 0.0038 | - |
| 1.6840 | 22550 | 0.0035 | - |
| 1.6877 | 22600 | 0.0031 | - |
| 1.6914 | 22650 | 0.0036 | - |
| 1.6952 | 22700 | 0.0027 | - |
| 1.6989 | 22750 | 0.0027 | - |
| 1.7026 | 22800 | 0.0031 | - |
| 1.7064 | 22850 | 0.0042 | - |
| 1.7101 | 22900 | 0.0033 | - |
| 1.7138 | 22950 | 0.0029 | - |
| 1.7176 | 23000 | 0.0028 | - |
| 1.7213 | 23050 | 0.0018 | - |
| 1.7250 | 23100 | 0.0028 | - |
| 1.7288 | 23150 | 0.0032 | - |
| 1.7325 | 23200 | 0.0037 | - |
| 1.7362 | 23250 | 0.003 | - |
| 1.7400 | 23300 | 0.0039 | - |
| 1.7437 | 23350 | 0.0027 | - |
| 1.7474 | 23400 | 0.0032 | - |
| 1.7512 | 23450 | 0.0037 | - |
| 1.7549 | 23500 | 0.0022 | - |
| 1.7586 | 23550 | 0.0026 | - |
| 1.7624 | 23600 | 0.0036 | - |
| 1.7661 | 23650 | 0.0027 | - |
| 1.7698 | 23700 | 0.0026 | - |
| 1.7736 | 23750 | 0.003 | - |
| 1.7773 | 23800 | 0.0036 | - |
| 1.7810 | 23850 | 0.0027 | - |
| 1.7848 | 23900 | 0.0033 | - |
| 1.7885 | 23950 | 0.0034 | - |
| 1.7922 | 24000 | 0.0028 | - |
| 1.7960 | 24050 | 0.003 | - |
| 1.7997 | 24100 | 0.0028 | - |
| 1.8035 | 24150 | 0.0021 | - |
| 1.8072 | 24200 | 0.0027 | - |
| 1.8109 | 24250 | 0.0028 | - |
| 1.8147 | 24300 | 0.0029 | - |
| 1.8184 | 24350 | 0.002 | - |
| 1.8221 | 24400 | 0.0022 | - |
| 1.8259 | 24450 | 0.002 | - |
| 1.8296 | 24500 | 0.0025 | - |
| 1.8333 | 24550 | 0.0025 | - |
| 1.8371 | 24600 | 0.0025 | - |
| 1.8408 | 24650 | 0.0028 | - |
| 1.8445 | 24700 | 0.002 | - |
| 1.8483 | 24750 | 0.0029 | - |
| 1.8520 | 24800 | 0.0024 | - |
| 1.8557 | 24850 | 0.0023 | - |
| 1.8595 | 24900 | 0.0025 | - |
| 1.8632 | 24950 | 0.002 | - |
| 1.8669 | 25000 | 0.0031 | - |
| 1.8707 | 25050 | 0.0021 | - |
| 1.8744 | 25100 | 0.0025 | - |
| 1.8781 | 25150 | 0.0032 | - |
| 1.8819 | 25200 | 0.0041 | - |
| 1.8856 | 25250 | 0.0048 | - |
| 1.8893 | 25300 | 0.0023 | - |
| 1.8931 | 25350 | 0.0032 | - |
| 1.8968 | 25400 | 0.0026 | - |
| 1.9005 | 25450 | 0.0037 | - |
| 1.9043 | 25500 | 0.0019 | - |
| 1.9080 | 25550 | 0.0022 | - |
| 1.9117 | 25600 | 0.0025 | - |
| 1.9155 | 25650 | 0.0031 | - |
| 1.9192 | 25700 | 0.0018 | - |
| 1.9229 | 25750 | 0.002 | - |
| 1.9267 | 25800 | 0.0018 | - |
| 1.9304 | 25850 | 0.0025 | - |
| 1.9341 | 25900 | 0.0021 | - |
| 1.9379 | 25950 | 0.0019 | - |
| 1.9416 | 26000 | 0.0018 | - |
| 1.9453 | 26050 | 0.003 | - |
| 1.9491 | 26100 | 0.0021 | - |
| 1.9528 | 26150 | 0.0029 | - |
| 1.9565 | 26200 | 0.0031 | - |
| 1.9603 | 26250 | 0.0023 | - |
| 1.9640 | 26300 | 0.003 | - |
| 1.9677 | 26350 | 0.003 | - |
| 1.9715 | 26400 | 0.0021 | - |
| 1.9752 | 26450 | 0.0028 | - |
| 1.9789 | 26500 | 0.0027 | - |
| 1.9827 | 26550 | 0.0021 | - |
| 1.9864 | 26600 | 0.0016 | - |
| 1.9901 | 26650 | 0.0021 | - |
| 1.9939 | 26700 | 0.0021 | - |
| 1.9976 | 26750 | 0.0032 | - |
| 2.0013 | 26800 | 0.0022 | - |
| 2.0051 | 26850 | 0.0023 | - |
| 2.0088 | 26900 | 0.0025 | - |
| 2.0125 | 26950 | 0.0017 | - |
| 2.0163 | 27000 | 0.0015 | - |
| 2.0200 | 27050 | 0.0011 | - |
| 2.0237 | 27100 | 0.0016 | - |
| 2.0275 | 27150 | 0.0015 | - |
| 2.0312 | 27200 | 0.002 | - |
| 2.0349 | 27250 | 0.0024 | - |
| 2.0387 | 27300 | 0.003 | - |
| 2.0424 | 27350 | 0.0023 | - |
| 2.0462 | 27400 | 0.0013 | - |
| 2.0499 | 27450 | 0.0027 | - |
| 2.0536 | 27500 | 0.0048 | - |
| 2.0574 | 27550 | 0.0027 | - |
| 2.0611 | 27600 | 0.0027 | - |
| 2.0648 | 27650 | 0.0029 | - |
| 2.0686 | 27700 | 0.0019 | - |
| 2.0723 | 27750 | 0.0026 | - |
| 2.0760 | 27800 | 0.0029 | - |
| 2.0798 | 27850 | 0.0024 | - |
| 2.0835 | 27900 | 0.0034 | - |
| 2.0872 | 27950 | 0.0026 | - |
| 2.0910 | 28000 | 0.0024 | - |
| 2.0947 | 28050 | 0.0018 | - |
| 2.0984 | 28100 | 0.0021 | - |
| 2.1022 | 28150 | 0.0022 | - |
| 2.1059 | 28200 | 0.0023 | - |
| 2.1096 | 28250 | 0.0015 | - |
| 2.1134 | 28300 | 0.0027 | - |
| 2.1171 | 28350 | 0.0018 | - |
| 2.1208 | 28400 | 0.0008 | - |
| 2.1246 | 28450 | 0.0025 | - |
| 2.1283 | 28500 | 0.0027 | - |
| 2.1320 | 28550 | 0.0029 | - |
| 2.1358 | 28600 | 0.0022 | - |
| 2.1395 | 28650 | 0.0026 | - |
| 2.1432 | 28700 | 0.0038 | - |
| 2.1470 | 28750 | 0.0037 | - |
| 2.1507 | 28800 | 0.0024 | - |
| 2.1544 | 28850 | 0.0028 | - |
| 2.1582 | 28900 | 0.0028 | - |
| 2.1619 | 28950 | 0.0028 | - |
| 2.1656 | 29000 | 0.0023 | - |
| 2.1694 | 29050 | 0.0019 | - |
| 2.1731 | 29100 | 0.0024 | - |
| 2.1768 | 29150 | 0.0028 | - |
| 2.1806 | 29200 | 0.0026 | - |
| 2.1843 | 29250 | 0.0023 | - |
| 2.1880 | 29300 | 0.0015 | - |
| 2.1918 | 29350 | 0.0035 | - |
| 2.1955 | 29400 | 0.0028 | - |
| 2.1992 | 29450 | 0.0024 | - |
| 2.2030 | 29500 | 0.0015 | - |
| 2.2067 | 29550 | 0.0021 | - |
| 2.2104 | 29600 | 0.002 | - |
| 2.2142 | 29650 | 0.0019 | - |
| 2.2179 | 29700 | 0.002 | - |
| 2.2216 | 29750 | 0.0019 | - |
| 2.2254 | 29800 | 0.002 | - |
| 2.2291 | 29850 | 0.0019 | - |
| 2.2328 | 29900 | 0.002 | - |
| 2.2366 | 29950 | 0.0025 | - |
| 2.2403 | 30000 | 0.0026 | - |
| 2.2440 | 30050 | 0.0027 | - |
| 2.2478 | 30100 | 0.0022 | - |
| 2.2515 | 30150 | 0.0019 | - |
| 2.2552 | 30200 | 0.0025 | - |
| 2.2590 | 30250 | 0.0022 | - |
| 2.2627 | 30300 | 0.0018 | - |
| 2.2664 | 30350 | 0.0017 | - |
| 2.2702 | 30400 | 0.0015 | - |
| 2.2739 | 30450 | 0.0017 | - |
| 2.2776 | 30500 | 0.0016 | - |
| 2.2814 | 30550 | 0.0011 | - |
| 2.2851 | 30600 | 0.0012 | - |
| 2.2889 | 30650 | 0.0016 | - |
| 2.2926 | 30700 | 0.0019 | - |
| 2.2963 | 30750 | 0.0017 | - |
| 2.3001 | 30800 | 0.0026 | - |
| 2.3038 | 30850 | 0.0023 | - |
| 2.3075 | 30900 | 0.0021 | - |
| 2.3113 | 30950 | 0.0028 | - |
| 2.3150 | 31000 | 0.0011 | - |
| 2.3187 | 31050 | 0.0024 | - |
| 2.3225 | 31100 | 0.0026 | - |
| 2.3262 | 31150 | 0.0026 | - |
| 2.3299 | 31200 | 0.0021 | - |
| 2.3337 | 31250 | 0.0024 | - |
| 2.3374 | 31300 | 0.001 | - |
| 2.3411 | 31350 | 0.0021 | - |
| 2.3449 | 31400 | 0.0015 | - |
| 2.3486 | 31450 | 0.0017 | - |
| 2.3523 | 31500 | 0.0015 | - |
| 2.3561 | 31550 | 0.0005 | - |
| 2.3598 | 31600 | 0.0019 | - |
| 2.3635 | 31650 | 0.002 | - |
| 2.3673 | 31700 | 0.0022 | - |
| 2.3710 | 31750 | 0.0033 | - |
| 2.3747 | 31800 | 0.0016 | - |
| 2.3785 | 31850 | 0.0013 | - |
| 2.3822 | 31900 | 0.0022 | - |
| 2.3859 | 31950 | 0.0022 | - |
| 2.3897 | 32000 | 0.0039 | - |
| 2.3934 | 32050 | 0.0025 | - |
| 2.3971 | 32100 | 0.0035 | - |
| 2.4009 | 32150 | 0.0018 | - |
| 2.4046 | 32200 | 0.0019 | - |
| 2.4083 | 32250 | 0.0016 | - |
| 2.4121 | 32300 | 0.0022 | - |
| 2.4158 | 32350 | 0.0017 | - |
| 2.4195 | 32400 | 0.0027 | - |
| 2.4233 | 32450 | 0.0027 | - |
| 2.4270 | 32500 | 0.0014 | - |
| 2.4307 | 32550 | 0.0032 | - |
| 2.4345 | 32600 | 0.002 | - |
| 2.4382 | 32650 | 0.0014 | - |
| 2.4419 | 32700 | 0.0022 | - |
| 2.4457 | 32750 | 0.0018 | - |
| 2.4494 | 32800 | 0.0015 | - |
| 2.4531 | 32850 | 0.0023 | - |
| 2.4569 | 32900 | 0.0023 | - |
| 2.4606 | 32950 | 0.0018 | - |
| 2.4643 | 33000 | 0.002 | - |
| 2.4681 | 33050 | 0.0019 | - |
| 2.4718 | 33100 | 0.002 | - |
| 2.4755 | 33150 | 0.0023 | - |
| 2.4793 | 33200 | 0.0013 | - |
| 2.4830 | 33250 | 0.0015 | - |
| 2.4867 | 33300 | 0.001 | - |
| 2.4905 | 33350 | 0.0018 | - |
| 2.4942 | 33400 | 0.0015 | - |
| 2.4979 | 33450 | 0.0013 | - |
| 2.5017 | 33500 | 0.0017 | - |
| 2.5054 | 33550 | 0.002 | - |
| 2.5091 | 33600 | 0.0014 | - |
| 2.5129 | 33650 | 0.0012 | - |
| 2.5166 | 33700 | 0.0014 | - |
| 2.5203 | 33750 | 0.0024 | - |
| 2.5241 | 33800 | 0.0016 | - |
| 2.5278 | 33850 | 0.0017 | - |
| 2.5316 | 33900 | 0.0016 | - |
| 2.5353 | 33950 | 0.0015 | - |
| 2.5390 | 34000 | 0.0019 | - |
| 2.5428 | 34050 | 0.0012 | - |
| 2.5465 | 34100 | 0.0021 | - |
| 2.5502 | 34150 | 0.0019 | - |
| 2.5540 | 34200 | 0.0018 | - |
| 2.5577 | 34250 | 0.0028 | - |
| 2.5614 | 34300 | 0.0035 | - |
| 2.5652 | 34350 | 0.0034 | - |
| 2.5689 | 34400 | 0.0028 | - |
| 2.5726 | 34450 | 0.0034 | - |
| 2.5764 | 34500 | 0.003 | - |
| 2.5801 | 34550 | 0.0019 | - |
| 2.5838 | 34600 | 0.0026 | - |
| 2.5876 | 34650 | 0.0026 | - |
| 2.5913 | 34700 | 0.0029 | - |
| 2.5950 | 34750 | 0.0029 | - |
| 2.5988 | 34800 | 0.0025 | - |
| 2.6025 | 34850 | 0.0018 | - |
| 2.6062 | 34900 | 0.003 | - |
| 2.6100 | 34950 | 0.0021 | - |
| 2.6137 | 35000 | 0.0014 | - |
| 2.6174 | 35050 | 0.0013 | - |
| 2.6212 | 35100 | 0.0015 | - |
| 2.6249 | 35150 | 0.0016 | - |
| 2.6286 | 35200 | 0.0016 | - |
| 2.6324 | 35250 | 0.0016 | - |
| 2.6361 | 35300 | 0.0013 | - |
| 2.6398 | 35350 | 0.0019 | - |
| 2.6436 | 35400 | 0.0016 | - |
| 2.6473 | 35450 | 0.002 | - |
| 2.6510 | 35500 | 0.0019 | - |
| 2.6548 | 35550 | 0.0017 | - |
| 2.6585 | 35600 | 0.0016 | - |
| 2.6622 | 35650 | 0.0011 | - |
| 2.6660 | 35700 | 0.0022 | - |
| 2.6697 | 35750 | 0.0015 | - |
| 2.6734 | 35800 | 0.0012 | - |
| 2.6772 | 35850 | 0.0017 | - |
| 2.6809 | 35900 | 0.002 | - |
| 2.6846 | 35950 | 0.0013 | - |
| 2.6884 | 36000 | 0.0015 | - |
| 2.6921 | 36050 | 0.0014 | - |
| 2.6958 | 36100 | 0.0014 | - |
| 2.6996 | 36150 | 0.0021 | - |
| 2.7033 | 36200 | 0.0021 | - |
| 2.7070 | 36250 | 0.0015 | - |
| 2.7108 | 36300 | 0.001 | - |
| 2.7145 | 36350 | 0.0011 | - |
| 2.7182 | 36400 | 0.0013 | - |
| 2.7220 | 36450 | 0.0021 | - |
| 2.7257 | 36500 | 0.001 | - |
| 2.7294 | 36550 | 0.0016 | - |
| 2.7332 | 36600 | 0.0018 | - |
| 2.7369 | 36650 | 0.001 | - |
| 2.7406 | 36700 | 0.0014 | - |
| 2.7444 | 36750 | 0.002 | - |
| 2.7481 | 36800 | 0.0032 | - |
| 2.7518 | 36850 | 0.0011 | - |
| 2.7556 | 36900 | 0.0018 | - |
| 2.7593 | 36950 | 0.0024 | - |
| 2.7630 | 37000 | 0.0015 | - |
| 2.7668 | 37050 | 0.0023 | - |
| 2.7705 | 37100 | 0.0019 | - |
| 2.7743 | 37150 | 0.0015 | - |
| 2.7780 | 37200 | 0.0012 | - |
| 2.7817 | 37250 | 0.0009 | - |
| 2.7855 | 37300 | 0.0013 | - |
| 2.7892 | 37350 | 0.0016 | - |
| 2.7929 | 37400 | 0.0018 | - |
| 2.7967 | 37450 | 0.0026 | - |
| 2.8004 | 37500 | 0.0016 | - |
| 2.8041 | 37550 | 0.0017 | - |
| 2.8079 | 37600 | 0.0022 | - |
| 2.8116 | 37650 | 0.0025 | - |
| 2.8153 | 37700 | 0.0013 | - |
| 2.8191 | 37750 | 0.0022 | - |
| 2.8228 | 37800 | 0.0018 | - |
| 2.8265 | 37850 | 0.002 | - |
| 2.8303 | 37900 | 0.0018 | - |
| 2.8340 | 37950 | 0.0031 | - |
| 2.8377 | 38000 | 0.0019 | - |
| 2.8415 | 38050 | 0.0017 | - |
| 2.8452 | 38100 | 0.0024 | - |
| 2.8489 | 38150 | 0.0016 | - |
| 2.8527 | 38200 | 0.0019 | - |
| 2.8564 | 38250 | 0.0025 | - |
| 2.8601 | 38300 | 0.0025 | - |
| 2.8639 | 38350 | 0.0024 | - |
| 2.8676 | 38400 | 0.002 | - |
| 2.8713 | 38450 | 0.0018 | - |
| 2.8751 | 38500 | 0.0013 | - |
| 2.8788 | 38550 | 0.0011 | - |
| 2.8825 | 38600 | 0.002 | - |
| 2.8863 | 38650 | 0.0014 | - |
| 2.8900 | 38700 | 0.0011 | - |
| 2.8937 | 38750 | 0.0018 | - |
| 2.8975 | 38800 | 0.0027 | - |
| 2.9012 | 38850 | 0.0011 | - |
| 2.9049 | 38900 | 0.001 | - |
| 2.9087 | 38950 | 0.0012 | - |
| 2.9124 | 39000 | 0.0016 | - |
| 2.9161 | 39050 | 0.0011 | - |
| 2.9199 | 39100 | 0.0016 | - |
| 2.9236 | 39150 | 0.0018 | - |
| 2.9273 | 39200 | 0.0017 | - |
| 2.9311 | 39250 | 0.0016 | - |
| 2.9348 | 39300 | 0.0029 | - |
| 2.9385 | 39350 | 0.0011 | - |
| 2.9423 | 39400 | 0.0015 | - |
| 2.9460 | 39450 | 0.0017 | - |
| 2.9497 | 39500 | 0.0022 | - |
| 2.9535 | 39550 | 0.0012 | - |
| 2.9572 | 39600 | 0.0018 | - |
| 2.9609 | 39650 | 0.0015 | - |
| 2.9647 | 39700 | 0.0015 | - |
| 2.9684 | 39750 | 0.0009 | - |
| 2.9721 | 39800 | 0.0015 | - |
| 2.9759 | 39850 | 0.0009 | - |
| 2.9796 | 39900 | 0.0011 | - |
| 2.9833 | 39950 | 0.0008 | - |
| 2.9871 | 40000 | 0.001 | - |
| 2.9908 | 40050 | 0.0011 | - |
| 2.9945 | 40100 | 0.0012 | - |
| 2.9983 | 40150 | 0.0014 | - |
| 3.0020 | 40200 | 0.0014 | - |
| 3.0058 | 40250 | 0.0015 | - |
| 3.0095 | 40300 | 0.0014 | - |
| 3.0132 | 40350 | 0.0009 | - |
| 3.0170 | 40400 | 0.0014 | - |
| 3.0207 | 40450 | 0.0009 | - |
| 3.0244 | 40500 | 0.0014 | - |
| 3.0282 | 40550 | 0.0014 | - |
| 3.0319 | 40600 | 0.0011 | - |
| 3.0356 | 40650 | 0.0017 | - |
| 3.0394 | 40700 | 0.0025 | - |
| 3.0431 | 40750 | 0.0036 | - |
| 3.0468 | 40800 | 0.0018 | - |
| 3.0506 | 40850 | 0.001 | - |
| 3.0543 | 40900 | 0.0021 | - |
| 3.0580 | 40950 | 0.0023 | - |
| 3.0618 | 41000 | 0.0019 | - |
| 3.0655 | 41050 | 0.0018 | - |
| 3.0692 | 41100 | 0.0021 | - |
| 3.0730 | 41150 | 0.0018 | - |
| 3.0767 | 41200 | 0.0018 | - |
| 3.0804 | 41250 | 0.0008 | - |
| 3.0842 | 41300 | 0.0019 | - |
| 3.0879 | 41350 | 0.0007 | - |
| 3.0916 | 41400 | 0.0006 | - |
| 3.0954 | 41450 | 0.0009 | - |
| 3.0991 | 41500 | 0.0006 | - |
| 3.1028 | 41550 | 0.0005 | - |
| 3.1066 | 41600 | 0.0013 | - |
| 3.1103 | 41650 | 0.0006 | - |
| 3.1140 | 41700 | 0.0006 | - |
| 3.1178 | 41750 | 0.0009 | - |
| 3.1215 | 41800 | 0.0011 | - |
| 3.1252 | 41850 | 0.0007 | - |
| 3.1290 | 41900 | 0.0008 | - |
| 3.1327 | 41950 | 0.0008 | - |
| 3.1364 | 42000 | 0.0008 | - |
| 3.1402 | 42050 | 0.0006 | - |
| 3.1439 | 42100 | 0.0005 | - |
| 3.1476 | 42150 | 0.0005 | - |
| 3.1514 | 42200 | 0.0007 | - |
| 3.1551 | 42250 | 0.001 | - |
| 3.1588 | 42300 | 0.0011 | - |
| 3.1626 | 42350 | 0.0007 | - |
| 3.1663 | 42400 | 0.001 | - |
| 3.1700 | 42450 | 0.0007 | - |
| 3.1738 | 42500 | 0.0005 | - |
| 3.1775 | 42550 | 0.001 | - |
| 3.1812 | 42600 | 0.0004 | - |
| 3.1850 | 42650 | 0.0006 | - |
| 3.1887 | 42700 | 0.0007 | - |
| 3.1924 | 42750 | 0.0007 | - |
| 3.1962 | 42800 | 0.001 | - |
| 3.1999 | 42850 | 0.0014 | - |
| 3.2036 | 42900 | 0.0029 | - |
| 3.2074 | 42950 | 0.0047 | - |
| 3.2111 | 43000 | 0.0034 | - |
| 3.2148 | 43050 | 0.0029 | - |
| 3.2186 | 43100 | 0.0021 | - |
| 3.2223 | 43150 | 0.0015 | - |
| 3.2260 | 43200 | 0.0016 | - |
| 3.2298 | 43250 | 0.0015 | - |
| 3.2335 | 43300 | 0.0012 | - |
| 3.2372 | 43350 | 0.0012 | - |
| 3.2410 | 43400 | 0.0017 | - |
| 3.2447 | 43450 | 0.0018 | - |
| 3.2485 | 43500 | 0.0011 | - |
| 3.2522 | 43550 | 0.0024 | - |
| 3.2559 | 43600 | 0.002 | - |
| 3.2597 | 43650 | 0.0014 | - |
| 3.2634 | 43700 | 0.0024 | - |
| 3.2671 | 43750 | 0.0019 | - |
| 3.2709 | 43800 | 0.0006 | - |
| 3.2746 | 43850 | 0.0013 | - |
| 3.2783 | 43900 | 0.0008 | - |
| 3.2821 | 43950 | 0.0018 | - |
| 3.2858 | 44000 | 0.0012 | - |
| 3.2895 | 44050 | 0.0013 | - |
| 3.2933 | 44100 | 0.0013 | - |
| 3.2970 | 44150 | 0.0009 | - |
| 3.3007 | 44200 | 0.0018 | - |
| 3.3045 | 44250 | 0.0005 | - |
| 3.3082 | 44300 | 0.0018 | - |
| 3.3119 | 44350 | 0.0007 | - |
| 3.3157 | 44400 | 0.0006 | - |
| 3.3194 | 44450 | 0.0013 | - |
| 3.3231 | 44500 | 0.0013 | - |
| 3.3269 | 44550 | 0.0014 | - |
| 3.3306 | 44600 | 0.0019 | - |
| 3.3343 | 44650 | 0.0007 | - |
| 3.3381 | 44700 | 0.0016 | - |
| 3.3418 | 44750 | 0.0014 | - |
| 3.3455 | 44800 | 0.0008 | - |
| 3.3493 | 44850 | 0.0002 | - |
| 3.3530 | 44900 | 0.0008 | - |
| 3.3567 | 44950 | 0.0012 | - |
| 3.3605 | 45000 | 0.0009 | - |
| 3.3642 | 45050 | 0.0014 | - |
| 3.3679 | 45100 | 0.0007 | - |
| 3.3717 | 45150 | 0.0004 | - |
| 3.3754 | 45200 | 0.0007 | - |
| 3.3791 | 45250 | 0.0013 | - |
| 3.3829 | 45300 | 0.0009 | - |
| 3.3866 | 45350 | 0.0014 | - |
| 3.3903 | 45400 | 0.0014 | - |
| 3.3941 | 45450 | 0.0016 | - |
| 3.3978 | 45500 | 0.0011 | - |
| 3.4015 | 45550 | 0.0007 | - |
| 3.4053 | 45600 | 0.002 | - |
| 3.4090 | 45650 | 0.0028 | - |
| 3.4127 | 45700 | 0.0025 | - |
| 3.4165 | 45750 | 0.0012 | - |
| 3.4202 | 45800 | 0.001 | - |
| 3.4239 | 45850 | 0.0006 | - |
| 3.4277 | 45900 | 0.0016 | - |
| 3.4314 | 45950 | 0.0025 | - |
| 3.4351 | 46000 | 0.0011 | - |
| 3.4389 | 46050 | 0.002 | - |
| 3.4426 | 46100 | 0.0019 | - |
| 3.4463 | 46150 | 0.0016 | - |
| 3.4501 | 46200 | 0.0019 | - |
| 3.4538 | 46250 | 0.0013 | - |
| 3.4575 | 46300 | 0.0017 | - |
| 3.4613 | 46350 | 0.0011 | - |
| 3.4650 | 46400 | 0.0011 | - |
| 3.4687 | 46450 | 0.0011 | - |
| 3.4725 | 46500 | 0.0008 | - |
| 3.4762 | 46550 | 0.0014 | - |
| 3.4799 | 46600 | 0.0009 | - |
| 3.4837 | 46650 | 0.001 | - |
| 3.4874 | 46700 | 0.0014 | - |
| 3.4912 | 46750 | 0.0007 | - |
| 3.4949 | 46800 | 0.0013 | - |
| 3.4986 | 46850 | 0.0018 | - |
| 3.5024 | 46900 | 0.0014 | - |
| 3.5061 | 46950 | 0.0011 | - |
| 3.5098 | 47000 | 0.0012 | - |
| 3.5136 | 47050 | 0.0008 | - |
| 3.5173 | 47100 | 0.0007 | - |
| 3.5210 | 47150 | 0.0011 | - |
| 3.5248 | 47200 | 0.0016 | - |
| 3.5285 | 47250 | 0.0008 | - |
| 3.5322 | 47300 | 0.0003 | - |
| 3.5360 | 47350 | 0.0009 | - |
| 3.5397 | 47400 | 0.001 | - |
| 3.5434 | 47450 | 0.0008 | - |
| 3.5472 | 47500 | 0.0013 | - |
| 3.5509 | 47550 | 0.0012 | - |
| 3.5546 | 47600 | 0.0016 | - |
| 3.5584 | 47650 | 0.0014 | - |
| 3.5621 | 47700 | 0.0022 | - |
| 3.5658 | 47750 | 0.0018 | - |
| 3.5696 | 47800 | 0.0017 | - |
| 3.5733 | 47850 | 0.0015 | - |
| 3.5770 | 47900 | 0.0018 | - |
| 3.5808 | 47950 | 0.0009 | - |
| 3.5845 | 48000 | 0.0014 | - |
| 3.5882 | 48050 | 0.0016 | - |
| 3.5920 | 48100 | 0.0011 | - |
| 3.5957 | 48150 | 0.0006 | - |
| 3.5994 | 48200 | 0.0012 | - |
| 3.6032 | 48250 | 0.0011 | - |
| 3.6069 | 48300 | 0.0016 | - |
| 3.6106 | 48350 | 0.0014 | - |
| 3.6144 | 48400 | 0.0012 | - |
| 3.6181 | 48450 | 0.0015 | - |
| 3.6218 | 48500 | 0.0008 | - |
| 3.6256 | 48550 | 0.0011 | - |
| 3.6293 | 48600 | 0.0009 | - |
| 3.6330 | 48650 | 0.0007 | - |
| 3.6368 | 48700 | 0.0011 | - |
| 3.6405 | 48750 | 0.001 | - |
| 3.6442 | 48800 | 0.0005 | - |
| 3.6480 | 48850 | 0.001 | - |
| 3.6517 | 48900 | 0.0007 | - |
| 3.6554 | 48950 | 0.0009 | - |
| 3.6592 | 49000 | 0.0006 | - |
| 3.6629 | 49050 | 0.0012 | - |
| 3.6666 | 49100 | 0.0014 | - |
| 3.6704 | 49150 | 0.0011 | - |
| 3.6741 | 49200 | 0.0003 | - |
| 3.6778 | 49250 | 0.0013 | - |
| 3.6816 | 49300 | 0.0004 | - |
| 3.6853 | 49350 | 0.0009 | - |
| 3.6890 | 49400 | 0.0012 | - |
| 3.6928 | 49450 | 0.0006 | - |
| 3.6965 | 49500 | 0.0009 | - |
| 3.7002 | 49550 | 0.0012 | - |
| 3.7040 | 49600 | 0.0009 | - |
| 3.7077 | 49650 | 0.0008 | - |
| 3.7114 | 49700 | 0.0009 | - |
| 3.7152 | 49750 | 0.0006 | - |
| 3.7189 | 49800 | 0.0009 | - |
| 3.7226 | 49850 | 0.0009 | - |
| 3.7264 | 49900 | 0.0014 | - |
| 3.7301 | 49950 | 0.0011 | - |
| 3.7339 | 50000 | 0.0011 | - |
| 3.7376 | 50050 | 0.0004 | - |
| 3.7413 | 50100 | 0.0009 | - |
| 3.7451 | 50150 | 0.0016 | - |
| 3.7488 | 50200 | 0.0009 | - |
| 3.7525 | 50250 | 0.0012 | - |
| 3.7563 | 50300 | 0.0008 | - |
| 3.7600 | 50350 | 0.0005 | - |
| 3.7637 | 50400 | 0.0011 | - |
| 3.7675 | 50450 | 0.0008 | - |
| 3.7712 | 50500 | 0.0009 | - |
| 3.7749 | 50550 | 0.0013 | - |
| 3.7787 | 50600 | 0.0008 | - |
| 3.7824 | 50650 | 0.001 | - |
| 3.7861 | 50700 | 0.0006 | - |
| 3.7899 | 50750 | 0.0008 | - |
| 3.7936 | 50800 | 0.0028 | - |
| 3.7973 | 50850 | 0.0027 | - |
| 3.8011 | 50900 | 0.0021 | - |
| 3.8048 | 50950 | 0.003 | - |
| 3.8085 | 51000 | 0.0022 | - |
| 3.8123 | 51050 | 0.0011 | - |
| 3.8160 | 51100 | 0.0013 | - |
| 3.8197 | 51150 | 0.0009 | - |
| 3.8235 | 51200 | 0.0008 | - |
| 3.8272 | 51250 | 0.0016 | - |
| 3.8309 | 51300 | 0.0017 | - |
| 3.8347 | 51350 | 0.0012 | - |
| 3.8384 | 51400 | 0.0005 | - |
| 3.8421 | 51450 | 0.0011 | - |
| 3.8459 | 51500 | 0.0012 | - |
| 3.8496 | 51550 | 0.0006 | - |
| 3.8533 | 51600 | 0.0009 | - |
| 3.8571 | 51650 | 0.0015 | - |
| 3.8608 | 51700 | 0.0006 | - |
| 3.8645 | 51750 | 0.0005 | - |
| 3.8683 | 51800 | 0.001 | - |
| 3.8720 | 51850 | 0.0009 | - |
| 3.8757 | 51900 | 0.0012 | - |
| 3.8795 | 51950 | 0.0004 | - |
| 3.8832 | 52000 | 0.002 | - |
| 3.8869 | 52050 | 0.001 | - |
| 3.8907 | 52100 | 0.0013 | - |
| 3.8944 | 52150 | 0.0017 | - |
| 3.8981 | 52200 | 0.0028 | - |
| 3.9019 | 52250 | 0.0027 | - |
| 3.9056 | 52300 | 0.0017 | - |
| 3.9093 | 52350 | 0.0017 | - |
| 3.9131 | 52400 | 0.0013 | - |
| 3.9168 | 52450 | 0.0013 | - |
| 3.9205 | 52500 | 0.0014 | - |
| 3.9243 | 52550 | 0.0009 | - |
| 3.9280 | 52600 | 0.001 | - |
| 3.9317 | 52650 | 0.0014 | - |
| 3.9355 | 52700 | 0.0014 | - |
| 3.9392 | 52750 | 0.001 | - |
| 3.9429 | 52800 | 0.001 | - |
| 3.9467 | 52850 | 0.0014 | - |
| 3.9504 | 52900 | 0.0018 | - |
| 3.9541 | 52950 | 0.0009 | - |
| 3.9579 | 53000 | 0.0012 | - |
| 3.9616 | 53050 | 0.0006 | - |
| 3.9653 | 53100 | 0.0015 | - |
| 3.9691 | 53150 | 0.0013 | - |
| 3.9728 | 53200 | 0.0013 | - |
| 3.9766 | 53250 | 0.0011 | - |
| 3.9803 | 53300 | 0.0014 | - |
| 3.9840 | 53350 | 0.0007 | - |
| 3.9878 | 53400 | 0.0007 | - |
| 3.9915 | 53450 | 0.0007 | - |
| 3.9952 | 53500 | 0.0004 | - |
| 3.9990 | 53550 | 0.0006 | - |
| 4.0027 | 53600 | 0.0011 | - |
| 4.0064 | 53650 | 0.0009 | - |
| 4.0102 | 53700 | 0.001 | - |
| 4.0139 | 53750 | 0.0014 | - |
| 4.0176 | 53800 | 0.002 | - |
| 4.0214 | 53850 | 0.0016 | - |
| 4.0251 | 53900 | 0.0021 | - |
| 4.0288 | 53950 | 0.0017 | - |
| 4.0326 | 54000 | 0.0009 | - |
| 4.0363 | 54050 | 0.0008 | - |
| 4.0400 | 54100 | 0.0012 | - |
| 4.0438 | 54150 | 0.0014 | - |
| 4.0475 | 54200 | 0.0008 | - |
| 4.0512 | 54250 | 0.0009 | - |
| 4.0550 | 54300 | 0.0014 | - |
| 4.0587 | 54350 | 0.001 | - |
| 4.0624 | 54400 | 0.0004 | - |
| 4.0662 | 54450 | 0.0003 | - |
| 4.0699 | 54500 | 0.0012 | - |
| 4.0736 | 54550 | 0.0006 | - |
| 4.0774 | 54600 | 0.0004 | - |
| 4.0811 | 54650 | 0.001 | - |
| 4.0848 | 54700 | 0.0006 | - |
| 4.0886 | 54750 | 0.0008 | - |
| 4.0923 | 54800 | 0.0012 | - |
| 4.0960 | 54850 | 0.0009 | - |
| 4.0998 | 54900 | 0.0013 | - |
| 4.1035 | 54950 | 0.0009 | - |
| 4.1072 | 55000 | 0.0005 | - |
| 4.1110 | 55050 | 0.0009 | - |
| 4.1147 | 55100 | 0.0008 | - |
| 4.1184 | 55150 | 0.0003 | - |
| 4.1222 | 55200 | 0.0007 | - |
| 4.1259 | 55250 | 0.0004 | - |
| 4.1296 | 55300 | 0.0009 | - |
| 4.1334 | 55350 | 0.001 | - |
| 4.1371 | 55400 | 0.0015 | - |
| 4.1408 | 55450 | 0.0016 | - |
| 4.1446 | 55500 | 0.0014 | - |
| 4.1483 | 55550 | 0.002 | - |
| 4.1520 | 55600 | 0.0014 | - |
| 4.1558 | 55650 | 0.0022 | - |
| 4.1595 | 55700 | 0.0007 | - |
| 4.1632 | 55750 | 0.0008 | - |
| 4.1670 | 55800 | 0.0011 | - |
| 4.1707 | 55850 | 0.0011 | - |
| 4.1744 | 55900 | 0.0009 | - |
| 4.1782 | 55950 | 0.0011 | - |
| 4.1819 | 56000 | 0.0009 | - |
| 4.1856 | 56050 | 0.0004 | - |
| 4.1894 | 56100 | 0.0012 | - |
| 4.1931 | 56150 | 0.001 | - |
| 4.1968 | 56200 | 0.001 | - |
| 4.2006 | 56250 | 0.0009 | - |
| 4.2043 | 56300 | 0.001 | - |
| 4.2081 | 56350 | 0.0007 | - |
| 4.2118 | 56400 | 0.0013 | - |
| 4.2155 | 56450 | 0.0012 | - |
| 4.2193 | 56500 | 0.0008 | - |
| 4.2230 | 56550 | 0.0005 | - |
| 4.2267 | 56600 | 0.0007 | - |
| 4.2305 | 56650 | 0.0007 | - |
| 4.2342 | 56700 | 0.001 | - |
| 4.2379 | 56750 | 0.0009 | - |
| 4.2417 | 56800 | 0.0005 | - |
| 4.2454 | 56850 | 0.0006 | - |
| 4.2491 | 56900 | 0.0007 | - |
| 4.2529 | 56950 | 0.0008 | - |
| 4.2566 | 57000 | 0.0006 | - |
| 4.2603 | 57050 | 0.0004 | - |
| 4.2641 | 57100 | 0.0008 | - |
| 4.2678 | 57150 | 0.0013 | - |
| 4.2715 | 57200 | 0.0003 | - |
| 4.2753 | 57250 | 0.0005 | - |
| 4.2790 | 57300 | 0.0005 | - |
| 4.2827 | 57350 | 0.0011 | - |
| 4.2865 | 57400 | 0.0007 | - |
| 4.2902 | 57450 | 0.0007 | - |
| 4.2939 | 57500 | 0.0013 | - |
| 4.2977 | 57550 | 0.0008 | - |
| 4.3014 | 57600 | 0.0007 | - |
| 4.3051 | 57650 | 0.0001 | - |
| 4.3089 | 57700 | 0.0007 | - |
| 4.3126 | 57750 | 0.0005 | - |
| 4.3163 | 57800 | 0.0002 | - |
| 4.3201 | 57850 | 0.0006 | - |
| 4.3238 | 57900 | 0.0003 | - |
| 4.3275 | 57950 | 0.0004 | - |
| 4.3313 | 58000 | 0.0007 | - |
| 4.3350 | 58050 | 0.0009 | - |
| 4.3387 | 58100 | 0.002 | - |
| 4.3425 | 58150 | 0.0013 | - |
| 4.3462 | 58200 | 0.0023 | - |
| 4.3499 | 58250 | 0.0016 | - |
| 4.3537 | 58300 | 0.0016 | - |
| 4.3574 | 58350 | 0.0008 | - |
| 4.3611 | 58400 | 0.0018 | - |
| 4.3649 | 58450 | 0.0009 | - |
| 4.3686 | 58500 | 0.0011 | - |
| 4.3723 | 58550 | 0.0009 | - |
| 4.3761 | 58600 | 0.001 | - |
| 4.3798 | 58650 | 0.0005 | - |
| 4.3835 | 58700 | 0.0017 | - |
| 4.3873 | 58750 | 0.001 | - |
| 4.3910 | 58800 | 0.001 | - |
| 4.3947 | 58850 | 0.0004 | - |
| 4.3985 | 58900 | 0.0011 | - |
| 4.4022 | 58950 | 0.0006 | - |
| 4.4059 | 59000 | 0.0005 | - |
| 4.4097 | 59050 | 0.0005 | - |
| 4.4134 | 59100 | 0.0002 | - |
| 4.4171 | 59150 | 0.0011 | - |
| 4.4209 | 59200 | 0.001 | - |
| 4.4246 | 59250 | 0.0005 | - |
| 4.4283 | 59300 | 0.0007 | - |
| 4.4321 | 59350 | 0.0006 | - |
| 4.4358 | 59400 | 0.0005 | - |
| 4.4395 | 59450 | 0.0007 | - |
| 4.4433 | 59500 | 0.0007 | - |
| 4.4470 | 59550 | 0.0012 | - |
| 4.4508 | 59600 | 0.0012 | - |
| 4.4545 | 59650 | 0.0013 | - |
| 4.4582 | 59700 | 0.001 | - |
| 4.4620 | 59750 | 0.0006 | - |
| 4.4657 | 59800 | 0.001 | - |
| 4.4694 | 59850 | 0.0005 | - |
| 4.4732 | 59900 | 0.0008 | - |
| 4.4769 | 59950 | 0.0008 | - |
| 4.4806 | 60000 | 0.0006 | - |
| 4.4844 | 60050 | 0.0008 | - |
| 4.4881 | 60100 | 0.0001 | - |
| 4.4918 | 60150 | 0.0011 | - |
| 4.4956 | 60200 | 0.0011 | - |
| 4.4993 | 60250 | 0.0014 | - |
| 4.5030 | 60300 | 0.0007 | - |
| 4.5068 | 60350 | 0.0011 | - |
| 4.5105 | 60400 | 0.0007 | - |
| 4.5142 | 60450 | 0.0009 | - |
| 4.5180 | 60500 | 0.0009 | - |
| 4.5217 | 60550 | 0.0004 | - |
| 4.5254 | 60600 | 0.0004 | - |
| 4.5292 | 60650 | 0.0007 | - |
| 4.5329 | 60700 | 0.0002 | - |
| 4.5366 | 60750 | 0.0008 | - |
| 4.5404 | 60800 | 0.001 | - |
| 4.5441 | 60850 | 0.001 | - |
| 4.5478 | 60900 | 0.0008 | - |
| 4.5516 | 60950 | 0.0009 | - |
| 4.5553 | 61000 | 0.0011 | - |
| 4.5590 | 61050 | 0.0008 | - |
| 4.5628 | 61100 | 0.001 | - |
| 4.5665 | 61150 | 0.0004 | - |
| 4.5702 | 61200 | 0.0009 | - |
| 4.5740 | 61250 | 0.001 | - |
| 4.5777 | 61300 | 0.0011 | - |
| 4.5814 | 61350 | 0.0007 | - |
| 4.5852 | 61400 | 0.0002 | - |
| 4.5889 | 61450 | 0.0004 | - |
| 4.5926 | 61500 | 0.0007 | - |
| 4.5964 | 61550 | 0.0006 | - |
| 4.6001 | 61600 | 0.0011 | - |
| 4.6038 | 61650 | 0.0007 | - |
| 4.6076 | 61700 | 0.0008 | - |
| 4.6113 | 61750 | 0.0011 | - |
| 4.6150 | 61800 | 0.0007 | - |
| 4.6188 | 61850 | 0.0005 | - |
| 4.6225 | 61900 | 0.0003 | - |
| 4.6262 | 61950 | 0.0007 | - |
| 4.6300 | 62000 | 0.0002 | - |
| 4.6337 | 62050 | 0.0008 | - |
| 4.6374 | 62100 | 0.0009 | - |
| 4.6412 | 62150 | 0.0002 | - |
| 4.6449 | 62200 | 0.0004 | - |
| 4.6486 | 62250 | 0.0005 | - |
| 4.6524 | 62300 | 0.0003 | - |
| 4.6561 | 62350 | 0.0005 | - |
| 4.6598 | 62400 | 0.0006 | - |
| 4.6636 | 62450 | 0.0008 | - |
| 4.6673 | 62500 | 0.0004 | - |
| 4.6710 | 62550 | 0.0007 | - |
| 4.6748 | 62600 | 0.001 | - |
| 4.6785 | 62650 | 0.0002 | - |
| 4.6822 | 62700 | 0.0005 | - |
| 4.6860 | 62750 | 0.0006 | - |
| 4.6897 | 62800 | 0.0008 | - |
| 4.6935 | 62850 | 0.001 | - |
| 4.6972 | 62900 | 0.0029 | - |
| 4.7009 | 62950 | 0.0019 | - |
| 4.7047 | 63000 | 0.0016 | - |
| 4.7084 | 63050 | 0.0013 | - |
| 4.7121 | 63100 | 0.0014 | - |
| 4.7159 | 63150 | 0.0023 | - |
| 4.7196 | 63200 | 0.0009 | - |
| 4.7233 | 63250 | 0.0018 | - |
| 4.7271 | 63300 | 0.0021 | - |
| 4.7308 | 63350 | 0.0008 | - |
| 4.7345 | 63400 | 0.0012 | - |
| 4.7383 | 63450 | 0.0017 | - |
| 4.7420 | 63500 | 0.0006 | - |
| 4.7457 | 63550 | 0.0018 | - |
| 4.7495 | 63600 | 0.0015 | - |
| 4.7532 | 63650 | 0.0014 | - |
| 4.7569 | 63700 | 0.0009 | - |
| 4.7607 | 63750 | 0.0009 | - |
| 4.7644 | 63800 | 0.0006 | - |
| 4.7681 | 63850 | 0.0006 | - |
| 4.7719 | 63900 | 0.0013 | - |
| 4.7756 | 63950 | 0.001 | - |
| 4.7793 | 64000 | 0.0008 | - |
| 4.7831 | 64050 | 0.0005 | - |
| 4.7868 | 64100 | 0.0017 | - |
| 4.7905 | 64150 | 0.0006 | - |
| 4.7943 | 64200 | 0.0012 | - |
| 4.7980 | 64250 | 0.0005 | - |
| 4.8017 | 64300 | 0.0005 | - |
| 4.8055 | 64350 | 0.0006 | - |
| 4.8092 | 64400 | 0.0009 | - |
| 4.8129 | 64450 | 0.0009 | - |
| 4.8167 | 64500 | 0.0006 | - |
| 4.8204 | 64550 | 0.001 | - |
| 4.8241 | 64600 | 0.001 | - |
| 4.8279 | 64650 | 0.0001 | - |
| 4.8316 | 64700 | 0.0005 | - |
| 4.8353 | 64750 | 0.0004 | - |
| 4.8391 | 64800 | 0.0006 | - |
| 4.8428 | 64850 | 0.0004 | - |
| 4.8465 | 64900 | 0.0004 | - |
| 4.8503 | 64950 | 0.0005 | - |
| 4.8540 | 65000 | 0.0006 | - |
| 4.8577 | 65050 | 0.0007 | - |
| 4.8615 | 65100 | 0.0003 | - |
| 4.8652 | 65150 | 0.0005 | - |
| 4.8689 | 65200 | 0.0007 | - |
| 4.8727 | 65250 | 0.0008 | - |
| 4.8764 | 65300 | 0.0005 | - |
| 4.8801 | 65350 | 0.0006 | - |
| 4.8839 | 65400 | 0.001 | - |
| 4.8876 | 65450 | 0.0001 | - |
| 4.8913 | 65500 | 0.0004 | - |
| 4.8951 | 65550 | 0.0007 | - |
| 4.8988 | 65600 | 0.0006 | - |
| 4.9025 | 65650 | 0.0006 | - |
| 4.9063 | 65700 | 0.0005 | - |
| 4.9100 | 65750 | 0.0006 | - |
| 4.9137 | 65800 | 0.0008 | - |
| 4.9175 | 65850 | 0.0015 | - |
| 4.9212 | 65900 | 0.0019 | - |
| 4.9249 | 65950 | 0.0011 | - |
| 4.9287 | 66000 | 0.0014 | - |
| 4.9324 | 66050 | 0.0008 | - |
| 4.9362 | 66100 | 0.0011 | - |
| 4.9399 | 66150 | 0.0007 | - |
| 4.9436 | 66200 | 0.001 | - |
| 4.9474 | 66250 | 0.0005 | - |
| 4.9511 | 66300 | 0.0007 | - |
| 4.9548 | 66350 | 0.0011 | - |
| 4.9586 | 66400 | 0.0009 | - |
| 4.9623 | 66450 | 0.0008 | - |
| 4.9660 | 66500 | 0.0009 | - |
| 4.9698 | 66550 | 0.0006 | - |
| 4.9735 | 66600 | 0.0006 | - |
| 4.9772 | 66650 | 0.0002 | - |
| 4.9810 | 66700 | 0.0006 | - |
| 4.9847 | 66750 | 0.0004 | - |
| 4.9884 | 66800 | 0.0007 | - |
| 4.9922 | 66850 | 0.0009 | - |
| 4.9959 | 66900 | 0.0008 | - |
| 4.9996 | 66950 | 0.0003 | - |
| 5.0034 | 67000 | 0.0008 | - |
| 5.0071 | 67050 | 0.001 | - |
| 5.0108 | 67100 | 0.0007 | - |
| 5.0146 | 67150 | 0.0013 | - |
| 5.0183 | 67200 | 0.0011 | - |
| 5.0220 | 67250 | 0.0003 | - |
| 5.0258 | 67300 | 0.0004 | - |
| 5.0295 | 67350 | 0.0009 | - |
| 5.0332 | 67400 | 0.0005 | - |
| 5.0370 | 67450 | 0.0001 | - |
| 5.0407 | 67500 | 0.0003 | - |
| 5.0444 | 67550 | 0.0007 | - |
| 5.0482 | 67600 | 0.0007 | - |
| 5.0519 | 67650 | 0.0011 | - |
| 5.0556 | 67700 | 0.0007 | - |
| 5.0594 | 67750 | 0.0006 | - |
| 5.0631 | 67800 | 0.0006 | - |
| 5.0668 | 67850 | 0.0005 | - |
| 5.0706 | 67900 | 0.0006 | - |
| 5.0743 | 67950 | 0.0006 | - |
| 5.0780 | 68000 | 0.0003 | - |
| 5.0818 | 68050 | 0.0009 | - |
| 5.0855 | 68100 | 0.0007 | - |
| 5.0892 | 68150 | 0.0006 | - |
| 5.0930 | 68200 | 0.0003 | - |
| 5.0967 | 68250 | 0.0016 | - |
| 5.1004 | 68300 | 0.0006 | - |
| 5.1042 | 68350 | 0.0006 | - |
| 5.1079 | 68400 | 0.0005 | - |
| 5.1116 | 68450 | 0.0003 | - |
| 5.1154 | 68500 | 0.0006 | - |
| 5.1191 | 68550 | 0.0008 | - |
| 5.1228 | 68600 | 0.0005 | - |
| 5.1266 | 68650 | 0.0011 | - |
| 5.1303 | 68700 | 0.0018 | - |
| 5.1340 | 68750 | 0.0013 | - |
| 5.1378 | 68800 | 0.0017 | - |
| 5.1415 | 68850 | 0.0009 | - |
| 5.1452 | 68900 | 0.0009 | - |
| 5.1490 | 68950 | 0.0018 | - |
| 5.1527 | 69000 | 0.0012 | - |
| 5.1564 | 69050 | 0.0012 | - |
| 5.1602 | 69100 | 0.0015 | - |
| 5.1639 | 69150 | 0.0006 | - |
| 5.1676 | 69200 | 0.0008 | - |
| 5.1714 | 69250 | 0.0022 | - |
| 5.1751 | 69300 | 0.0013 | - |
| 5.1789 | 69350 | 0.0008 | - |
| 5.1826 | 69400 | 0.0009 | - |
| 5.1863 | 69450 | 0.0006 | - |
| 5.1901 | 69500 | 0.0012 | - |
| 5.1938 | 69550 | 0.0011 | - |
| 5.1975 | 69600 | 0.0007 | - |
| 5.2013 | 69650 | 0.0005 | - |
| 5.2050 | 69700 | 0.0008 | - |
| 5.2087 | 69750 | 0.0009 | - |
| 5.2125 | 69800 | 0.0005 | - |
| 5.2162 | 69850 | 0.0008 | - |
| 5.2199 | 69900 | 0.0009 | - |
| 5.2237 | 69950 | 0.0008 | - |
| 5.2274 | 70000 | 0.0006 | - |
| 5.2311 | 70050 | 0.0004 | - |
| 5.2349 | 70100 | 0.0009 | - |
| 5.2386 | 70150 | 0.0009 | - |
| 5.2423 | 70200 | 0.0008 | - |
| 5.2461 | 70250 | 0.0006 | - |
| 5.2498 | 70300 | 0.0003 | - |
| 5.2535 | 70350 | 0.0014 | - |
| 5.2573 | 70400 | 0.0006 | - |
| 5.2610 | 70450 | 0.0005 | - |
| 5.2647 | 70500 | 0.0008 | - |
| 5.2685 | 70550 | 0.0007 | - |
| 5.2722 | 70600 | 0.0001 | - |
| 5.2759 | 70650 | 0.0007 | - |
| 5.2797 | 70700 | 0.0005 | - |
| 5.2834 | 70750 | 0.0007 | - |
| 5.2871 | 70800 | 0.0004 | - |
| 5.2909 | 70850 | 0.0001 | - |
| 5.2946 | 70900 | 0.0005 | - |
| 5.2983 | 70950 | 0.0003 | - |
| 5.3021 | 71000 | 0.0008 | - |
| 5.3058 | 71050 | 0.0007 | - |
| 5.3095 | 71100 | 0.0002 | - |
| 5.3133 | 71150 | 0.0009 | - |
| 5.3170 | 71200 | 0.0006 | - |
| 5.3207 | 71250 | 0.0008 | - |
| 5.3245 | 71300 | 0.001 | - |
| 5.3282 | 71350 | 0.0009 | - |
| 5.3319 | 71400 | 0.0005 | - |
| 5.3357 | 71450 | 0.0011 | - |
| 5.3394 | 71500 | 0.0012 | - |
| 5.3431 | 71550 | 0.0011 | - |
| 5.3469 | 71600 | 0.0012 | - |
| 5.3506 | 71650 | 0.0007 | - |
| 5.3543 | 71700 | 0.0009 | - |
| 5.3581 | 71750 | 0.0011 | - |
| 5.3618 | 71800 | 0.0013 | - |
| 5.3655 | 71850 | 0.0008 | - |
| 5.3693 | 71900 | 0.0011 | - |
| 5.3730 | 71950 | 0.0007 | - |
| 5.3767 | 72000 | 0.0008 | - |
| 5.3805 | 72050 | 0.0011 | - |
| 5.3842 | 72100 | 0.001 | - |
| 5.3879 | 72150 | 0.0006 | - |
| 5.3917 | 72200 | 0.0008 | - |
| 5.3954 | 72250 | 0.0004 | - |
| 5.3991 | 72300 | 0.0007 | - |
| 5.4029 | 72350 | 0.001 | - |
| 5.4066 | 72400 | 0.0007 | - |
| 5.4104 | 72450 | 0.0006 | - |
| 5.4141 | 72500 | 0.0008 | - |
| 5.4178 | 72550 | 0.0009 | - |
| 5.4216 | 72600 | 0.0005 | - |
| 5.4253 | 72650 | 0.001 | - |
| 5.4290 | 72700 | 0.0009 | - |
| 5.4328 | 72750 | 0.0006 | - |
| 5.4365 | 72800 | 0.0011 | - |
| 5.4402 | 72850 | 0.0003 | - |
| 5.4440 | 72900 | 0.001 | - |
| 5.4477 | 72950 | 0.0007 | - |
| 5.4514 | 73000 | 0.0009 | - |
| 5.4552 | 73050 | 0.0007 | - |
| 5.4589 | 73100 | 0.0003 | - |
| 5.4626 | 73150 | 0.0003 | - |
| 5.4664 | 73200 | 0.0003 | - |
| 5.4701 | 73250 | 0.0006 | - |
| 5.4738 | 73300 | 0.0004 | - |
| 5.4776 | 73350 | 0.0006 | - |
| 5.4813 | 73400 | 0.0007 | - |
| 5.4850 | 73450 | 0.0005 | - |
| 5.4888 | 73500 | 0.0006 | - |
| 5.4925 | 73550 | 0.0008 | - |
| 5.4962 | 73600 | 0.0009 | - |
| 5.5000 | 73650 | 0.0012 | - |
| 5.5037 | 73700 | 0.0008 | - |
| 5.5074 | 73750 | 0.0011 | - |
| 5.5112 | 73800 | 0.0013 | - |
| 5.5149 | 73850 | 0.0008 | - |
| 5.5186 | 73900 | 0.001 | - |
| 5.5224 | 73950 | 0.0012 | - |
| 5.5261 | 74000 | 0.0005 | - |
| 5.5298 | 74050 | 0.0013 | - |
| 5.5336 | 74100 | 0.0007 | - |
| 5.5373 | 74150 | 0.0006 | - |
| 5.5410 | 74200 | 0.0008 | - |
| 5.5448 | 74250 | 0.0003 | - |
| 5.5485 | 74300 | 0.001 | - |
| 5.5522 | 74350 | 0.0009 | - |
| 5.5560 | 74400 | 0.0013 | - |
| 5.5597 | 74450 | 0.0009 | - |
| 5.5634 | 74500 | 0.0011 | - |
| 5.5672 | 74550 | 0.0014 | - |
| 5.5709 | 74600 | 0.0005 | - |
| 5.5746 | 74650 | 0.001 | - |
| 5.5784 | 74700 | 0.0007 | - |
| 5.5821 | 74750 | 0.0006 | - |
| 5.5858 | 74800 | 0.0011 | - |
| 5.5896 | 74850 | 0.0009 | - |
| 5.5933 | 74900 | 0.0008 | - |
| 5.5970 | 74950 | 0.0011 | - |
| 5.6008 | 75000 | 0.0015 | - |
| 5.6045 | 75050 | 0.0009 | - |
| 5.6082 | 75100 | 0.0008 | - |
| 5.6120 | 75150 | 0.0007 | - |
| 5.6157 | 75200 | 0.0005 | - |
| 5.6194 | 75250 | 0.0003 | - |
| 5.6232 | 75300 | 0.0006 | - |
| 5.6269 | 75350 | 0.0006 | - |
| 5.6306 | 75400 | 0.0008 | - |
| 5.6344 | 75450 | 0.0008 | - |
| 5.6381 | 75500 | 0.0009 | - |
| 5.6418 | 75550 | 0.0011 | - |
| 5.6456 | 75600 | 0.0005 | - |
| 5.6493 | 75650 | 0.0005 | - |
| 5.6531 | 75700 | 0.001 | - |
| 5.6568 | 75750 | 0.0005 | - |
| 5.6605 | 75800 | 0.0002 | - |
| 5.6643 | 75850 | 0.0004 | - |
| 5.6680 | 75900 | 0.0007 | - |
| 5.6717 | 75950 | 0.0007 | - |
| 5.6755 | 76000 | 0.0005 | - |
| 5.6792 | 76050 | 0.0004 | - |
| 5.6829 | 76100 | 0.0006 | - |
| 5.6867 | 76150 | 0.0003 | - |
| 5.6904 | 76200 | 0.0008 | - |
| 5.6941 | 76250 | 0.0009 | - |
| 5.6979 | 76300 | 0.0002 | - |
| 5.7016 | 76350 | 0.0001 | - |
| 5.7053 | 76400 | 0.0009 | - |
| 5.7091 | 76450 | 0.0006 | - |
| 5.7128 | 76500 | 0.0006 | - |
| 5.7165 | 76550 | 0.0001 | - |
| 5.7203 | 76600 | 0.0002 | - |
| 5.7240 | 76650 | 0.0012 | - |
| 5.7277 | 76700 | 0.0011 | - |
| 5.7315 | 76750 | 0.0008 | - |
| 5.7352 | 76800 | 0.0006 | - |
| 5.7389 | 76850 | 0.0001 | - |
| 5.7427 | 76900 | 0.0002 | - |
| 5.7464 | 76950 | 0.0004 | - |
| 5.7501 | 77000 | 0.0004 | - |
| 5.7539 | 77050 | 0.0002 | - |
| 5.7576 | 77100 | 0.0003 | - |
| 5.7613 | 77150 | 0.0006 | - |
| 5.7651 | 77200 | 0.0001 | - |
| 5.7688 | 77250 | 0.0009 | - |
| 5.7725 | 77300 | 0.0006 | - |
| 5.7763 | 77350 | 0.0016 | - |
| 5.7800 | 77400 | 0.0016 | - |
| 5.7837 | 77450 | 0.0011 | - |
| 5.7875 | 77500 | 0.0012 | - |
| 5.7912 | 77550 | 0.0015 | - |
| 5.7949 | 77600 | 0.0017 | - |
| 5.7987 | 77650 | 0.0018 | - |
| 5.8024 | 77700 | 0.0011 | - |
| 5.8061 | 77750 | 0.0005 | - |
| 5.8099 | 77800 | 0.0009 | - |
| 5.8136 | 77850 | 0.0009 | - |
| 5.8173 | 77900 | 0.0011 | - |
| 5.8211 | 77950 | 0.0013 | - |
| 5.8248 | 78000 | 0.0008 | - |
| 5.8285 | 78050 | 0.0009 | - |
| 5.8323 | 78100 | 0.0013 | - |
| 5.8360 | 78150 | 0.001 | - |
| 5.8397 | 78200 | 0.001 | - |
| 5.8435 | 78250 | 0.0007 | - |
| 5.8472 | 78300 | 0.0014 | - |
| 5.8509 | 78350 | 0.0013 | - |
| 5.8547 | 78400 | 0.001 | - |
| 5.8584 | 78450 | 0.0011 | - |
| 5.8621 | 78500 | 0.0007 | - |
| 5.8659 | 78550 | 0.0007 | - |
| 5.8696 | 78600 | 0.0013 | - |
| 5.8733 | 78650 | 0.0004 | - |
| 5.8771 | 78700 | 0.0011 | - |
| 5.8808 | 78750 | 0.0009 | - |
| 5.8845 | 78800 | 0.0007 | - |
| 5.8883 | 78850 | 0.001 | - |
| 5.8920 | 78900 | 0.001 | - |
| 5.8958 | 78950 | 0.0006 | - |
| 5.8995 | 79000 | 0.0009 | - |
| 5.9032 | 79050 | 0.0008 | - |
| 5.9070 | 79100 | 0.0012 | - |
| 5.9107 | 79150 | 0.0007 | - |
| 5.9144 | 79200 | 0.0003 | - |
| 5.9182 | 79250 | 0.0008 | - |
| 5.9219 | 79300 | 0.0014 | - |
| 5.9256 | 79350 | 0.0006 | - |
| 5.9294 | 79400 | 0.0005 | - |
| 5.9331 | 79450 | 0.0007 | - |
| 5.9368 | 79500 | 0.0007 | - |
| 5.9406 | 79550 | 0.0001 | - |
| 5.9443 | 79600 | 0.0005 | - |
| 5.9480 | 79650 | 0.0004 | - |
| 5.9518 | 79700 | 0.0007 | - |
| 5.9555 | 79750 | 0.0006 | - |
| 5.9592 | 79800 | 0.0005 | - |
| 5.9630 | 79850 | 0.0009 | - |
| 5.9667 | 79900 | 0.0011 | - |
| 5.9704 | 79950 | 0.0005 | - |
| 5.9742 | 80000 | 0.0008 | - |
| 5.9779 | 80050 | 0.0004 | - |
| 5.9816 | 80100 | 0.0008 | - |
| 5.9854 | 80150 | 0.0012 | - |
| 5.9891 | 80200 | 0.0005 | - |
| 5.9928 | 80250 | 0.0009 | - |
| 5.9966 | 80300 | 0.0015 | - |
| 6.0003 | 80350 | 0.0008 | - |
| 6.0040 | 80400 | 0.0009 | - |
| 6.0078 | 80450 | 0.0009 | - |
| 6.0115 | 80500 | 0.0007 | - |
| 6.0152 | 80550 | 0.0014 | - |
| 6.0190 | 80600 | 0.0008 | - |
| 6.0227 | 80650 | 0.0012 | - |
| 6.0264 | 80700 | 0.0005 | - |
| 6.0302 | 80750 | 0.0002 | - |
| 6.0339 | 80800 | 0.0006 | - |
| 6.0376 | 80850 | 0.0006 | - |
| 6.0414 | 80900 | 0.0006 | - |
| 6.0451 | 80950 | 0.0008 | - |
| 6.0488 | 81000 | 0.0007 | - |
| 6.0526 | 81050 | 0.0006 | - |
| 6.0563 | 81100 | 0.0001 | - |
| 6.0600 | 81150 | 0.0007 | - |
| 6.0638 | 81200 | 0.0004 | - |
| 6.0675 | 81250 | 0.0003 | - |
| 6.0712 | 81300 | 0.0002 | - |
| 6.0750 | 81350 | 0.0006 | - |
| 6.0787 | 81400 | 0.001 | - |
| 6.0824 | 81450 | 0.0009 | - |
| 6.0862 | 81500 | 0.0006 | - |
| 6.0899 | 81550 | 0.0003 | - |
| 6.0936 | 81600 | 0.0004 | - |
| 6.0974 | 81650 | 0.0007 | - |
| 6.1011 | 81700 | 0.0004 | - |
| 6.1048 | 81750 | 0.0005 | - |
| 6.1086 | 81800 | 0.0004 | - |
| 6.1123 | 81850 | 0.0004 | - |
| 6.1160 | 81900 | 0.0001 | - |
| 6.1198 | 81950 | 0.0008 | - |
| 6.1235 | 82000 | 0.0003 | - |
| 6.1272 | 82050 | 0.0002 | - |
| 6.1310 | 82100 | 0.0004 | - |
| 6.1347 | 82150 | 0.0005 | - |
| 6.1385 | 82200 | 0.0003 | - |
| 6.1422 | 82250 | 0.0002 | - |
| 6.1459 | 82300 | 0.0008 | - |
| 6.1497 | 82350 | 0.0001 | - |
| 6.1534 | 82400 | 0.0007 | - |
| 6.1571 | 82450 | 0.0001 | - |
| 6.1609 | 82500 | 0.0013 | - |
| 6.1646 | 82550 | 0.0008 | - |
| 6.1683 | 82600 | 0.0012 | - |
| 6.1721 | 82650 | 0.0002 | - |
| 6.1758 | 82700 | 0.0003 | - |
| 6.1795 | 82750 | 0.0005 | - |
| 6.1833 | 82800 | 0.0002 | - |
| 6.1870 | 82850 | 0.0001 | - |
| 6.1907 | 82900 | 0.0002 | - |
| 6.1945 | 82950 | 0.0004 | - |
| 6.1982 | 83000 | 0.0003 | - |
| 6.2019 | 83050 | 0.0014 | - |
| 6.2057 | 83100 | 0.0008 | - |
| 6.2094 | 83150 | 0.0009 | - |
| 6.2131 | 83200 | 0.0004 | - |
| 6.2169 | 83250 | 0.0012 | - |
| 6.2206 | 83300 | 0.0012 | - |
| 6.2243 | 83350 | 0.0006 | - |
| 6.2281 | 83400 | 0.0011 | - |
| 6.2318 | 83450 | 0.0019 | - |
| 6.2355 | 83500 | 0.001 | - |
| 6.2393 | 83550 | 0.0012 | - |
| 6.2430 | 83600 | 0.001 | - |
| 6.2467 | 83650 | 0.0013 | - |
| 6.2505 | 83700 | 0.0012 | - |
| 6.2542 | 83750 | 0.0007 | - |
| 6.2579 | 83800 | 0.0007 | - |
| 6.2617 | 83850 | 0.0007 | - |
| 6.2654 | 83900 | 0.0004 | - |
| 6.2691 | 83950 | 0.0008 | - |
| 6.2729 | 84000 | 0.0008 | - |
| 6.2766 | 84050 | 0.0005 | - |
| 6.2803 | 84100 | 0.0005 | - |
| 6.2841 | 84150 | 0.0002 | - |
| 6.2878 | 84200 | 0.0004 | - |
| 6.2915 | 84250 | 0.0006 | - |
| 6.2953 | 84300 | 0.0004 | - |
| 6.2990 | 84350 | 0.0014 | - |
| 6.3027 | 84400 | 0.0007 | - |
| 6.3065 | 84450 | 0.0004 | - |
| 6.3102 | 84500 | 0.0002 | - |
| 6.3139 | 84550 | 0.0004 | - |
| 6.3177 | 84600 | 0.0004 | - |
| 6.3214 | 84650 | 0.0006 | - |
| 6.3251 | 84700 | 0.0005 | - |
| 6.3289 | 84750 | 0.0004 | - |
| 6.3326 | 84800 | 0.0013 | - |
| 6.3363 | 84850 | 0.0013 | - |
| 6.3401 | 84900 | 0.001 | - |
| 6.3438 | 84950 | 0.0014 | - |
| 6.3475 | 85000 | 0.0008 | - |
| 6.3513 | 85050 | 0.0005 | - |
| 6.3550 | 85100 | 0.0005 | - |
| 6.3587 | 85150 | 0.0009 | - |
| 6.3625 | 85200 | 0.0007 | - |
| 6.3662 | 85250 | 0.0002 | - |
| 6.3699 | 85300 | 0.0003 | - |
| 6.3737 | 85350 | 0.0002 | - |
| 6.3774 | 85400 | 0.0005 | - |
| 6.3812 | 85450 | 0.0009 | - |
| 6.3849 | 85500 | 0.0005 | - |
| 6.3886 | 85550 | 0.0009 | - |
| 6.3924 | 85600 | 0.0006 | - |
| 6.3961 | 85650 | 0.0003 | - |
| 6.3998 | 85700 | 0.0008 | - |
| 6.4036 | 85750 | 0.0007 | - |
| 6.4073 | 85800 | 0.0007 | - |
| 6.4110 | 85850 | 0.0018 | - |
| 6.4148 | 85900 | 0.0011 | - |
| 6.4185 | 85950 | 0.0009 | - |
| 6.4222 | 86000 | 0.001 | - |
| 6.4260 | 86050 | 0.0006 | - |
| 6.4297 | 86100 | 0.0003 | - |
| 6.4334 | 86150 | 0.0008 | - |
| 6.4372 | 86200 | 0.0006 | - |
| 6.4409 | 86250 | 0.0007 | - |
| 6.4446 | 86300 | 0.0006 | - |
| 6.4484 | 86350 | 0.0003 | - |
| 6.4521 | 86400 | 0.0004 | - |
| 6.4558 | 86450 | 0.0004 | - |
| 6.4596 | 86500 | 0.0006 | - |
| 6.4633 | 86550 | 0.0004 | - |
| 6.4670 | 86600 | 0.0007 | - |
| 6.4708 | 86650 | 0.0007 | - |
| 6.4745 | 86700 | 0.0007 | - |
| 6.4782 | 86750 | 0.0002 | - |
| 6.4820 | 86800 | 0.0005 | - |
| 6.4857 | 86850 | 0.0001 | - |
| 6.4894 | 86900 | 0.0004 | - |
| 6.4932 | 86950 | 0.0011 | - |
| 6.4969 | 87000 | 0.0003 | - |
| 6.5006 | 87050 | 0.0002 | - |
| 6.5044 | 87100 | 0.0002 | - |
| 6.5081 | 87150 | 0.0008 | - |
| 6.5118 | 87200 | 0.0006 | - |
| 6.5156 | 87250 | 0.0005 | - |
| 6.5193 | 87300 | 0.0002 | - |
| 6.5230 | 87350 | 0.0002 | - |
| 6.5268 | 87400 | 0.0006 | - |
| 6.5305 | 87450 | 0.0002 | - |
| 6.5342 | 87500 | 0.0002 | - |
| 6.5380 | 87550 | 0.0002 | - |
| 6.5417 | 87600 | 0.0007 | - |
| 6.5454 | 87650 | 0.0012 | - |
| 6.5492 | 87700 | 0.0017 | - |
| 6.5529 | 87750 | 0.001 | - |
| 6.5566 | 87800 | 0.0011 | - |
| 6.5604 | 87850 | 0.0008 | - |
| 6.5641 | 87900 | 0.0007 | - |
| 6.5678 | 87950 | 0.0014 | - |
| 6.5716 | 88000 | 0.0006 | - |
| 6.5753 | 88050 | 0.001 | - |
| 6.5790 | 88100 | 0.0007 | - |
| 6.5828 | 88150 | 0.0008 | - |
| 6.5865 | 88200 | 0.0005 | - |
| 6.5902 | 88250 | 0.0008 | - |
| 6.5940 | 88300 | 0.0004 | - |
| 6.5977 | 88350 | 0.0003 | - |
| 6.6014 | 88400 | 0.0004 | - |
| 6.6052 | 88450 | 0.0008 | - |
| 6.6089 | 88500 | 0.0013 | - |
| 6.6127 | 88550 | 0.0011 | - |
| 6.6164 | 88600 | 0.0007 | - |
| 6.6201 | 88650 | 0.0009 | - |
| 6.6239 | 88700 | 0.0008 | - |
| 6.6276 | 88750 | 0.0007 | - |
| 6.6313 | 88800 | 0.0004 | - |
| 6.6351 | 88850 | 0.0003 | - |
| 6.6388 | 88900 | 0.0007 | - |
| 6.6425 | 88950 | 0.0007 | - |
| 6.6463 | 89000 | 0.0004 | - |
| 6.6500 | 89050 | 0.0001 | - |
| 6.6537 | 89100 | 0.0008 | - |
| 6.6575 | 89150 | 0.0007 | - |
| 6.6612 | 89200 | 0.0004 | - |
| 6.6649 | 89250 | 0.0003 | - |
| 6.6687 | 89300 | 0.0001 | - |
| 6.6724 | 89350 | 0.0007 | - |
| 6.6761 | 89400 | 0.0007 | - |
| 6.6799 | 89450 | 0.0003 | - |
| 6.6836 | 89500 | 0.0003 | - |
| 6.6873 | 89550 | 0.0006 | - |
| 6.6911 | 89600 | 0.0007 | - |
| 6.6948 | 89650 | 0.0001 | - |
| 6.6985 | 89700 | 0.0003 | - |
| 6.7023 | 89750 | 0.0004 | - |
| 6.7060 | 89800 | 0.0005 | - |
| 6.7097 | 89850 | 0.0003 | - |
| 6.7135 | 89900 | 0.0007 | - |
| 6.7172 | 89950 | 0.0003 | - |
| 6.7209 | 90000 | 0.0002 | - |
| 6.7247 | 90050 | 0.0005 | - |
| 6.7284 | 90100 | 0.0004 | - |
| 6.7321 | 90150 | 0.0002 | - |
| 6.7359 | 90200 | 0.0007 | - |
| 6.7396 | 90250 | 0.0003 | - |
| 6.7433 | 90300 | 0.0011 | - |
| 6.7471 | 90350 | 0.0008 | - |
| 6.7508 | 90400 | 0.0005 | - |
| 6.7545 | 90450 | 0.0003 | - |
| 6.7583 | 90500 | 0.0003 | - |
| 6.7620 | 90550 | 0.0005 | - |
| 6.7657 | 90600 | 0.0005 | - |
| 6.7695 | 90650 | 0.0002 | - |
| 6.7732 | 90700 | 0.0006 | - |
| 6.7769 | 90750 | 0.0007 | - |
| 6.7807 | 90800 | 0.0013 | - |
| 6.7844 | 90850 | 0.0019 | - |
| 6.7881 | 90900 | 0.0009 | - |
| 6.7919 | 90950 | 0.0015 | - |
| 6.7956 | 91000 | 0.0015 | - |
| 6.7993 | 91050 | 0.0007 | - |
| 6.8031 | 91100 | 0.0014 | - |
| 6.8068 | 91150 | 0.0007 | - |
| 6.8105 | 91200 | 0.001 | - |
| 6.8143 | 91250 | 0.001 | - |
| 6.8180 | 91300 | 0.0004 | - |
| 6.8217 | 91350 | 0.0007 | - |
| 6.8255 | 91400 | 0.0009 | - |
| 6.8292 | 91450 | 0.0007 | - |
| 6.8329 | 91500 | 0.0013 | - |
| 6.8367 | 91550 | 0.0007 | - |
| 6.8404 | 91600 | 0.0011 | - |
| 6.8441 | 91650 | 0.0007 | - |
| 6.8479 | 91700 | 0.0004 | - |
| 6.8516 | 91750 | 0.0009 | - |
| 6.8554 | 91800 | 0.0005 | - |
| 6.8591 | 91850 | 0.0005 | - |
| 6.8628 | 91900 | 0.0015 | - |
| 6.8666 | 91950 | 0.0003 | - |
| 6.8703 | 92000 | 0.0005 | - |
| 6.8740 | 92050 | 0.0004 | - |
| 6.8778 | 92100 | 0.0005 | - |
| 6.8815 | 92150 | 0.0006 | - |
| 6.8852 | 92200 | 0.0006 | - |
| 6.8890 | 92250 | 0.0004 | - |
| 6.8927 | 92300 | 0.0006 | - |
| 6.8964 | 92350 | 0.0004 | - |
| 6.9002 | 92400 | 0.0008 | - |
| 6.9039 | 92450 | 0.0003 | - |
| 6.9076 | 92500 | 0.0006 | - |
| 6.9114 | 92550 | 0.0005 | - |
| 6.9151 | 92600 | 0.0003 | - |
| 6.9188 | 92650 | 0.0002 | - |
| 6.9226 | 92700 | 0.001 | - |
| 6.9263 | 92750 | 0.0009 | - |
| 6.9300 | 92800 | 0.0002 | - |
| 6.9338 | 92850 | 0.0004 | - |
| 6.9375 | 92900 | 0.0009 | - |
| 6.9412 | 92950 | 0.0004 | - |
| 6.9450 | 93000 | 0.0004 | - |
| 6.9487 | 93050 | 0.0005 | - |
| 6.9524 | 93100 | 0.0004 | - |
| 6.9562 | 93150 | 0.0005 | - |
| 6.9599 | 93200 | 0.0002 | - |
| 6.9636 | 93250 | 0.0006 | - |
| 6.9674 | 93300 | 0.0005 | - |
| 6.9711 | 93350 | 0.0007 | - |
| 6.9748 | 93400 | 0.0006 | - |
| 6.9786 | 93450 | 0.0007 | - |
| 6.9823 | 93500 | 0.0 | - |
| 6.9860 | 93550 | 0.0003 | - |
| 6.9898 | 93600 | 0.0006 | - |
| 6.9935 | 93650 | 0.0004 | - |
| 6.9972 | 93700 | 0.0005 | - |
| 7.0010 | 93750 | 0.0004 | - |
| 7.0047 | 93800 | 0.0005 | - |
| 7.0084 | 93850 | 0.0007 | - |
| 7.0122 | 93900 | 0.0002 | - |
| 7.0159 | 93950 | 0.0003 | - |
| 7.0196 | 94000 | 0.0005 | - |
| 7.0234 | 94050 | 0.0006 | - |
| 7.0271 | 94100 | 0.0002 | - |
| 7.0308 | 94150 | 0.0004 | - |
| 7.0346 | 94200 | 0.0003 | - |
| 7.0383 | 94250 | 0.001 | - |
| 7.0420 | 94300 | 0.0006 | - |
| 7.0458 | 94350 | 0.0007 | - |
| 7.0495 | 94400 | 0.0011 | - |
| 7.0532 | 94450 | 0.0009 | - |
| 7.0570 | 94500 | 0.0009 | - |
| 7.0607 | 94550 | 0.0004 | - |
| 7.0644 | 94600 | 0.001 | - |
| 7.0682 | 94650 | 0.0005 | - |
| 7.0719 | 94700 | 0.0008 | - |
| 7.0756 | 94750 | 0.0008 | - |
| 7.0794 | 94800 | 0.0004 | - |
| 7.0831 | 94850 | 0.0005 | - |
| 7.0868 | 94900 | 0.0004 | - |
| 7.0906 | 94950 | 0.0004 | - |
| 7.0943 | 95000 | 0.0004 | - |
| 7.0981 | 95050 | 0.0004 | - |
| 7.1018 | 95100 | 0.0007 | - |
| 7.1055 | 95150 | 0.0006 | - |
| 7.1093 | 95200 | 0.0004 | - |
| 7.1130 | 95250 | 0.0007 | - |
| 7.1167 | 95300 | 0.0004 | - |
| 7.1205 | 95350 | 0.0007 | - |
| 7.1242 | 95400 | 0.0001 | - |
| 7.1279 | 95450 | 0.0003 | - |
| 7.1317 | 95500 | 0.0002 | - |
| 7.1354 | 95550 | 0.0009 | - |
| 7.1391 | 95600 | 0.0003 | - |
| 7.1429 | 95650 | 0.001 | - |
| 7.1466 | 95700 | 0.0001 | - |
| 7.1503 | 95750 | 0.0006 | - |
| 7.1541 | 95800 | 0.0001 | - |
| 7.1578 | 95850 | 0.0004 | - |
| 7.1615 | 95900 | 0.0002 | - |
| 7.1653 | 95950 | 0.0009 | - |
| 7.1690 | 96000 | 0.0002 | - |
| 7.1727 | 96050 | 0.0007 | - |
| 7.1765 | 96100 | 0.0005 | - |
| 7.1802 | 96150 | 0.0002 | - |
| 7.1839 | 96200 | 0.0003 | - |
| 7.1877 | 96250 | 0.0005 | - |
| 7.1914 | 96300 | 0.0002 | - |
| 7.1951 | 96350 | 0.0 | - |
| 7.1989 | 96400 | 0.0005 | - |
| 7.2026 | 96450 | 0.0009 | - |
| 7.2063 | 96500 | 0.0002 | - |
| 7.2101 | 96550 | 0.0009 | - |
| 7.2138 | 96600 | 0.0006 | - |
| 7.2175 | 96650 | 0.0009 | - |
| 7.2213 | 96700 | 0.0007 | - |
| 7.2250 | 96750 | 0.0004 | - |
| 7.2287 | 96800 | 0.0003 | - |
| 7.2325 | 96850 | 0.0011 | - |
| 7.2362 | 96900 | 0.0004 | - |
| 7.2399 | 96950 | 0.0006 | - |
| 7.2437 | 97000 | 0.0003 | - |
| 7.2474 | 97050 | 0.0011 | - |
| 7.2511 | 97100 | 0.0006 | - |
| 7.2549 | 97150 | 0.0012 | - |
| 7.2586 | 97200 | 0.0006 | - |
| 7.2623 | 97250 | 0.002 | - |
| 7.2661 | 97300 | 0.0013 | - |
| 7.2698 | 97350 | 0.0009 | - |
| 7.2735 | 97400 | 0.0009 | - |
| 7.2773 | 97450 | 0.0013 | - |
| 7.2810 | 97500 | 0.0007 | - |
| 7.2847 | 97550 | 0.0013 | - |
| 7.2885 | 97600 | 0.0008 | - |
| 7.2922 | 97650 | 0.0012 | - |
| 7.2959 | 97700 | 0.0008 | - |
| 7.2997 | 97750 | 0.0009 | - |
| 7.3034 | 97800 | 0.0006 | - |
| 7.3071 | 97850 | 0.0007 | - |
| 7.3109 | 97900 | 0.0007 | - |
| 7.3146 | 97950 | 0.0012 | - |
| 7.3183 | 98000 | 0.0004 | - |
| 7.3221 | 98050 | 0.0006 | - |
| 7.3258 | 98100 | 0.0009 | - |
| 7.3295 | 98150 | 0.0011 | - |
| 7.3333 | 98200 | 0.0013 | - |
| 7.3370 | 98250 | 0.0014 | - |
| 7.3408 | 98300 | 0.0003 | - |
| 7.3445 | 98350 | 0.0005 | - |
| 7.3482 | 98400 | 0.0012 | - |
| 7.3520 | 98450 | 0.0016 | - |
| 7.3557 | 98500 | 0.0011 | - |
| 7.3594 | 98550 | 0.0015 | - |
| 7.3632 | 98600 | 0.0009 | - |
| 7.3669 | 98650 | 0.0005 | - |
| 7.3706 | 98700 | 0.0008 | - |
| 7.3744 | 98750 | 0.0005 | - |
| 7.3781 | 98800 | 0.001 | - |
| 7.3818 | 98850 | 0.0005 | - |
| 7.3856 | 98900 | 0.0002 | - |
| 7.3893 | 98950 | 0.0013 | - |
| 7.3930 | 99000 | 0.0011 | - |
| 7.3968 | 99050 | 0.0008 | - |
| 7.4005 | 99100 | 0.0009 | - |
| 7.4042 | 99150 | 0.001 | - |
| 7.4080 | 99200 | 0.0007 | - |
| 7.4117 | 99250 | 0.0006 | - |
| 7.4154 | 99300 | 0.0009 | - |
| 7.4192 | 99350 | 0.0007 | - |
| 7.4229 | 99400 | 0.0003 | - |
| 7.4266 | 99450 | 0.0004 | - |
| 7.4304 | 99500 | 0.0008 | - |
| 7.4341 | 99550 | 0.0008 | - |
| 7.4378 | 99600 | 0.0002 | - |
| 7.4416 | 99650 | 0.0009 | - |
| 7.4453 | 99700 | 0.0004 | - |
| 7.4490 | 99750 | 0.0011 | - |
| 7.4528 | 99800 | 0.0007 | - |
| 7.4565 | 99850 | 0.0008 | - |
| 7.4602 | 99900 | 0.0006 | - |
| 7.4640 | 99950 | 0.0004 | - |
| 7.4677 | 100000 | 0.0004 | - |
| 7.4714 | 100050 | 0.0005 | - |
| 7.4752 | 100100 | 0.0004 | - |
| 7.4789 | 100150 | 0.0004 | - |
| 7.4826 | 100200 | 0.0005 | - |
| 7.4864 | 100250 | 0.0007 | - |
| 7.4901 | 100300 | 0.0001 | - |
| 7.4938 | 100350 | 0.0004 | - |
| 7.4976 | 100400 | 0.0006 | - |
| 7.5013 | 100450 | 0.0005 | - |
| 7.5050 | 100500 | 0.0004 | - |
| 7.5088 | 100550 | 0.0004 | - |
| 7.5125 | 100600 | 0.0002 | - |
| 7.5162 | 100650 | 0.0005 | - |
| 7.5200 | 100700 | 0.0001 | - |
| 7.5237 | 100750 | 0.0002 | - |
| 7.5274 | 100800 | 0.0002 | - |
| 7.5312 | 100850 | 0.0005 | - |
| 7.5349 | 100900 | 0.0002 | - |
| 7.5386 | 100950 | 0.0004 | - |
| 7.5424 | 101000 | 0.0005 | - |
| 7.5461 | 101050 | 0.0009 | - |
| 7.5498 | 101100 | 0.0002 | - |
| 7.5536 | 101150 | 0.0003 | - |
| 7.5573 | 101200 | 0.0003 | - |
| 7.5610 | 101250 | 0.0006 | - |
| 7.5648 | 101300 | 0.0007 | - |
| 7.5685 | 101350 | 0.0002 | - |
| 7.5723 | 101400 | 0.0005 | - |
| 7.5760 | 101450 | 0.0004 | - |
| 7.5797 | 101500 | 0.0007 | - |
| 7.5835 | 101550 | 0.0003 | - |
| 7.5872 | 101600 | 0.0005 | - |
| 7.5909 | 101650 | 0.0005 | - |
| 7.5947 | 101700 | 0.0004 | - |
| 7.5984 | 101750 | 0.0003 | - |
| 7.6021 | 101800 | 0.0005 | - |
| 7.6059 | 101850 | 0.0005 | - |
| 7.6096 | 101900 | 0.0003 | - |
| 7.6133 | 101950 | 0.0004 | - |
| 7.6171 | 102000 | 0.0003 | - |
| 7.6208 | 102050 | 0.0004 | - |
| 7.6245 | 102100 | 0.0002 | - |
| 7.6283 | 102150 | 0.0 | - |
| 7.6320 | 102200 | 0.0001 | - |
| 7.6357 | 102250 | 0.0002 | - |
| 7.6395 | 102300 | 0.0001 | - |
| 7.6432 | 102350 | 0.0001 | - |
| 7.6469 | 102400 | 0.0001 | - |
| 7.6507 | 102450 | 0.0002 | - |
| 7.6544 | 102500 | 0.0005 | - |
| 7.6581 | 102550 | 0.0008 | - |
| 7.6619 | 102600 | 0.0007 | - |
| 7.6656 | 102650 | 0.0003 | - |
| 7.6693 | 102700 | 0.0004 | - |
| 7.6731 | 102750 | 0.0002 | - |
| 7.6768 | 102800 | 0.0007 | - |
| 7.6805 | 102850 | 0.0002 | - |
| 7.6843 | 102900 | 0.0004 | - |
| 7.6880 | 102950 | 0.0003 | - |
| 7.6917 | 103000 | 0.0009 | - |
| 7.6955 | 103050 | 0.0015 | - |
| 7.6992 | 103100 | 0.0011 | - |
| 7.7029 | 103150 | 0.001 | - |
| 7.7067 | 103200 | 0.0008 | - |
| 7.7104 | 103250 | 0.0003 | - |
| 7.7141 | 103300 | 0.0005 | - |
| 7.7179 | 103350 | 0.001 | - |
| 7.7216 | 103400 | 0.0011 | - |
| 7.7253 | 103450 | 0.0008 | - |
| 7.7291 | 103500 | 0.0007 | - |
| 7.7328 | 103550 | 0.0007 | - |
| 7.7365 | 103600 | 0.0007 | - |
| 7.7403 | 103650 | 0.0005 | - |
| 7.7440 | 103700 | 0.0004 | - |
| 7.7477 | 103750 | 0.0009 | - |
| 7.7515 | 103800 | 0.0004 | - |
| 7.7552 | 103850 | 0.0006 | - |
| 7.7589 | 103900 | 0.0005 | - |
| 7.7627 | 103950 | 0.001 | - |
| 7.7664 | 104000 | 0.0003 | - |
| 7.7701 | 104050 | 0.0004 | - |
| 7.7739 | 104100 | 0.0007 | - |
| 7.7776 | 104150 | 0.0008 | - |
| 7.7813 | 104200 | 0.0005 | - |
| 7.7851 | 104250 | 0.0004 | - |
| 7.7888 | 104300 | 0.0009 | - |
| 7.7925 | 104350 | 0.0005 | - |
| 7.7963 | 104400 | 0.0004 | - |
| 7.8000 | 104450 | 0.001 | - |
| 7.8037 | 104500 | 0.0002 | - |
| 7.8075 | 104550 | 0.0009 | - |
| 7.8112 | 104600 | 0.0004 | - |
| 7.8150 | 104650 | 0.0007 | - |
| 7.8187 | 104700 | 0.0004 | - |
| 7.8224 | 104750 | 0.0007 | - |
| 7.8262 | 104800 | 0.0004 | - |
| 7.8299 | 104850 | 0.0004 | - |
| 7.8336 | 104900 | 0.0001 | - |
| 7.8374 | 104950 | 0.0006 | - |
| 7.8411 | 105000 | 0.0002 | - |
| 7.8448 | 105050 | 0.0009 | - |
| 7.8486 | 105100 | 0.0004 | - |
| 7.8523 | 105150 | 0.0005 | - |
| 7.8560 | 105200 | 0.0004 | - |
| 7.8598 | 105250 | 0.0004 | - |
| 7.8635 | 105300 | 0.0008 | - |
| 7.8672 | 105350 | 0.0005 | - |
| 7.8710 | 105400 | 0.0009 | - |
| 7.8747 | 105450 | 0.0008 | - |
| 7.8784 | 105500 | 0.0001 | - |
| 7.8822 | 105550 | 0.0004 | - |
| 7.8859 | 105600 | 0.0006 | - |
| 7.8896 | 105650 | 0.0006 | - |
| 7.8934 | 105700 | 0.0004 | - |
| 7.8971 | 105750 | 0.0006 | - |
| 7.9008 | 105800 | 0.0005 | - |
| 7.9046 | 105850 | 0.0013 | - |
| 7.9083 | 105900 | 0.0027 | - |
| 7.9120 | 105950 | 0.0026 | - |
| 7.9158 | 106000 | 0.0026 | - |
| 7.9195 | 106050 | 0.0024 | - |
| 7.9232 | 106100 | 0.0017 | - |
| 7.9270 | 106150 | 0.0013 | - |
| 7.9307 | 106200 | 0.0019 | - |
| 7.9344 | 106250 | 0.0008 | - |
| 7.9382 | 106300 | 0.0016 | - |
| 7.9419 | 106350 | 0.0005 | - |
| 7.9456 | 106400 | 0.0009 | - |
| 7.9494 | 106450 | 0.0023 | - |
| 7.9531 | 106500 | 0.0021 | - |
| 7.9568 | 106550 | 0.0009 | - |
| 7.9606 | 106600 | 0.0005 | - |
| 7.9643 | 106650 | 0.0009 | - |
| 7.9680 | 106700 | 0.0009 | - |
| 7.9718 | 106750 | 0.0008 | - |
| 7.9755 | 106800 | 0.0006 | - |
| 7.9792 | 106850 | 0.0002 | - |
| 7.9830 | 106900 | 0.0004 | - |
| 7.9867 | 106950 | 0.0006 | - |
| 7.9904 | 107000 | 0.0005 | - |
| 7.9942 | 107050 | 0.0011 | - |
| 7.9979 | 107100 | 0.0005 | - |
| 8.0016 | 107150 | 0.0006 | - |
| 8.0054 | 107200 | 0.0003 | - |
| 8.0091 | 107250 | 0.0007 | - |
| 8.0128 | 107300 | 0.0007 | - |
| 8.0166 | 107350 | 0.0005 | - |
| 8.0203 | 107400 | 0.0005 | - |
| 8.0240 | 107450 | 0.0003 | - |
| 8.0278 | 107500 | 0.0004 | - |
| 8.0315 | 107550 | 0.0002 | - |
| 8.0352 | 107600 | 0.0002 | - |
| 8.0390 | 107650 | 0.0004 | - |
| 8.0427 | 107700 | 0.0001 | - |
| 8.0464 | 107750 | 0.0005 | - |
| 8.0502 | 107800 | 0.0004 | - |
| 8.0539 | 107850 | 0.0008 | - |
| 8.0577 | 107900 | 0.0005 | - |
| 8.0614 | 107950 | 0.0005 | - |
| 8.0651 | 108000 | 0.0004 | - |
| 8.0689 | 108050 | 0.0007 | - |
| 8.0726 | 108100 | 0.0004 | - |
| 8.0763 | 108150 | 0.0005 | - |
| 8.0801 | 108200 | 0.0007 | - |
| 8.0838 | 108250 | 0.0003 | - |
| 8.0875 | 108300 | 0.0004 | - |
| 8.0913 | 108350 | 0.0004 | - |
| 8.0950 | 108400 | 0.0006 | - |
| 8.0987 | 108450 | 0.0002 | - |
| 8.1025 | 108500 | 0.0001 | - |
| 8.1062 | 108550 | 0.0003 | - |
| 8.1099 | 108600 | 0.0004 | - |
| 8.1137 | 108650 | 0.0008 | - |
| 8.1174 | 108700 | 0.0008 | - |
| 8.1211 | 108750 | 0.0005 | - |
| 8.1249 | 108800 | 0.0004 | - |
| 8.1286 | 108850 | 0.001 | - |
| 8.1323 | 108900 | 0.0004 | - |
| 8.1361 | 108950 | 0.0005 | - |
| 8.1398 | 109000 | 0.0006 | - |
| 8.1435 | 109050 | 0.0007 | - |
| 8.1473 | 109100 | 0.0004 | - |
| 8.1510 | 109150 | 0.0009 | - |
| 8.1547 | 109200 | 0.0007 | - |
| 8.1585 | 109250 | 0.0011 | - |
| 8.1622 | 109300 | 0.0003 | - |
| 8.1659 | 109350 | 0.0002 | - |
| 8.1697 | 109400 | 0.0005 | - |
| 8.1734 | 109450 | 0.0011 | - |
| 8.1771 | 109500 | 0.0015 | - |
| 8.1809 | 109550 | 0.0014 | - |
| 8.1846 | 109600 | 0.0008 | - |
| 8.1883 | 109650 | 0.0005 | - |
| 8.1921 | 109700 | 0.0005 | - |
| 8.1958 | 109750 | 0.0007 | - |
| 8.1995 | 109800 | 0.0007 | - |
| 8.2033 | 109850 | 0.0008 | - |
| 8.2070 | 109900 | 0.0003 | - |
| 8.2107 | 109950 | 0.0005 | - |
| 8.2145 | 110000 | 0.0004 | - |
| 8.2182 | 110050 | 0.0001 | - |
| 8.2219 | 110100 | 0.0006 | - |
| 8.2257 | 110150 | 0.0006 | - |
| 8.2294 | 110200 | 0.0001 | - |
| 8.2331 | 110250 | 0.0008 | - |
| 8.2369 | 110300 | 0.0005 | - |
| 8.2406 | 110350 | 0.0005 | - |
| 8.2443 | 110400 | 0.0002 | - |
| 8.2481 | 110450 | 0.0005 | - |
| 8.2518 | 110500 | 0.0004 | - |
| 8.2555 | 110550 | 0.0003 | - |
| 8.2593 | 110600 | 0.0005 | - |
| 8.2630 | 110650 | 0.0002 | - |
| 8.2667 | 110700 | 0.0004 | - |
| 8.2705 | 110750 | 0.0004 | - |
| 8.2742 | 110800 | 0.0002 | - |
| 8.2779 | 110850 | 0.0002 | - |
| 8.2817 | 110900 | 0.0005 | - |
| 8.2854 | 110950 | 0.0004 | - |
| 8.2891 | 111000 | 0.0007 | - |
| 8.2929 | 111050 | 0.0007 | - |
| 8.2966 | 111100 | 0.0003 | - |
| 8.3004 | 111150 | 0.0004 | - |
| 8.3041 | 111200 | 0.0008 | - |
| 8.3078 | 111250 | 0.0003 | - |
| 8.3116 | 111300 | 0.0002 | - |
| 8.3153 | 111350 | 0.0002 | - |
| 8.3190 | 111400 | 0.0 | - |
| 8.3228 | 111450 | 0.0006 | - |
| 8.3265 | 111500 | 0.0004 | - |
| 8.3302 | 111550 | 0.0006 | - |
| 8.3340 | 111600 | 0.0005 | - |
| 8.3377 | 111650 | 0.0007 | - |
| 8.3414 | 111700 | 0.0006 | - |
| 8.3452 | 111750 | 0.0005 | - |
| 8.3489 | 111800 | 0.002 | - |
| 8.3526 | 111850 | 0.0021 | - |
| 8.3564 | 111900 | 0.0009 | - |
| 8.3601 | 111950 | 0.0005 | - |
| 8.3638 | 112000 | 0.0005 | - |
| 8.3676 | 112050 | 0.0005 | - |
| 8.3713 | 112100 | 0.001 | - |
| 8.3750 | 112150 | 0.0006 | - |
| 8.3788 | 112200 | 0.0008 | - |
| 8.3825 | 112250 | 0.0003 | - |
| 8.3862 | 112300 | 0.0009 | - |
| 8.3900 | 112350 | 0.0008 | - |
| 8.3937 | 112400 | 0.0004 | - |
| 8.3974 | 112450 | 0.0004 | - |
| 8.4012 | 112500 | 0.0003 | - |
| 8.4049 | 112550 | 0.0004 | - |
| 8.4086 | 112600 | 0.0006 | - |
| 8.4124 | 112650 | 0.0004 | - |
| 8.4161 | 112700 | 0.0009 | - |
| 8.4198 | 112750 | 0.0003 | - |
| 8.4236 | 112800 | 0.0003 | - |
| 8.4273 | 112850 | 0.0006 | - |
| 8.4310 | 112900 | 0.0005 | - |
| 8.4348 | 112950 | 0.0004 | - |
| 8.4385 | 113000 | 0.0003 | - |
| 8.4422 | 113050 | 0.0001 | - |
| 8.4460 | 113100 | 0.0002 | - |
| 8.4497 | 113150 | 0.0004 | - |
| 8.4534 | 113200 | 0.0002 | - |
| 8.4572 | 113250 | 0.0005 | - |
| 8.4609 | 113300 | 0.0003 | - |
| 8.4646 | 113350 | 0.0006 | - |
| 8.4684 | 113400 | 0.0002 | - |
| 8.4721 | 113450 | 0.0005 | - |
| 8.4758 | 113500 | 0.0006 | - |
| 8.4796 | 113550 | 0.0004 | - |
| 8.4833 | 113600 | 0.0001 | - |
| 8.4870 | 113650 | 0.0002 | - |
| 8.4908 | 113700 | 0.0008 | - |
| 8.4945 | 113750 | 0.0002 | - |
| 8.4982 | 113800 | 0.0009 | - |
| 8.5020 | 113850 | 0.0005 | - |
| 8.5057 | 113900 | 0.0004 | - |
| 8.5094 | 113950 | 0.0002 | - |
| 8.5132 | 114000 | 0.0002 | - |
| 8.5169 | 114050 | 0.0005 | - |
| 8.5206 | 114100 | 0.0006 | - |
| 8.5244 | 114150 | 0.0007 | - |
| 8.5281 | 114200 | 0.0004 | - |
| 8.5318 | 114250 | 0.0001 | - |
| 8.5356 | 114300 | 0.0004 | - |
| 8.5393 | 114350 | 0.0004 | - |
| 8.5431 | 114400 | 0.0002 | - |
| 8.5468 | 114450 | 0.0004 | - |
| 8.5505 | 114500 | 0.0002 | - |
| 8.5543 | 114550 | 0.0005 | - |
| 8.5580 | 114600 | 0.0 | - |
| 8.5617 | 114650 | 0.0002 | - |
| 8.5655 | 114700 | 0.0004 | - |
| 8.5692 | 114750 | 0.0001 | - |
| 8.5729 | 114800 | 0.0004 | - |
| 8.5767 | 114850 | 0.0002 | - |
| 8.5804 | 114900 | 0.0003 | - |
| 8.5841 | 114950 | 0.0004 | - |
| 8.5879 | 115000 | 0.0002 | - |
| 8.5916 | 115050 | 0.0002 | - |
| 8.5953 | 115100 | 0.0003 | - |
| 8.5991 | 115150 | 0.0 | - |
| 8.6028 | 115200 | 0.0002 | - |
| 8.6065 | 115250 | 0.0005 | - |
| 8.6103 | 115300 | 0.0002 | - |
| 8.6140 | 115350 | 0.0001 | - |
| 8.6177 | 115400 | 0.0002 | - |
| 8.6215 | 115450 | 0.0009 | - |
| 8.6252 | 115500 | 0.0001 | - |
| 8.6289 | 115550 | 0.0005 | - |
| 8.6327 | 115600 | 0.0004 | - |
| 8.6364 | 115650 | 0.0005 | - |
| 8.6401 | 115700 | 0.0004 | - |
| 8.6439 | 115750 | 0.0004 | - |
| 8.6476 | 115800 | 0.0001 | - |
| 8.6513 | 115850 | 0.0002 | - |
| 8.6551 | 115900 | 0.0002 | - |
| 8.6588 | 115950 | 0.0002 | - |
| 8.6625 | 116000 | 0.0007 | - |
| 8.6663 | 116050 | 0.0008 | - |
| 8.6700 | 116100 | 0.0008 | - |
| 8.6737 | 116150 | 0.0008 | - |
| 8.6775 | 116200 | 0.0011 | - |
| 8.6812 | 116250 | 0.0019 | - |
| 8.6849 | 116300 | 0.0009 | - |
| 8.6887 | 116350 | 0.0009 | - |
| 8.6924 | 116400 | 0.0007 | - |
| 8.6961 | 116450 | 0.0008 | - |
| 8.6999 | 116500 | 0.0009 | - |
| 8.7036 | 116550 | 0.0011 | - |
| 8.7073 | 116600 | 0.0012 | - |
| 8.7111 | 116650 | 0.0009 | - |
| 8.7148 | 116700 | 0.0006 | - |
| 8.7185 | 116750 | 0.0003 | - |
| 8.7223 | 116800 | 0.0006 | - |
| 8.7260 | 116850 | 0.0006 | - |
| 8.7297 | 116900 | 0.0004 | - |
| 8.7335 | 116950 | 0.0006 | - |
| 8.7372 | 117000 | 0.0002 | - |
| 8.7409 | 117050 | 0.0004 | - |
| 8.7447 | 117100 | 0.0008 | - |
| 8.7484 | 117150 | 0.0003 | - |
| 8.7521 | 117200 | 0.0007 | - |
| 8.7559 | 117250 | 0.0002 | - |
| 8.7596 | 117300 | 0.0003 | - |
| 8.7633 | 117350 | 0.0001 | - |
| 8.7671 | 117400 | 0.0004 | - |
| 8.7708 | 117450 | 0.0004 | - |
| 8.7746 | 117500 | 0.0003 | - |
| 8.7783 | 117550 | 0.0003 | - |
| 8.7820 | 117600 | 0.0005 | - |
| 8.7858 | 117650 | 0.0003 | - |
| 8.7895 | 117700 | 0.0006 | - |
| 8.7932 | 117750 | 0.0005 | - |
| 8.7970 | 117800 | 0.0003 | - |
| 8.8007 | 117850 | 0.0002 | - |
| 8.8044 | 117900 | 0.0004 | - |
| 8.8082 | 117950 | 0.0006 | - |
| 8.8119 | 118000 | 0.0006 | - |
| 8.8156 | 118050 | 0.0003 | - |
| 8.8194 | 118100 | 0.0004 | - |
| 8.8231 | 118150 | 0.001 | - |
| 8.8268 | 118200 | 0.0005 | - |
| 8.8306 | 118250 | 0.001 | - |
| 8.8343 | 118300 | 0.0005 | - |
| 8.8380 | 118350 | 0.001 | - |
| 8.8418 | 118400 | 0.0002 | - |
| 8.8455 | 118450 | 0.0003 | - |
| 8.8492 | 118500 | 0.0003 | - |
| 8.8530 | 118550 | 0.0003 | - |
| 8.8567 | 118600 | 0.0003 | - |
| 8.8604 | 118650 | 0.0003 | - |
| 8.8642 | 118700 | 0.0002 | - |
| 8.8679 | 118750 | 0.0003 | - |
| 8.8716 | 118800 | 0.0008 | - |
| 8.8754 | 118850 | 0.0006 | - |
| 8.8791 | 118900 | 0.0004 | - |
| 8.8828 | 118950 | 0.0005 | - |
| 8.8866 | 119000 | 0.0002 | - |
| 8.8903 | 119050 | 0.0005 | - |
| 8.8940 | 119100 | 0.0003 | - |
| 8.8978 | 119150 | 0.0008 | - |
| 8.9015 | 119200 | 0.0004 | - |
| 8.9052 | 119250 | 0.0007 | - |
| 8.9090 | 119300 | 0.0008 | - |
| 8.9127 | 119350 | 0.0004 | - |
| 8.9164 | 119400 | 0.0003 | - |
| 8.9202 | 119450 | 0.0003 | - |
| 8.9239 | 119500 | 0.0003 | - |
| 8.9276 | 119550 | 0.0011 | - |
| 8.9314 | 119600 | 0.0002 | - |
| 8.9351 | 119650 | 0.0003 | - |
| 8.9388 | 119700 | 0.0002 | - |
| 8.9426 | 119750 | 0.0007 | - |
| 8.9463 | 119800 | 0.0002 | - |
| 8.9500 | 119850 | 0.0004 | - |
| 8.9538 | 119900 | 0.0003 | - |
| 8.9575 | 119950 | 0.0008 | - |
| 8.9612 | 120000 | 0.0003 | - |
| 8.9650 | 120050 | 0.0008 | - |
| 8.9687 | 120100 | 0.0001 | - |
| 8.9724 | 120150 | 0.0001 | - |
| 8.9762 | 120200 | 0.0005 | - |
| 8.9799 | 120250 | 0.0005 | - |
| 8.9836 | 120300 | 0.0003 | - |
| 8.9874 | 120350 | 0.0008 | - |
| 8.9911 | 120400 | 0.0002 | - |
| 8.9948 | 120450 | 0.0002 | - |
| 8.9986 | 120500 | 0.0004 | - |
| 9.0023 | 120550 | 0.0002 | - |
| 9.0060 | 120600 | 0.0003 | - |
| 9.0098 | 120650 | 0.0005 | - |
| 9.0135 | 120700 | 0.0004 | - |
| 9.0173 | 120750 | 0.0002 | - |
| 9.0210 | 120800 | 0.0002 | - |
| 9.0247 | 120850 | 0.0009 | - |
| 9.0285 | 120900 | 0.0005 | - |
| 9.0322 | 120950 | 0.0004 | - |
| 9.0359 | 121000 | 0.0001 | - |
| 9.0397 | 121050 | 0.0001 | - |
| 9.0434 | 121100 | 0.0003 | - |
| 9.0471 | 121150 | 0.0007 | - |
| 9.0509 | 121200 | 0.0006 | - |
| 9.0546 | 121250 | 0.0002 | - |
| 9.0583 | 121300 | 0.0002 | - |
| 9.0621 | 121350 | 0.0002 | - |
| 9.0658 | 121400 | 0.0004 | - |
| 9.0695 | 121450 | 0.0001 | - |
| 9.0733 | 121500 | 0.0004 | - |
| 9.0770 | 121550 | 0.0004 | - |
| 9.0807 | 121600 | 0.0001 | - |
| 9.0845 | 121650 | 0.0002 | - |
| 9.0882 | 121700 | 0.0004 | - |
| 9.0919 | 121750 | 0.0001 | - |
| 9.0957 | 121800 | 0.0003 | - |
| 9.0994 | 121850 | 0.0003 | - |
| 9.1031 | 121900 | 0.0004 | - |
| 9.1069 | 121950 | 0.0004 | - |
| 9.1106 | 122000 | 0.0005 | - |
| 9.1143 | 122050 | 0.0005 | - |
| 9.1181 | 122100 | 0.0008 | - |
| 9.1218 | 122150 | 0.0007 | - |
| 9.1255 | 122200 | 0.0003 | - |
| 9.1293 | 122250 | 0.0003 | - |
| 9.1330 | 122300 | 0.0005 | - |
| 9.1367 | 122350 | 0.0004 | - |
| 9.1405 | 122400 | 0.0002 | - |
| 9.1442 | 122450 | 0.0003 | - |
| 9.1479 | 122500 | 0.0001 | - |
| 9.1517 | 122550 | 0.0004 | - |
| 9.1554 | 122600 | 0.0001 | - |
| 9.1591 | 122650 | 0.0002 | - |
| 9.1629 | 122700 | 0.0008 | - |
| 9.1666 | 122750 | 0.0002 | - |
| 9.1703 | 122800 | 0.0002 | - |
| 9.1741 | 122850 | 0.0005 | - |
| 9.1778 | 122900 | 0.0002 | - |
| 9.1815 | 122950 | 0.0005 | - |
| 9.1853 | 123000 | 0.0007 | - |
| 9.1890 | 123050 | 0.0002 | - |
| 9.1927 | 123100 | 0.0005 | - |
| 9.1965 | 123150 | 0.0004 | - |
| 9.2002 | 123200 | 0.0004 | - |
| 9.2039 | 123250 | 0.0006 | - |
| 9.2077 | 123300 | 0.0005 | - |
| 9.2114 | 123350 | 0.0003 | - |
| 9.2151 | 123400 | 0.0007 | - |
| 9.2189 | 123450 | 0.0005 | - |
| 9.2226 | 123500 | 0.0004 | - |
| 9.2263 | 123550 | 0.0006 | - |
| 9.2301 | 123600 | 0.0004 | - |
| 9.2338 | 123650 | 0.0005 | - |
| 9.2375 | 123700 | 0.0004 | - |
| 9.2413 | 123750 | 0.0005 | - |
| 9.2450 | 123800 | 0.0005 | - |
| 9.2487 | 123850 | 0.0002 | - |
| 9.2525 | 123900 | 0.0013 | - |
| 9.2562 | 123950 | 0.0006 | - |
| 9.2600 | 124000 | 0.0005 | - |
| 9.2637 | 124050 | 0.001 | - |
| 9.2674 | 124100 | 0.0005 | - |
| 9.2712 | 124150 | 0.0009 | - |
| 9.2749 | 124200 | 0.0004 | - |
| 9.2786 | 124250 | 0.001 | - |
| 9.2824 | 124300 | 0.0008 | - |
| 9.2861 | 124350 | 0.0009 | - |
| 9.2898 | 124400 | 0.0008 | - |
| 9.2936 | 124450 | 0.0009 | - |
| 9.2973 | 124500 | 0.0002 | - |
| 9.3010 | 124550 | 0.0005 | - |
| 9.3048 | 124600 | 0.0011 | - |
| 9.3085 | 124650 | 0.0004 | - |
| 9.3122 | 124700 | 0.0005 | - |
| 9.3160 | 124750 | 0.0007 | - |
| 9.3197 | 124800 | 0.0008 | - |
| 9.3234 | 124850 | 0.0005 | - |
| 9.3272 | 124900 | 0.0007 | - |
| 9.3309 | 124950 | 0.0006 | - |
| 9.3346 | 125000 | 0.0005 | - |
| 9.3384 | 125050 | 0.0003 | - |
| 9.3421 | 125100 | 0.0002 | - |
| 9.3458 | 125150 | 0.0004 | - |
| 9.3496 | 125200 | 0.0006 | - |
| 9.3533 | 125250 | 0.0005 | - |
| 9.3570 | 125300 | 0.0004 | - |
| 9.3608 | 125350 | 0.0006 | - |
| 9.3645 | 125400 | 0.0004 | - |
| 9.3682 | 125450 | 0.0002 | - |
| 9.3720 | 125500 | 0.0 | - |
| 9.3757 | 125550 | 0.0002 | - |
| 9.3794 | 125600 | 0.0001 | - |
| 9.3832 | 125650 | 0.0002 | - |
| 9.3869 | 125700 | 0.0005 | - |
| 9.3906 | 125750 | 0.0005 | - |
| 9.3944 | 125800 | 0.0008 | - |
| 9.3981 | 125850 | 0.0004 | - |
| 9.4018 | 125900 | 0.0006 | - |
| 9.4056 | 125950 | 0.0009 | - |
| 9.4093 | 126000 | 0.0007 | - |
| 9.4130 | 126050 | 0.0007 | - |
| 9.4168 | 126100 | 0.0005 | - |
| 9.4205 | 126150 | 0.0005 | - |
| 9.4242 | 126200 | 0.0004 | - |
| 9.4280 | 126250 | 0.0003 | - |
| 9.4317 | 126300 | 0.0006 | - |
| 9.4354 | 126350 | 0.0003 | - |
| 9.4392 | 126400 | 0.0005 | - |
| 9.4429 | 126450 | 0.0002 | - |
| 9.4466 | 126500 | 0.0005 | - |
| 9.4504 | 126550 | 0.0005 | - |
| 9.4541 | 126600 | 0.0002 | - |
| 9.4578 | 126650 | 0.0004 | - |
| 9.4616 | 126700 | 0.0001 | - |
| 9.4653 | 126750 | 0.0001 | - |
| 9.4690 | 126800 | 0.0 | - |
| 9.4728 | 126850 | 0.001 | - |
| 9.4765 | 126900 | 0.0009 | - |
| 9.4802 | 126950 | 0.0004 | - |
| 9.4840 | 127000 | 0.0001 | - |
| 9.4877 | 127050 | 0.0002 | - |
| 9.4914 | 127100 | 0.0002 | - |
| 9.4952 | 127150 | 0.0005 | - |
| 9.4989 | 127200 | 0.0004 | - |
| 9.5027 | 127250 | 0.0001 | - |
| 9.5064 | 127300 | 0.0012 | - |
| 9.5101 | 127350 | 0.0004 | - |
| 9.5139 | 127400 | 0.0001 | - |
| 9.5176 | 127450 | 0.0004 | - |
| 9.5213 | 127500 | 0.0005 | - |
| 9.5251 | 127550 | 0.0005 | - |
| 9.5288 | 127600 | 0.0005 | - |
| 9.5325 | 127650 | 0.0003 | - |
| 9.5363 | 127700 | 0.0007 | - |
| 9.5400 | 127750 | 0.0004 | - |
| 9.5437 | 127800 | 0.0006 | - |
| 9.5475 | 127850 | 0.0003 | - |
| 9.5512 | 127900 | 0.0003 | - |
| 9.5549 | 127950 | 0.0001 | - |
| 9.5587 | 128000 | 0.0004 | - |
| 9.5624 | 128050 | 0.0003 | - |
| 9.5661 | 128100 | 0.0002 | - |
| 9.5699 | 128150 | 0.0003 | - |
| 9.5736 | 128200 | 0.0004 | - |
| 9.5773 | 128250 | 0.0001 | - |
| 9.5811 | 128300 | 0.0012 | - |
| 9.5848 | 128350 | 0.0006 | - |
| 9.5885 | 128400 | 0.0003 | - |
| 9.5923 | 128450 | 0.0008 | - |
| 9.5960 | 128500 | 0.0004 | - |
| 9.5997 | 128550 | 0.0014 | - |
| 9.6035 | 128600 | 0.0011 | - |
| 9.6072 | 128650 | 0.0011 | - |
| 9.6109 | 128700 | 0.0011 | - |
| 9.6147 | 128750 | 0.0011 | - |
| 9.6184 | 128800 | 0.001 | - |
| 9.6221 | 128850 | 0.0006 | - |
| 9.6259 | 128900 | 0.0004 | - |
| 9.6296 | 128950 | 0.0007 | - |
| 9.6333 | 129000 | 0.0007 | - |
| 9.6371 | 129050 | 0.0011 | - |
| 9.6408 | 129100 | 0.0006 | - |
| 9.6445 | 129150 | 0.0005 | - |
| 9.6483 | 129200 | 0.0005 | - |
| 9.6520 | 129250 | 0.001 | - |
| 9.6557 | 129300 | 0.0002 | - |
| 9.6595 | 129350 | 0.0003 | - |
| 9.6632 | 129400 | 0.0007 | - |
| 9.6669 | 129450 | 0.0004 | - |
| 9.6707 | 129500 | 0.0009 | - |
| 9.6744 | 129550 | 0.0004 | - |
| 9.6781 | 129600 | 0.0007 | - |
| 9.6819 | 129650 | 0.0007 | - |
| 9.6856 | 129700 | 0.0003 | - |
| 9.6893 | 129750 | 0.0007 | - |
| 9.6931 | 129800 | 0.0002 | - |
| 9.6968 | 129850 | 0.0003 | - |
| 9.7005 | 129900 | 0.0008 | - |
| 9.7043 | 129950 | 0.0009 | - |
| 9.7080 | 130000 | 0.0005 | - |
| 9.7117 | 130050 | 0.0002 | - |
| 9.7155 | 130100 | 0.0007 | - |
| 9.7192 | 130150 | 0.0009 | - |
| 9.7229 | 130200 | 0.0001 | - |
| 9.7267 | 130250 | 0.0002 | - |
| 9.7304 | 130300 | 0.0004 | - |
| 9.7341 | 130350 | 0.0002 | - |
| 9.7379 | 130400 | 0.0005 | - |
| 9.7416 | 130450 | 0.0003 | - |
| 9.7454 | 130500 | 0.0007 | - |
| 9.7491 | 130550 | 0.0004 | - |
| 9.7528 | 130600 | 0.0 | - |
| 9.7566 | 130650 | 0.0007 | - |
| 9.7603 | 130700 | 0.0002 | - |
| 9.7640 | 130750 | 0.0007 | - |
| 9.7678 | 130800 | 0.0007 | - |
| 9.7715 | 130850 | 0.0004 | - |
| 9.7752 | 130900 | 0.0003 | - |
| 9.7790 | 130950 | 0.0004 | - |
| 9.7827 | 131000 | 0.0002 | - |
| 9.7864 | 131050 | 0.0002 | - |
| 9.7902 | 131100 | 0.0002 | - |
| 9.7939 | 131150 | 0.0001 | - |
| 9.7976 | 131200 | 0.0002 | - |
| 9.8014 | 131250 | 0.0002 | - |
| 9.8051 | 131300 | 0.0003 | - |
| 9.8088 | 131350 | 0.0007 | - |
| 9.8126 | 131400 | 0.0004 | - |
| 9.8163 | 131450 | 0.0003 | - |
| 9.8200 | 131500 | 0.0006 | - |
| 9.8238 | 131550 | 0.0001 | - |
| 9.8275 | 131600 | 0.0004 | - |
| 9.8312 | 131650 | 0.0006 | - |
| 9.8350 | 131700 | 0.0002 | - |
| 9.8387 | 131750 | 0.0003 | - |
| 9.8424 | 131800 | 0.0004 | - |
| 9.8462 | 131850 | 0.0002 | - |
| 9.8499 | 131900 | 0.0002 | - |
| 9.8536 | 131950 | 0.0 | - |
| 9.8574 | 132000 | 0.0004 | - |
| 9.8611 | 132050 | 0.0018 | - |
| 9.8648 | 132100 | 0.0007 | - |
| 9.8686 | 132150 | 0.0022 | - |
| 9.8723 | 132200 | 0.0007 | - |
| 9.8760 | 132250 | 0.0008 | - |
| 9.8798 | 132300 | 0.0008 | - |
| 9.8835 | 132350 | 0.0007 | - |
| 9.8872 | 132400 | 0.0008 | - |
| 9.8910 | 132450 | 0.0002 | - |
| 9.8947 | 132500 | 0.0006 | - |
| 9.8984 | 132550 | 0.0007 | - |
| 9.9022 | 132600 | 0.0003 | - |
| 9.9059 | 132650 | 0.0005 | - |
| 9.9096 | 132700 | 0.0004 | - |
| 9.9134 | 132750 | 0.0004 | - |
| 9.9171 | 132800 | 0.0004 | - |
| 9.9208 | 132850 | 0.0009 | - |
| 9.9246 | 132900 | 0.0002 | - |
| 9.9283 | 132950 | 0.001 | - |
| 9.9320 | 133000 | 0.0001 | - |
| 9.9358 | 133050 | 0.0004 | - |
| 9.9395 | 133100 | 0.0001 | - |
| 9.9432 | 133150 | 0.0007 | - |
| 9.9470 | 133200 | 0.0006 | - |
| 9.9507 | 133250 | 0.0002 | - |
| 9.9544 | 133300 | 0.0003 | - |
| 9.9582 | 133350 | 0.0003 | - |
| 9.9619 | 133400 | 0.0006 | - |
| 9.9656 | 133450 | 0.0008 | - |
| 9.9694 | 133500 | 0.0004 | - |
| 9.9731 | 133550 | 0.0009 | - |
| 9.9769 | 133600 | 0.0003 | - |
| 9.9806 | 133650 | 0.0003 | - |
| 9.9843 | 133700 | 0.0004 | - |
| 9.9881 | 133750 | 0.0003 | - |
| 9.9918 | 133800 | 0.0006 | - |
| 9.9955 | 133850 | 0.0006 | - |
| 9.9993 | 133900 | 0.0004 | - |
| 10.0030 | 133950 | 0.0004 | - |
| 10.0067 | 134000 | 0.0006 | - |
| 10.0105 | 134050 | 0.001 | - |
| 10.0142 | 134100 | 0.0004 | - |
| 10.0179 | 134150 | 0.0006 | - |
| 10.0217 | 134200 | 0.0004 | - |
| 10.0254 | 134250 | 0.0008 | - |
| 10.0291 | 134300 | 0.0002 | - |
| 10.0329 | 134350 | 0.0004 | - |
| 10.0366 | 134400 | 0.0009 | - |
| 10.0403 | 134450 | 0.0011 | - |
| 10.0441 | 134500 | 0.0007 | - |
| 10.0478 | 134550 | 0.0007 | - |
| 10.0515 | 134600 | 0.0007 | - |
| 10.0553 | 134650 | 0.0012 | - |
| 10.0590 | 134700 | 0.0008 | - |
| 10.0627 | 134750 | 0.0003 | - |
| 10.0665 | 134800 | 0.0005 | - |
| 10.0702 | 134850 | 0.0002 | - |
| 10.0739 | 134900 | 0.0005 | - |
| 10.0777 | 134950 | 0.0006 | - |
| 10.0814 | 135000 | 0.0008 | - |
| 10.0851 | 135050 | 0.0007 | - |
| 10.0889 | 135100 | 0.0003 | - |
| 10.0926 | 135150 | 0.0004 | - |
| 10.0963 | 135200 | 0.0003 | - |
| 10.1001 | 135250 | 0.0004 | - |
| 10.1038 | 135300 | 0.0005 | - |
| 10.1075 | 135350 | 0.0005 | - |
| 10.1113 | 135400 | 0.0007 | - |
| 10.1150 | 135450 | 0.0009 | - |
| 10.1187 | 135500 | 0.0004 | - |
| 10.1225 | 135550 | 0.0005 | - |
| 10.1262 | 135600 | 0.0002 | - |
| 10.1299 | 135650 | 0.0005 | - |
| 10.1337 | 135700 | 0.0004 | - |
| 10.1374 | 135750 | 0.0001 | - |
| 10.1411 | 135800 | 0.0004 | - |
| 10.1449 | 135850 | 0.0003 | - |
| 10.1486 | 135900 | 0.0005 | - |
| 10.1523 | 135950 | 0.0002 | - |
| 10.1561 | 136000 | 0.0001 | - |
| 10.1598 | 136050 | 0.0006 | - |
| 10.1635 | 136100 | 0.0005 | - |
| 10.1673 | 136150 | 0.0007 | - |
| 10.1710 | 136200 | 0.0004 | - |
| 10.1747 | 136250 | 0.0005 | - |
| 10.1785 | 136300 | 0.0006 | - |
| 10.1822 | 136350 | 0.0005 | - |
| 10.1859 | 136400 | 0.0007 | - |
| 10.1897 | 136450 | 0.0007 | - |
| 10.1934 | 136500 | 0.0002 | - |
| 10.1971 | 136550 | 0.0001 | - |
| 10.2009 | 136600 | 0.0001 | - |
| 10.2046 | 136650 | 0.0002 | - |
| 10.2083 | 136700 | 0.0002 | - |
| 10.2121 | 136750 | 0.0007 | - |
| 10.2158 | 136800 | 0.001 | - |
| 10.2196 | 136850 | 0.0004 | - |
| 10.2233 | 136900 | 0.0006 | - |
| 10.2270 | 136950 | 0.0001 | - |
| 10.2308 | 137000 | 0.0008 | - |
| 10.2345 | 137050 | 0.0006 | - |
| 10.2382 | 137100 | 0.0004 | - |
| 10.2420 | 137150 | 0.0002 | - |
| 10.2457 | 137200 | 0.0008 | - |
| 10.2494 | 137250 | 0.0002 | - |
| 10.2532 | 137300 | 0.0005 | - |
| 10.2569 | 137350 | 0.0003 | - |
| 10.2606 | 137400 | 0.0005 | - |
| 10.2644 | 137450 | 0.0003 | - |
| 10.2681 | 137500 | 0.0004 | - |
| 10.2718 | 137550 | 0.0003 | - |
| 10.2756 | 137600 | 0.0002 | - |
| 10.2793 | 137650 | 0.0006 | - |
| 10.2830 | 137700 | 0.0003 | - |
| 10.2868 | 137750 | 0.0004 | - |
| 10.2905 | 137800 | 0.0006 | - |
| 10.2942 | 137850 | 0.0004 | - |
| 10.2980 | 137900 | 0.0009 | - |
| 10.3017 | 137950 | 0.0003 | - |
| 10.3054 | 138000 | 0.0001 | - |
| 10.3092 | 138050 | 0.0004 | - |
| 10.3129 | 138100 | 0.0004 | - |
| 10.3166 | 138150 | 0.0006 | - |
| 10.3204 | 138200 | 0.0004 | - |
| 10.3241 | 138250 | 0.0006 | - |
| 10.3278 | 138300 | 0.0003 | - |
| 10.3316 | 138350 | 0.0014 | - |
| 10.3353 | 138400 | 0.0006 | - |
| 10.3390 | 138450 | 0.0003 | - |
| 10.3428 | 138500 | 0.0003 | - |
| 10.3465 | 138550 | 0.0001 | - |
| 10.3502 | 138600 | 0.0006 | - |
| 10.3540 | 138650 | 0.0003 | - |
| 10.3577 | 138700 | 0.0006 | - |
| 10.3614 | 138750 | 0.0003 | - |
| 10.3652 | 138800 | 0.0006 | - |
| 10.3689 | 138850 | 0.0006 | - |
| 10.3726 | 138900 | 0.0004 | - |
| 10.3764 | 138950 | 0.0009 | - |
| 10.3801 | 139000 | 0.0013 | - |
| 10.3838 | 139050 | 0.0005 | - |
| 10.3876 | 139100 | 0.0003 | - |
| 10.3913 | 139150 | 0.0006 | - |
| 10.3950 | 139200 | 0.0006 | - |
| 10.3988 | 139250 | 0.0001 | - |
| 10.4025 | 139300 | 0.0002 | - |
| 10.4062 | 139350 | 0.0002 | - |
| 10.4100 | 139400 | 0.0007 | - |
| 10.4137 | 139450 | 0.0005 | - |
| 10.4174 | 139500 | 0.0003 | - |
| 10.4212 | 139550 | 0.0004 | - |
| 10.4249 | 139600 | 0.0007 | - |
| 10.4286 | 139650 | 0.0006 | - |
| 10.4324 | 139700 | 0.0002 | - |
| 10.4361 | 139750 | 0.0003 | - |
| 10.4398 | 139800 | 0.0006 | - |
| 10.4436 | 139850 | 0.0006 | - |
| 10.4473 | 139900 | 0.0005 | - |
| 10.4510 | 139950 | 0.0002 | - |
| 10.4548 | 140000 | 0.0004 | - |
| 10.4585 | 140050 | 0.0004 | - |
| 10.4623 | 140100 | 0.0002 | - |
| 10.4660 | 140150 | 0.0001 | - |
| 10.4697 | 140200 | 0.0002 | - |
| 10.4735 | 140250 | 0.0004 | - |
| 10.4772 | 140300 | 0.0001 | - |
| 10.4809 | 140350 | 0.0001 | - |
| 10.4847 | 140400 | 0.0005 | - |
| 10.4884 | 140450 | 0.0003 | - |
| 10.4921 | 140500 | 0.0005 | - |
| 10.4959 | 140550 | 0.0007 | - |
| 10.4996 | 140600 | 0.0006 | - |
| 10.5033 | 140650 | 0.0001 | - |
| 10.5071 | 140700 | 0.0002 | - |
| 10.5108 | 140750 | 0.0002 | - |
| 10.5145 | 140800 | 0.0003 | - |
| 10.5183 | 140850 | 0.0003 | - |
| 10.5220 | 140900 | 0.0004 | - |
| 10.5257 | 140950 | 0.001 | - |
| 10.5295 | 141000 | 0.0002 | - |
| 10.5332 | 141050 | 0.0005 | - |
| 10.5369 | 141100 | 0.0006 | - |
| 10.5407 | 141150 | 0.0005 | - |
| 10.5444 | 141200 | 0.0001 | - |
| 10.5481 | 141250 | 0.0007 | - |
| 10.5519 | 141300 | 0.0004 | - |
| 10.5556 | 141350 | 0.0001 | - |
| 10.5593 | 141400 | 0.0002 | - |
| 10.5631 | 141450 | 0.0005 | - |
| 10.5668 | 141500 | 0.0006 | - |
| 10.5705 | 141550 | 0.0002 | - |
| 10.5743 | 141600 | 0.0003 | - |
| 10.5780 | 141650 | 0.0009 | - |
| 10.5817 | 141700 | 0.0006 | - |
| 10.5855 | 141750 | 0.0012 | - |
| 10.5892 | 141800 | 0.0008 | - |
| 10.5929 | 141850 | 0.001 | - |
| 10.5967 | 141900 | 0.0005 | - |
| 10.6004 | 141950 | 0.0004 | - |
| 10.6041 | 142000 | 0.0014 | - |
| 10.6079 | 142050 | 0.0002 | - |
| 10.6116 | 142100 | 0.0007 | - |
| 10.6153 | 142150 | 0.0005 | - |
| 10.6191 | 142200 | 0.0005 | - |
| 10.6228 | 142250 | 0.0009 | - |
| 10.6265 | 142300 | 0.0006 | - |
| 10.6303 | 142350 | 0.0004 | - |
| 10.6340 | 142400 | 0.0004 | - |
| 10.6377 | 142450 | 0.0003 | - |
| 10.6415 | 142500 | 0.0008 | - |
| 10.6452 | 142550 | 0.0004 | - |
| 10.6489 | 142600 | 0.0003 | - |
| 10.6527 | 142650 | 0.0003 | - |
| 10.6564 | 142700 | 0.0005 | - |
| 10.6601 | 142750 | 0.0004 | - |
| 10.6639 | 142800 | 0.0002 | - |
| 10.6676 | 142850 | 0.0009 | - |
| 10.6713 | 142900 | 0.0004 | - |
| 10.6751 | 142950 | 0.0002 | - |
| 10.6788 | 143000 | 0.0004 | - |
| 10.6825 | 143050 | 0.0004 | - |
| 10.6863 | 143100 | 0.0001 | - |
| 10.6900 | 143150 | 0.0001 | - |
| 10.6937 | 143200 | 0.0006 | - |
| 10.6975 | 143250 | 0.0004 | - |
| 10.7012 | 143300 | 0.0006 | - |
| 10.7050 | 143350 | 0.0005 | - |
| 10.7087 | 143400 | 0.0002 | - |
| 10.7124 | 143450 | 0.0002 | - |
| 10.7162 | 143500 | 0.0008 | - |
| 10.7199 | 143550 | 0.0004 | - |
| 10.7236 | 143600 | 0.0002 | - |
| 10.7274 | 143650 | 0.0004 | - |
| 10.7311 | 143700 | 0.0004 | - |
| 10.7348 | 143750 | 0.0004 | - |
| 10.7386 | 143800 | 0.0002 | - |
| 10.7423 | 143850 | 0.0003 | - |
| 10.7460 | 143900 | 0.0003 | - |
| 10.7498 | 143950 | 0.0006 | - |
| 10.7535 | 144000 | 0.0004 | - |
| 10.7572 | 144050 | 0.0003 | - |
| 10.7610 | 144100 | 0.0004 | - |
| 10.7647 | 144150 | 0.0009 | - |
| 10.7684 | 144200 | 0.0006 | - |
| 10.7722 | 144250 | 0.0009 | - |
| 10.7759 | 144300 | 0.0007 | - |
| 10.7796 | 144350 | 0.0001 | - |
| 10.7834 | 144400 | 0.0005 | - |
| 10.7871 | 144450 | 0.0005 | - |
| 10.7908 | 144500 | 0.0004 | - |
| 10.7946 | 144550 | 0.0005 | - |
| 10.7983 | 144600 | 0.0003 | - |
| 10.8020 | 144650 | 0.0002 | - |
| 10.8058 | 144700 | 0.0004 | - |
| 10.8095 | 144750 | 0.0009 | - |
| 10.8132 | 144800 | 0.0004 | - |
| 10.8170 | 144850 | 0.0005 | - |
| 10.8207 | 144900 | 0.0001 | - |
| 10.8244 | 144950 | 0.0002 | - |
| 10.8282 | 145000 | 0.0007 | - |
| 10.8319 | 145050 | 0.0003 | - |
| 10.8356 | 145100 | 0.0001 | - |
| 10.8394 | 145150 | 0.0002 | - |
| 10.8431 | 145200 | 0.0005 | - |
| 10.8468 | 145250 | 0.0004 | - |
| 10.8506 | 145300 | 0.0005 | - |
| 10.8543 | 145350 | 0.0008 | - |
| 10.8580 | 145400 | 0.0003 | - |
| 10.8618 | 145450 | 0.0001 | - |
| 10.8655 | 145500 | 0.0005 | - |
| 10.8692 | 145550 | 0.0004 | - |
| 10.8730 | 145600 | 0.0003 | - |
| 10.8767 | 145650 | 0.0005 | - |
| 10.8804 | 145700 | 0.0004 | - |
| 10.8842 | 145750 | 0.0008 | - |
| 10.8879 | 145800 | 0.0003 | - |
| 10.8916 | 145850 | 0.0004 | - |
| 10.8954 | 145900 | 0.0001 | - |
| 10.8991 | 145950 | 0.0003 | - |
| 10.9028 | 146000 | 0.0005 | - |
| 10.9066 | 146050 | 0.0009 | - |
| 10.9103 | 146100 | 0.0012 | - |
| 10.9140 | 146150 | 0.0001 | - |
| 10.9178 | 146200 | 0.0002 | - |
| 10.9215 | 146250 | 0.0001 | - |
| 10.9252 | 146300 | 0.0 | - |
| 10.9290 | 146350 | 0.0001 | - |
| 10.9327 | 146400 | 0.0006 | - |
| 10.9364 | 146450 | 0.0002 | - |
| 10.9402 | 146500 | 0.0 | - |
| 10.9439 | 146550 | 0.0001 | - |
| 10.9477 | 146600 | 0.0003 | - |
| 10.9514 | 146650 | 0.0001 | - |
| 10.9551 | 146700 | 0.0002 | - |
| 10.9589 | 146750 | 0.0005 | - |
| 10.9626 | 146800 | 0.0002 | - |
| 10.9663 | 146850 | 0.0003 | - |
| 10.9701 | 146900 | 0.0002 | - |
| 10.9738 | 146950 | 0.0004 | - |
| 10.9775 | 147000 | 0.0002 | - |
| 10.9813 | 147050 | 0.0005 | - |
| 10.9850 | 147100 | 0.0002 | - |
| 10.9887 | 147150 | 0.0002 | - |
| 10.9925 | 147200 | 0.0002 | - |
| 10.9962 | 147250 | 0.0002 | - |
| 10.9999 | 147300 | 0.0002 | - |
| 11.0037 | 147350 | 0.0002 | - |
| 11.0074 | 147400 | 0.0001 | - |
| 11.0111 | 147450 | 0.0002 | - |
| 11.0149 | 147500 | 0.0003 | - |
| 11.0186 | 147550 | 0.0002 | - |
| 11.0223 | 147600 | 0.0 | - |
| 11.0261 | 147650 | 0.0002 | - |
| 11.0298 | 147700 | 0.0002 | - |
| 11.0335 | 147750 | 0.0001 | - |
| 11.0373 | 147800 | 0.0001 | - |
| 11.0410 | 147850 | 0.0005 | - |
| 11.0447 | 147900 | 0.0002 | - |
| 11.0485 | 147950 | 0.0006 | - |
| 11.0522 | 148000 | 0.0002 | - |
| 11.0559 | 148050 | 0.0003 | - |
| 11.0597 | 148100 | 0.0003 | - |
| 11.0634 | 148150 | 0.0001 | - |
| 11.0671 | 148200 | 0.0003 | - |
| 11.0709 | 148250 | 0.0 | - |
| 11.0746 | 148300 | 0.0 | - |
| 11.0783 | 148350 | 0.0003 | - |
| 11.0821 | 148400 | 0.0004 | - |
| 11.0858 | 148450 | 0.0003 | - |
| 11.0895 | 148500 | 0.0004 | - |
| 11.0933 | 148550 | 0.0004 | - |
| 11.0970 | 148600 | 0.0005 | - |
| 11.1007 | 148650 | 0.0003 | - |
| 11.1045 | 148700 | 0.0005 | - |
| 11.1082 | 148750 | 0.0003 | - |
| 11.1119 | 148800 | 0.0007 | - |
| 11.1157 | 148850 | 0.0002 | - |
| 11.1194 | 148900 | 0.0008 | - |
| 11.1231 | 148950 | 0.0001 | - |
| 11.1269 | 149000 | 0.0003 | - |
| 11.1306 | 149050 | 0.0002 | - |
| 11.1343 | 149100 | 0.0002 | - |
| 11.1381 | 149150 | 0.0004 | - |
| 11.1418 | 149200 | 0.0002 | - |
| 11.1455 | 149250 | 0.0002 | - |
| 11.1493 | 149300 | 0.0006 | - |
| 11.1530 | 149350 | 0.0003 | - |
| 11.1567 | 149400 | 0.0006 | - |
| 11.1605 | 149450 | 0.0007 | - |
| 11.1642 | 149500 | 0.0004 | - |
| 11.1679 | 149550 | 0.0004 | - |
| 11.1717 | 149600 | 0.0006 | - |
| 11.1754 | 149650 | 0.0007 | - |
| 11.1792 | 149700 | 0.0006 | - |
| 11.1829 | 149750 | 0.0006 | - |
| 11.1866 | 149800 | 0.0002 | - |
| 11.1904 | 149850 | 0.0004 | - |
| 11.1941 | 149900 | 0.0004 | - |
| 11.1978 | 149950 | 0.0004 | - |
| 11.2016 | 150000 | 0.0006 | - |
| 11.2053 | 150050 | 0.0002 | - |
| 11.2090 | 150100 | 0.0004 | - |
| 11.2128 | 150150 | 0.0002 | - |
| 11.2165 | 150200 | 0.0003 | - |
| 11.2202 | 150250 | 0.0003 | - |
| 11.2240 | 150300 | 0.0005 | - |
| 11.2277 | 150350 | 0.0005 | - |
| 11.2314 | 150400 | 0.0002 | - |
| 11.2352 | 150450 | 0.0005 | - |
| 11.2389 | 150500 | 0.0002 | - |
| 11.2426 | 150550 | 0.0001 | - |
| 11.2464 | 150600 | 0.0 | - |
| 11.2501 | 150650 | 0.0008 | - |
| 11.2538 | 150700 | 0.0004 | - |
| 11.2576 | 150750 | 0.0004 | - |
| 11.2613 | 150800 | 0.0001 | - |
| 11.2650 | 150850 | 0.0003 | - |
| 11.2688 | 150900 | 0.0004 | - |
| 11.2725 | 150950 | 0.0005 | - |
| 11.2762 | 151000 | 0.0002 | - |
| 11.2800 | 151050 | 0.0003 | - |
| 11.2837 | 151100 | 0.0 | - |
| 11.2874 | 151150 | 0.0005 | - |
| 11.2912 | 151200 | 0.0002 | - |
| 11.2949 | 151250 | 0.0002 | - |
| 11.2986 | 151300 | 0.0002 | - |
| 11.3024 | 151350 | 0.0003 | - |
| 11.3061 | 151400 | 0.0 | - |
| 11.3098 | 151450 | 0.0004 | - |
| 11.3136 | 151500 | 0.0004 | - |
| 11.3173 | 151550 | 0.0004 | - |
| 11.3210 | 151600 | 0.0004 | - |
| 11.3248 | 151650 | 0.0006 | - |
| 11.3285 | 151700 | 0.0005 | - |
| 11.3322 | 151750 | 0.001 | - |
| 11.3360 | 151800 | 0.0002 | - |
| 11.3397 | 151850 | 0.0002 | - |
| 11.3434 | 151900 | 0.0005 | - |
| 11.3472 | 151950 | 0.0002 | - |
| 11.3509 | 152000 | 0.0 | - |
| 11.3546 | 152050 | 0.0002 | - |
| 11.3584 | 152100 | 0.0005 | - |
| 11.3621 | 152150 | 0.0001 | - |
| 11.3658 | 152200 | 0.0006 | - |
| 11.3696 | 152250 | 0.0002 | - |
| 11.3733 | 152300 | 0.0005 | - |
| 11.3770 | 152350 | 0.0002 | - |
| 11.3808 | 152400 | 0.0004 | - |
| 11.3845 | 152450 | 0.0004 | - |
| 11.3882 | 152500 | 0.0007 | - |
| 11.3920 | 152550 | 0.0007 | - |
| 11.3957 | 152600 | 0.0002 | - |
| 11.3994 | 152650 | 0.0003 | - |
| 11.4032 | 152700 | 0.0002 | - |
| 11.4069 | 152750 | 0.0004 | - |
| 11.4106 | 152800 | 0.0005 | - |
| 11.4144 | 152850 | 0.0001 | - |
| 11.4181 | 152900 | 0.0006 | - |
| 11.4219 | 152950 | 0.0005 | - |
| 11.4256 | 153000 | 0.0002 | - |
| 11.4293 | 153050 | 0.0005 | - |
| 11.4331 | 153100 | 0.0004 | - |
| 11.4368 | 153150 | 0.0002 | - |
| 11.4405 | 153200 | 0.0002 | - |
| 11.4443 | 153250 | 0.0005 | - |
| 11.4480 | 153300 | 0.0004 | - |
| 11.4517 | 153350 | 0.0002 | - |
| 11.4555 | 153400 | 0.0003 | - |
| 11.4592 | 153450 | 0.0 | - |
| 11.4629 | 153500 | 0.0002 | - |
| 11.4667 | 153550 | 0.0003 | - |
| 11.4704 | 153600 | 0.0002 | - |
| 11.4741 | 153650 | 0.0002 | - |
| 11.4779 | 153700 | 0.0005 | - |
| 11.4816 | 153750 | 0.0005 | - |
| 11.4853 | 153800 | 0.0005 | - |
| 11.4891 | 153850 | 0.0004 | - |
| 11.4928 | 153900 | 0.0005 | - |
| 11.4965 | 153950 | 0.0004 | - |
| 11.5003 | 154000 | 0.0007 | - |
| 11.5040 | 154050 | 0.0003 | - |
| 11.5077 | 154100 | 0.0 | - |
| 11.5115 | 154150 | 0.0008 | - |
| 11.5152 | 154200 | 0.0002 | - |
| 11.5189 | 154250 | 0.0002 | - |
| 11.5227 | 154300 | 0.0005 | - |
| 11.5264 | 154350 | 0.0002 | - |
| 11.5301 | 154400 | 0.0003 | - |
| 11.5339 | 154450 | 0.0 | - |
| 11.5376 | 154500 | 0.0005 | - |
| 11.5413 | 154550 | 0.0005 | - |
| 11.5451 | 154600 | 0.0003 | - |
| 11.5488 | 154650 | 0.0003 | - |
| 11.5525 | 154700 | 0.0001 | - |
| 11.5563 | 154750 | 0.0004 | - |
| 11.5600 | 154800 | 0.0003 | - |
| 11.5637 | 154850 | 0.0001 | - |
| 11.5675 | 154900 | 0.0003 | - |
| 11.5712 | 154950 | 0.0 | - |
| 11.5749 | 155000 | 0.0 | - |
| 11.5787 | 155050 | 0.0003 | - |
| 11.5824 | 155100 | 0.0005 | - |
| 11.5861 | 155150 | 0.0007 | - |
| 11.5899 | 155200 | 0.0003 | - |
| 11.5936 | 155250 | 0.0004 | - |
| 11.5973 | 155300 | 0.001 | - |
| 11.6011 | 155350 | 0.0011 | - |
| 11.6048 | 155400 | 0.0008 | - |
| 11.6085 | 155450 | 0.0007 | - |
| 11.6123 | 155500 | 0.0001 | - |
| 11.6160 | 155550 | 0.0001 | - |
| 11.6197 | 155600 | 0.0003 | - |
| 11.6235 | 155650 | 0.0005 | - |
| 11.6272 | 155700 | 0.0001 | - |
| 11.6309 | 155750 | 0.0007 | - |
| 11.6347 | 155800 | 0.0005 | - |
| 11.6384 | 155850 | 0.0003 | - |
| 11.6421 | 155900 | 0.0004 | - |
| 11.6459 | 155950 | 0.0007 | - |
| 11.6496 | 156000 | 0.0001 | - |
| 11.6533 | 156050 | 0.0007 | - |
| 11.6571 | 156100 | 0.0008 | - |
| 11.6608 | 156150 | 0.0007 | - |
| 11.6646 | 156200 | 0.0005 | - |
| 11.6683 | 156250 | 0.0005 | - |
| 11.6720 | 156300 | 0.0003 | - |
| 11.6758 | 156350 | 0.0002 | - |
| 11.6795 | 156400 | 0.0001 | - |
| 11.6832 | 156450 | 0.0003 | - |
| 11.6870 | 156500 | 0.0007 | - |
| 11.6907 | 156550 | 0.0002 | - |
| 11.6944 | 156600 | 0.0007 | - |
| 11.6982 | 156650 | 0.0004 | - |
| 11.7019 | 156700 | 0.0002 | - |
| 11.7056 | 156750 | 0.0002 | - |
| 11.7094 | 156800 | 0.0002 | - |
| 11.7131 | 156850 | 0.0005 | - |
| 11.7168 | 156900 | 0.0003 | - |
| 11.7206 | 156950 | 0.0002 | - |
| 11.7243 | 157000 | 0.0004 | - |
| 11.7280 | 157050 | 0.0008 | - |
| 11.7318 | 157100 | 0.0002 | - |
| 11.7355 | 157150 | 0.0002 | - |
| 11.7392 | 157200 | 0.0 | - |
| 11.7430 | 157250 | 0.0 | - |
| 11.7467 | 157300 | 0.0 | - |
| 11.7504 | 157350 | 0.0002 | - |
| 11.7542 | 157400 | 0.0004 | - |
| 11.7579 | 157450 | 0.0001 | - |
| 11.7616 | 157500 | 0.0004 | - |
| 11.7654 | 157550 | 0.0002 | - |
| 11.7691 | 157600 | 0.0008 | - |
| 11.7728 | 157650 | 0.0005 | - |
| 11.7766 | 157700 | 0.0005 | - |
| 11.7803 | 157750 | 0.0005 | - |
| 11.7840 | 157800 | 0.0004 | - |
| 11.7878 | 157850 | 0.0001 | - |
| 11.7915 | 157900 | 0.0001 | - |
| 11.7952 | 157950 | 0.0001 | - |
| 11.7990 | 158000 | 0.0002 | - |
| 11.8027 | 158050 | 0.0002 | - |
| 11.8064 | 158100 | 0.0002 | - |
| 11.8102 | 158150 | 0.0005 | - |
| 11.8139 | 158200 | 0.0004 | - |
| 11.8176 | 158250 | 0.0006 | - |
| 11.8214 | 158300 | 0.0004 | - |
| 11.8251 | 158350 | 0.0002 | - |
| 11.8288 | 158400 | 0.0004 | - |
| 11.8326 | 158450 | 0.0002 | - |
| 11.8363 | 158500 | 0.0001 | - |
| 11.8400 | 158550 | 0.0007 | - |
| 11.8438 | 158600 | 0.0005 | - |
| 11.8475 | 158650 | 0.0001 | - |
| 11.8512 | 158700 | 0.0001 | - |
| 11.8550 | 158750 | 0.0002 | - |
| 11.8587 | 158800 | 0.0001 | - |
| 11.8624 | 158850 | 0.0003 | - |
| 11.8662 | 158900 | 0.0005 | - |
| 11.8699 | 158950 | 0.0005 | - |
| 11.8736 | 159000 | 0.0001 | - |
| 11.8774 | 159050 | 0.0005 | - |
| 11.8811 | 159100 | 0.0001 | - |
| 11.8848 | 159150 | 0.0003 | - |
| 11.8886 | 159200 | 0.0 | - |
| 11.8923 | 159250 | 0.0002 | - |
| 11.8960 | 159300 | 0.0005 | - |
| 11.8998 | 159350 | 0.0001 | - |
| 11.9035 | 159400 | 0.0006 | - |
| 11.9073 | 159450 | 0.0005 | - |
| 11.9110 | 159500 | 0.0006 | - |
| 11.9147 | 159550 | 0.0004 | - |
| 11.9185 | 159600 | 0.0002 | - |
| 11.9222 | 159650 | 0.0013 | - |
| 11.9259 | 159700 | 0.0006 | - |
| 11.9297 | 159750 | 0.0002 | - |
| 11.9334 | 159800 | 0.0003 | - |
| 11.9371 | 159850 | 0.0003 | - |
| 11.9409 | 159900 | 0.0006 | - |
| 11.9446 | 159950 | 0.0002 | - |
| 11.9483 | 160000 | 0.0004 | - |
| 11.9521 | 160050 | 0.0002 | - |
| 11.9558 | 160100 | 0.0002 | - |
| 11.9595 | 160150 | 0.0004 | - |
| 11.9633 | 160200 | 0.0002 | - |
| 11.9670 | 160250 | 0.0 | - |
| 11.9707 | 160300 | 0.0005 | - |
| 11.9745 | 160350 | 0.0003 | - |
| 11.9782 | 160400 | 0.0002 | - |
| 11.9819 | 160450 | 0.0002 | - |
| 11.9857 | 160500 | 0.0002 | - |
| 11.9894 | 160550 | 0.0002 | - |
| 11.9931 | 160600 | 0.0003 | - |
| 11.9969 | 160650 | 0.0004 | - |
| 12.0006 | 160700 | 0.0002 | - |
| 12.0043 | 160750 | 0.0004 | - |
| 12.0081 | 160800 | 0.0002 | - |
| 12.0118 | 160850 | 0.0 | - |
| 12.0155 | 160900 | 0.0002 | - |
| 12.0193 | 160950 | 0.0008 | - |
| 12.0230 | 161000 | 0.0006 | - |
| 12.0267 | 161050 | 0.0004 | - |
| 12.0305 | 161100 | 0.0003 | - |
| 12.0342 | 161150 | 0.0003 | - |
| 12.0379 | 161200 | 0.0002 | - |
| 12.0417 | 161250 | 0.0005 | - |
| 12.0454 | 161300 | 0.0003 | - |
| 12.0491 | 161350 | 0.0003 | - |
| 12.0529 | 161400 | 0.0005 | - |
| 12.0566 | 161450 | 0.0003 | - |
| 12.0603 | 161500 | 0.0002 | - |
| 12.0641 | 161550 | 0.0004 | - |
| 12.0678 | 161600 | 0.0005 | - |
| 12.0715 | 161650 | 0.0004 | - |
| 12.0753 | 161700 | 0.0008 | - |
| 12.0790 | 161750 | 0.0002 | - |
| 12.0827 | 161800 | 0.0006 | - |
| 12.0865 | 161850 | 0.0001 | - |
| 12.0902 | 161900 | 0.0003 | - |
| 12.0939 | 161950 | 0.0004 | - |
| 12.0977 | 162000 | 0.0004 | - |
| 12.1014 | 162050 | 0.0002 | - |
| 12.1051 | 162100 | 0.0006 | - |
| 12.1089 | 162150 | 0.0002 | - |
| 12.1126 | 162200 | 0.0002 | - |
| 12.1163 | 162250 | 0.0005 | - |
| 12.1201 | 162300 | 0.0004 | - |
| 12.1238 | 162350 | 0.0001 | - |
| 12.1275 | 162400 | 0.0002 | - |
| 12.1313 | 162450 | 0.0003 | - |
| 12.1350 | 162500 | 0.0001 | - |
| 12.1387 | 162550 | 0.0005 | - |
| 12.1425 | 162600 | 0.0002 | - |
| 12.1462 | 162650 | 0.0 | - |
| 12.1500 | 162700 | 0.0003 | - |
| 12.1537 | 162750 | 0.0 | - |
| 12.1574 | 162800 | 0.0 | - |
| 12.1612 | 162850 | 0.0002 | - |
| 12.1649 | 162900 | 0.0004 | - |
| 12.1686 | 162950 | 0.0001 | - |
| 12.1724 | 163000 | 0.0003 | - |
| 12.1761 | 163050 | 0.0 | - |
| 12.1798 | 163100 | 0.0004 | - |
| 12.1836 | 163150 | 0.0 | - |
| 12.1873 | 163200 | 0.0 | - |
| 12.1910 | 163250 | 0.0001 | - |
| 12.1948 | 163300 | 0.0003 | - |
| 12.1985 | 163350 | 0.0006 | - |
| 12.2022 | 163400 | 0.0002 | - |
| 12.2060 | 163450 | 0.0001 | - |
| 12.2097 | 163500 | 0.0004 | - |
| 12.2134 | 163550 | 0.0 | - |
| 12.2172 | 163600 | 0.0001 | - |
| 12.2209 | 163650 | 0.0006 | - |
| 12.2246 | 163700 | 0.0002 | - |
| 12.2284 | 163750 | 0.0002 | - |
| 12.2321 | 163800 | 0.0003 | - |
| 12.2358 | 163850 | 0.0001 | - |
| 12.2396 | 163900 | 0.0005 | - |
| 12.2433 | 163950 | 0.0004 | - |
| 12.2470 | 164000 | 0.0002 | - |
| 12.2508 | 164050 | 0.0001 | - |
| 12.2545 | 164100 | 0.0005 | - |
| 12.2582 | 164150 | 0.0005 | - |
| 12.2620 | 164200 | 0.0001 | - |
| 12.2657 | 164250 | 0.0002 | - |
| 12.2694 | 164300 | 0.0004 | - |
| 12.2732 | 164350 | 0.0002 | - |
| 12.2769 | 164400 | 0.0003 | - |
| 12.2806 | 164450 | 0.0 | - |
| 12.2844 | 164500 | 0.0002 | - |
| 12.2881 | 164550 | 0.0001 | - |
| 12.2918 | 164600 | 0.0003 | - |
| 12.2956 | 164650 | 0.0006 | - |
| 12.2993 | 164700 | 0.0003 | - |
| 12.3030 | 164750 | 0.0008 | - |
| 12.3068 | 164800 | 0.0006 | - |
| 12.3105 | 164850 | 0.001 | - |
| 12.3142 | 164900 | 0.0008 | - |
| 12.3180 | 164950 | 0.001 | - |
| 12.3217 | 165000 | 0.0006 | - |
| 12.3254 | 165050 | 0.0003 | - |
| 12.3292 | 165100 | 0.0008 | - |
| 12.3329 | 165150 | 0.0012 | - |
| 12.3366 | 165200 | 0.001 | - |
| 12.3404 | 165250 | 0.0006 | - |
| 12.3441 | 165300 | 0.001 | - |
| 12.3478 | 165350 | 0.0005 | - |
| 12.3516 | 165400 | 0.0005 | - |
| 12.3553 | 165450 | 0.0005 | - |
| 12.3590 | 165500 | 0.0008 | - |
| 12.3628 | 165550 | 0.0004 | - |
| 12.3665 | 165600 | 0.0003 | - |
| 12.3702 | 165650 | 0.0006 | - |
| 12.3740 | 165700 | 0.0005 | - |
| 12.3777 | 165750 | 0.0004 | - |
| 12.3815 | 165800 | 0.0006 | - |
| 12.3852 | 165850 | 0.0006 | - |
| 12.3889 | 165900 | 0.0005 | - |
| 12.3927 | 165950 | 0.0002 | - |
| 12.3964 | 166000 | 0.0004 | - |
| 12.4001 | 166050 | 0.0004 | - |
| 12.4039 | 166100 | 0.0007 | - |
| 12.4076 | 166150 | 0.0006 | - |
| 12.4113 | 166200 | 0.0 | - |
| 12.4151 | 166250 | 0.0005 | - |
| 12.4188 | 166300 | 0.0003 | - |
| 12.4225 | 166350 | 0.0002 | - |
| 12.4263 | 166400 | 0.0004 | - |
| 12.4300 | 166450 | 0.0008 | - |
| 12.4337 | 166500 | 0.0008 | - |
| 12.4375 | 166550 | 0.0007 | - |
| 12.4412 | 166600 | 0.0002 | - |
| 12.4449 | 166650 | 0.0003 | - |
| 12.4487 | 166700 | 0.0008 | - |
| 12.4524 | 166750 | 0.0002 | - |
| 12.4561 | 166800 | 0.0002 | - |
| 12.4599 | 166850 | 0.0002 | - |
| 12.4636 | 166900 | 0.0007 | - |
| 12.4673 | 166950 | 0.0003 | - |
| 12.4711 | 167000 | 0.0002 | - |
| 12.4748 | 167050 | 0.0005 | - |
| 12.4785 | 167100 | 0.0001 | - |
| 12.4823 | 167150 | 0.0005 | - |
| 12.4860 | 167200 | 0.0 | - |
| 12.4897 | 167250 | 0.0003 | - |
| 12.4935 | 167300 | 0.0002 | - |
| 12.4972 | 167350 | 0.0002 | - |
| 12.5009 | 167400 | 0.0009 | - |
| 12.5047 | 167450 | 0.0006 | - |
| 12.5084 | 167500 | 0.0007 | - |
| 12.5121 | 167550 | 0.0002 | - |
| 12.5159 | 167600 | 0.0005 | - |
| 12.5196 | 167650 | 0.0004 | - |
| 12.5233 | 167700 | 0.0008 | - |
| 12.5271 | 167750 | 0.0 | - |
| 12.5308 | 167800 | 0.0002 | - |
| 12.5345 | 167850 | 0.0 | - |
| 12.5383 | 167900 | 0.0006 | - |
| 12.5420 | 167950 | 0.0003 | - |
| 12.5457 | 168000 | 0.0002 | - |
| 12.5495 | 168050 | 0.0004 | - |
| 12.5532 | 168100 | 0.0003 | - |
| 12.5569 | 168150 | 0.0006 | - |
| 12.5607 | 168200 | 0.0005 | - |
| 12.5644 | 168250 | 0.0003 | - |
| 12.5681 | 168300 | 0.0004 | - |
| 12.5719 | 168350 | 0.0002 | - |
| 12.5756 | 168400 | 0.0001 | - |
| 12.5793 | 168450 | 0.0002 | - |
| 12.5831 | 168500 | 0.0002 | - |
| 12.5868 | 168550 | 0.0001 | - |
| 12.5905 | 168600 | 0.0001 | - |
| 12.5943 | 168650 | 0.0005 | - |
| 12.5980 | 168700 | 0.0002 | - |
| 12.6017 | 168750 | 0.0002 | - |
| 12.6055 | 168800 | 0.0003 | - |
| 12.6092 | 168850 | 0.0002 | - |
| 12.6129 | 168900 | 0.0002 | - |
| 12.6167 | 168950 | 0.0001 | - |
| 12.6204 | 169000 | 0.0004 | - |
| 12.6242 | 169050 | 0.0008 | - |
| 12.6279 | 169100 | 0.0005 | - |
| 12.6316 | 169150 | 0.0007 | - |
| 12.6354 | 169200 | 0.0003 | - |
| 12.6391 | 169250 | 0.0003 | - |
| 12.6428 | 169300 | 0.0002 | - |
| 12.6466 | 169350 | 0.0004 | - |
| 12.6503 | 169400 | 0.0001 | - |
| 12.6540 | 169450 | 0.0005 | - |
| 12.6578 | 169500 | 0.0005 | - |
| 12.6615 | 169550 | 0.0005 | - |
| 12.6652 | 169600 | 0.0002 | - |
| 12.6690 | 169650 | 0.0 | - |
| 12.6727 | 169700 | 0.0002 | - |
| 12.6764 | 169750 | 0.0 | - |
| 12.6802 | 169800 | 0.0002 | - |
| 12.6839 | 169850 | 0.0007 | - |
| 12.6876 | 169900 | 0.0007 | - |
| 12.6914 | 169950 | 0.0003 | - |
| 12.6951 | 170000 | 0.0004 | - |
| 12.6988 | 170050 | 0.0007 | - |
| 12.7026 | 170100 | 0.0007 | - |
| 12.7063 | 170150 | 0.0008 | - |
| 12.7100 | 170200 | 0.0005 | - |
| 12.7138 | 170250 | 0.0004 | - |
| 12.7175 | 170300 | 0.0004 | - |
| 12.7212 | 170350 | 0.0002 | - |
| 12.7250 | 170400 | 0.0003 | - |
| 12.7287 | 170450 | 0.0005 | - |
| 12.7324 | 170500 | 0.0003 | - |
| 12.7362 | 170550 | 0.0005 | - |
| 12.7399 | 170600 | 0.0004 | - |
| 12.7436 | 170650 | 0.0011 | - |
| 12.7474 | 170700 | 0.0008 | - |
| 12.7511 | 170750 | 0.0003 | - |
| 12.7548 | 170800 | 0.0009 | - |
| 12.7586 | 170850 | 0.0004 | - |
| 12.7623 | 170900 | 0.0004 | - |
| 12.7660 | 170950 | 0.0011 | - |
| 12.7698 | 171000 | 0.0006 | - |
| 12.7735 | 171050 | 0.0001 | - |
| 12.7772 | 171100 | 0.0004 | - |
| 12.7810 | 171150 | 0.0005 | - |
| 12.7847 | 171200 | 0.0002 | - |
| 12.7884 | 171250 | 0.0003 | - |
| 12.7922 | 171300 | 0.0006 | - |
| 12.7959 | 171350 | 0.0006 | - |
| 12.7996 | 171400 | 0.0006 | - |
| 12.8034 | 171450 | 0.0005 | - |
| 12.8071 | 171500 | 0.0005 | - |
| 12.8108 | 171550 | 0.0006 | - |
| 12.8146 | 171600 | 0.0005 | - |
| 12.8183 | 171650 | 0.0002 | - |
| 12.8220 | 171700 | 0.0003 | - |
| 12.8258 | 171750 | 0.0007 | - |
| 12.8295 | 171800 | 0.0007 | - |
| 12.8332 | 171850 | 0.0008 | - |
| 12.8370 | 171900 | 0.0005 | - |
| 12.8407 | 171950 | 0.0005 | - |
| 12.8444 | 172000 | 0.0004 | - |
| 12.8482 | 172050 | 0.0007 | - |
| 12.8519 | 172100 | 0.0004 | - |
| 12.8556 | 172150 | 0.0007 | - |
| 12.8594 | 172200 | 0.0008 | - |
| 12.8631 | 172250 | 0.0001 | - |
| 12.8669 | 172300 | 0.0001 | - |
| 12.8706 | 172350 | 0.0001 | - |
| 12.8743 | 172400 | 0.0005 | - |
| 12.8781 | 172450 | 0.0005 | - |
| 12.8818 | 172500 | 0.0005 | - |
| 12.8855 | 172550 | 0.0007 | - |
| 12.8893 | 172600 | 0.0004 | - |
| 12.8930 | 172650 | 0.0007 | - |
| 12.8967 | 172700 | 0.0008 | - |
| 12.9005 | 172750 | 0.0009 | - |
| 12.9042 | 172800 | 0.0006 | - |
| 12.9079 | 172850 | 0.0009 | - |
| 12.9117 | 172900 | 0.0004 | - |
| 12.9154 | 172950 | 0.0003 | - |
| 12.9191 | 173000 | 0.0001 | - |
| 12.9229 | 173050 | 0.0002 | - |
| 12.9266 | 173100 | 0.0007 | - |
| 12.9303 | 173150 | 0.0002 | - |
| 12.9341 | 173200 | 0.0001 | - |
| 12.9378 | 173250 | 0.0007 | - |
| 12.9415 | 173300 | 0.0003 | - |
| 12.9453 | 173350 | 0.0003 | - |
| 12.9490 | 173400 | 0.0002 | - |
| 12.9527 | 173450 | 0.0003 | - |
| 12.9565 | 173500 | 0.0008 | - |
| 12.9602 | 173550 | 0.0 | - |
| 12.9639 | 173600 | 0.0003 | - |
| 12.9677 | 173650 | 0.0004 | - |
| 12.9714 | 173700 | 0.0005 | - |
| 12.9751 | 173750 | 0.0005 | - |
| 12.9789 | 173800 | 0.0006 | - |
| 12.9826 | 173850 | 0.0005 | - |
| 12.9863 | 173900 | 0.0002 | - |
| 12.9901 | 173950 | 0.0002 | - |
| 12.9938 | 174000 | 0.0008 | - |
| 12.9975 | 174050 | 0.0 | - |
| 13.0013 | 174100 | 0.0009 | - |
| 13.0050 | 174150 | 0.0005 | - |
| 13.0087 | 174200 | 0.0002 | - |
| 13.0125 | 174250 | 0.0005 | - |
| 13.0162 | 174300 | 0.0006 | - |
| 13.0199 | 174350 | 0.0003 | - |
| 13.0237 | 174400 | 0.0002 | - |
| 13.0274 | 174450 | 0.0005 | - |
| 13.0311 | 174500 | 0.0005 | - |
| 13.0349 | 174550 | 0.0004 | - |
| 13.0386 | 174600 | 0.0002 | - |
| 13.0423 | 174650 | 0.0 | - |
| 13.0461 | 174700 | 0.0003 | - |
| 13.0498 | 174750 | 0.0004 | - |
| 13.0535 | 174800 | 0.0004 | - |
| 13.0573 | 174850 | 0.0006 | - |
| 13.0610 | 174900 | 0.0004 | - |
| 13.0647 | 174950 | 0.0003 | - |
| 13.0685 | 175000 | 0.0005 | - |
| 13.0722 | 175050 | 0.0003 | - |
| 13.0759 | 175100 | 0.0003 | - |
| 13.0797 | 175150 | 0.0 | - |
| 13.0834 | 175200 | 0.0001 | - |
| 13.0871 | 175250 | 0.0003 | - |
| 13.0909 | 175300 | 0.0001 | - |
| 13.0946 | 175350 | 0.0003 | - |
| 13.0983 | 175400 | 0.0005 | - |
| 13.1021 | 175450 | 0.0001 | - |
| 13.1058 | 175500 | 0.0006 | - |
| 13.1096 | 175550 | 0.0003 | - |
| 13.1133 | 175600 | 0.0004 | - |
| 13.1170 | 175650 | 0.0006 | - |
| 13.1208 | 175700 | 0.0004 | - |
| 13.1245 | 175750 | 0.0003 | - |
| 13.1282 | 175800 | 0.0004 | - |
| 13.1320 | 175850 | 0.0002 | - |
| 13.1357 | 175900 | 0.0004 | - |
| 13.1394 | 175950 | 0.0001 | - |
| 13.1432 | 176000 | 0.0001 | - |
| 13.1469 | 176050 | 0.0005 | - |
| 13.1506 | 176100 | 0.0001 | - |
| 13.1544 | 176150 | 0.0003 | - |
| 13.1581 | 176200 | 0.0002 | - |
| 13.1618 | 176250 | 0.0001 | - |
| 13.1656 | 176300 | 0.0006 | - |
| 13.1693 | 176350 | 0.0003 | - |
| 13.1730 | 176400 | 0.0007 | - |
| 13.1768 | 176450 | 0.0007 | - |
| 13.1805 | 176500 | 0.0006 | - |
| 13.1842 | 176550 | 0.0006 | - |
| 13.1880 | 176600 | 0.0003 | - |
| 13.1917 | 176650 | 0.0005 | - |
| 13.1954 | 176700 | 0.0004 | - |
| 13.1992 | 176750 | 0.0003 | - |
| 13.2029 | 176800 | 0.0001 | - |
| 13.2066 | 176850 | 0.0002 | - |
| 13.2104 | 176900 | 0.0003 | - |
| 13.2141 | 176950 | 0.0004 | - |
| 13.2178 | 177000 | 0.0 | - |
| 13.2216 | 177050 | 0.0002 | - |
| 13.2253 | 177100 | 0.0002 | - |
| 13.2290 | 177150 | 0.0003 | - |
| 13.2328 | 177200 | 0.0 | - |
| 13.2365 | 177250 | 0.0002 | - |
| 13.2402 | 177300 | 0.0008 | - |
| 13.2440 | 177350 | 0.0005 | - |
| 13.2477 | 177400 | 0.0002 | - |
| 13.2514 | 177450 | 0.0002 | - |
| 13.2552 | 177500 | 0.0001 | - |
| 13.2589 | 177550 | 0.0001 | - |
| 13.2626 | 177600 | 0.0002 | - |
| 13.2664 | 177650 | 0.0004 | - |
| 13.2701 | 177700 | 0.0001 | - |
| 13.2738 | 177750 | 0.0002 | - |
| 13.2776 | 177800 | 0.0 | - |
| 13.2813 | 177850 | 0.0004 | - |
| 13.2850 | 177900 | 0.0001 | - |
| 13.2888 | 177950 | 0.0002 | - |
| 13.2925 | 178000 | 0.0001 | - |
| 13.2962 | 178050 | 0.0005 | - |
| 13.3000 | 178100 | 0.0 | - |
| 13.3037 | 178150 | 0.0 | - |
| 13.3074 | 178200 | 0.0005 | - |
| 13.3112 | 178250 | 0.0004 | - |
| 13.3149 | 178300 | 0.0005 | - |
| 13.3186 | 178350 | 0.0003 | - |
| 13.3224 | 178400 | 0.0 | - |
| 13.3261 | 178450 | 0.0001 | - |
| 13.3298 | 178500 | 0.0002 | - |
| 13.3336 | 178550 | 0.0005 | - |
| 13.3373 | 178600 | 0.0003 | - |
| 13.3410 | 178650 | 0.0001 | - |
| 13.3448 | 178700 | 0.0002 | - |
| 13.3485 | 178750 | 0.0 | - |
| 13.3523 | 178800 | 0.0007 | - |
| 13.3560 | 178850 | 0.0001 | - |
| 13.3597 | 178900 | 0.0003 | - |
| 13.3635 | 178950 | 0.0002 | - |
| 13.3672 | 179000 | 0.0001 | - |
| 13.3709 | 179050 | 0.0003 | - |
| 13.3747 | 179100 | 0.0 | - |
| 13.3784 | 179150 | 0.0003 | - |
| 13.3821 | 179200 | 0.0004 | - |
| 13.3859 | 179250 | 0.0004 | - |
| 13.3896 | 179300 | 0.0001 | - |
| 13.3933 | 179350 | 0.0002 | - |
| 13.3971 | 179400 | 0.0005 | - |
| 13.4008 | 179450 | 0.0003 | - |
| 13.4045 | 179500 | 0.0002 | - |
| 13.4083 | 179550 | 0.0005 | - |
| 13.4120 | 179600 | 0.0004 | - |
| 13.4157 | 179650 | 0.0002 | - |
| 13.4195 | 179700 | 0.0005 | - |
| 13.4232 | 179750 | 0.0003 | - |
| 13.4269 | 179800 | 0.0005 | - |
| 13.4307 | 179850 | 0.0002 | - |
| 13.4344 | 179900 | 0.0004 | - |
| 13.4381 | 179950 | 0.0001 | - |
| 13.4419 | 180000 | 0.0003 | - |
| 13.4456 | 180050 | 0.0001 | - |
| 13.4493 | 180100 | 0.0002 | - |
| 13.4531 | 180150 | 0.0001 | - |
| 13.4568 | 180200 | 0.0001 | - |
| 13.4605 | 180250 | 0.0003 | - |
| 13.4643 | 180300 | 0.0001 | - |
| 13.4680 | 180350 | 0.0002 | - |
| 13.4717 | 180400 | 0.0002 | - |
| 13.4755 | 180450 | 0.0003 | - |
| 13.4792 | 180500 | 0.0003 | - |
| 13.4829 | 180550 | 0.0001 | - |
| 13.4867 | 180600 | 0.0004 | - |
| 13.4904 | 180650 | 0.0001 | - |
| 13.4941 | 180700 | 0.0005 | - |
| 13.4979 | 180750 | 0.0003 | - |
| 13.5016 | 180800 | 0.0005 | - |
| 13.5053 | 180850 | 0.0005 | - |
| 13.5091 | 180900 | 0.0002 | - |
| 13.5128 | 180950 | 0.0002 | - |
| 13.5165 | 181000 | 0.0005 | - |
| 13.5203 | 181050 | 0.0008 | - |
| 13.5240 | 181100 | 0.001 | - |
| 13.5277 | 181150 | 0.0004 | - |
| 13.5315 | 181200 | 0.0002 | - |
| 13.5352 | 181250 | 0.0008 | - |
| 13.5389 | 181300 | 0.0006 | - |
| 13.5427 | 181350 | 0.0004 | - |
| 13.5464 | 181400 | 0.0002 | - |
| 13.5501 | 181450 | 0.0001 | - |
| 13.5539 | 181500 | 0.0002 | - |
| 13.5576 | 181550 | 0.0003 | - |
| 13.5613 | 181600 | 0.0 | - |
| 13.5651 | 181650 | 0.0 | - |
| 13.5688 | 181700 | 0.0003 | - |
| 13.5725 | 181750 | 0.0003 | - |
| 13.5763 | 181800 | 0.0002 | - |
| 13.5800 | 181850 | 0.0005 | - |
| 13.5838 | 181900 | 0.0024 | - |
| 13.5875 | 181950 | 0.0008 | - |
| 13.5912 | 182000 | 0.0012 | - |
| 13.5950 | 182050 | 0.0007 | - |
| 13.5987 | 182100 | 0.0007 | - |
| 13.6024 | 182150 | 0.0006 | - |
| 13.6062 | 182200 | 0.0005 | - |
| 13.6099 | 182250 | 0.0003 | - |
| 13.6136 | 182300 | 0.0004 | - |
| 13.6174 | 182350 | 0.0002 | - |
| 13.6211 | 182400 | 0.0005 | - |
| 13.6248 | 182450 | 0.0011 | - |
| 13.6286 | 182500 | 0.0002 | - |
| 13.6323 | 182550 | 0.0002 | - |
| 13.6360 | 182600 | 0.0005 | - |
| 13.6398 | 182650 | 0.0005 | - |
| 13.6435 | 182700 | 0.0003 | - |
| 13.6472 | 182750 | 0.0005 | - |
| 13.6510 | 182800 | 0.0009 | - |
| 13.6547 | 182850 | 0.0006 | - |
| 13.6584 | 182900 | 0.0005 | - |
| 13.6622 | 182950 | 0.0002 | - |
| 13.6659 | 183000 | 0.0002 | - |
| 13.6696 | 183050 | 0.0002 | - |
| 13.6734 | 183100 | 0.0005 | - |
| 13.6771 | 183150 | 0.0003 | - |
| 13.6808 | 183200 | 0.0006 | - |
| 13.6846 | 183250 | 0.0003 | - |
| 13.6883 | 183300 | 0.0005 | - |
| 13.6920 | 183350 | 0.0002 | - |
| 13.6958 | 183400 | 0.0004 | - |
| 13.6995 | 183450 | 0.0003 | - |
| 13.7032 | 183500 | 0.0005 | - |
| 13.7070 | 183550 | 0.0004 | - |
| 13.7107 | 183600 | 0.0005 | - |
| 13.7144 | 183650 | 0.0002 | - |
| 13.7182 | 183700 | 0.0003 | - |
| 13.7219 | 183750 | 0.0001 | - |
| 13.7256 | 183800 | 0.0003 | - |
| 13.7294 | 183850 | 0.0 | - |
| 13.7331 | 183900 | 0.0002 | - |
| 13.7368 | 183950 | 0.0001 | - |
| 13.7406 | 184000 | 0.0001 | - |
| 13.7443 | 184050 | 0.0003 | - |
| 13.7480 | 184100 | 0.0004 | - |
| 13.7518 | 184150 | 0.0005 | - |
| 13.7555 | 184200 | 0.0003 | - |
| 13.7592 | 184250 | 0.0004 | - |
| 13.7630 | 184300 | 0.0002 | - |
| 13.7667 | 184350 | 0.0 | - |
| 13.7704 | 184400 | 0.0002 | - |
| 13.7742 | 184450 | 0.0003 | - |
| 13.7779 | 184500 | 0.0 | - |
| 13.7816 | 184550 | 0.0 | - |
| 13.7854 | 184600 | 0.0001 | - |
| 13.7891 | 184650 | 0.0002 | - |
| 13.7928 | 184700 | 0.0002 | - |
| 13.7966 | 184750 | 0.0003 | - |
| 13.8003 | 184800 | 0.0 | - |
| 13.8040 | 184850 | 0.0002 | - |
| 13.8078 | 184900 | 0.0 | - |
| 13.8115 | 184950 | 0.0003 | - |
| 13.8152 | 185000 | 0.0005 | - |
| 13.8190 | 185050 | 0.0002 | - |
| 13.8227 | 185100 | 0.0002 | - |
| 13.8265 | 185150 | 0.0003 | - |
| 13.8302 | 185200 | 0.0003 | - |
| 13.8339 | 185250 | 0.0 | - |
| 13.8377 | 185300 | 0.0 | - |
| 13.8414 | 185350 | 0.0003 | - |
| 13.8451 | 185400 | 0.0004 | - |
| 13.8489 | 185450 | 0.0002 | - |
| 13.8526 | 185500 | 0.0005 | - |
| 13.8563 | 185550 | 0.0004 | - |
| 13.8601 | 185600 | 0.0002 | - |
| 13.8638 | 185650 | 0.0001 | - |
| 13.8675 | 185700 | 0.0003 | - |
| 13.8713 | 185750 | 0.0002 | - |
| 13.8750 | 185800 | 0.0004 | - |
| 13.8787 | 185850 | 0.0 | - |
| 13.8825 | 185900 | 0.0008 | - |
| 13.8862 | 185950 | 0.0002 | - |
| 13.8899 | 186000 | 0.0004 | - |
| 13.8937 | 186050 | 0.0002 | - |
| 13.8974 | 186100 | 0.0 | - |
| 13.9011 | 186150 | 0.0002 | - |
| 13.9049 | 186200 | 0.0003 | - |
| 13.9086 | 186250 | 0.0003 | - |
| 13.9123 | 186300 | 0.0002 | - |
| 13.9161 | 186350 | 0.0001 | - |
| 13.9198 | 186400 | 0.0 | - |
| 13.9235 | 186450 | 0.0002 | - |
| 13.9273 | 186500 | 0.0 | - |
| 13.9310 | 186550 | 0.0005 | - |
| 13.9347 | 186600 | 0.0004 | - |
| 13.9385 | 186650 | 0.0 | - |
| 13.9422 | 186700 | 0.0001 | - |
| 13.9459 | 186750 | 0.0001 | - |
| 13.9497 | 186800 | 0.0002 | - |
| 13.9534 | 186850 | 0.0 | - |
| 13.9571 | 186900 | 0.0003 | - |
| 13.9609 | 186950 | 0.0003 | - |
| 13.9646 | 187000 | 0.0001 | - |
| 13.9683 | 187050 | 0.0002 | - |
| 13.9721 | 187100 | 0.0 | - |
| 13.9758 | 187150 | 0.0002 | - |
| 13.9795 | 187200 | 0.0006 | - |
| 13.9833 | 187250 | 0.0003 | - |
| 13.9870 | 187300 | 0.0002 | - |
| 13.9907 | 187350 | 0.0002 | - |
| 13.9945 | 187400 | 0.0002 | - |
| 13.9982 | 187450 | 0.0006 | - |
| 14.0019 | 187500 | 0.0002 | - |
| 14.0057 | 187550 | 0.0 | - |
| 14.0094 | 187600 | 0.0 | - |
| 14.0131 | 187650 | 0.0002 | - |
| 14.0169 | 187700 | 0.0002 | - |
| 14.0206 | 187750 | 0.0 | - |
| 14.0243 | 187800 | 0.0008 | - |
| 14.0281 | 187850 | 0.0008 | - |
| 14.0318 | 187900 | 0.0003 | - |
| 14.0355 | 187950 | 0.0007 | - |
| 14.0393 | 188000 | 0.0008 | - |
| 14.0430 | 188050 | 0.0006 | - |
| 14.0467 | 188100 | 0.0002 | - |
| 14.0505 | 188150 | 0.0003 | - |
| 14.0542 | 188200 | 0.0005 | - |
| 14.0579 | 188250 | 0.0004 | - |
| 14.0617 | 188300 | 0.0004 | - |
| 14.0654 | 188350 | 0.0 | - |
| 14.0692 | 188400 | 0.0002 | - |
| 14.0729 | 188450 | 0.0005 | - |
| 14.0766 | 188500 | 0.0003 | - |
| 14.0804 | 188550 | 0.0003 | - |
| 14.0841 | 188600 | 0.0005 | - |
| 14.0878 | 188650 | 0.0005 | - |
| 14.0916 | 188700 | 0.0003 | - |
| 14.0953 | 188750 | 0.0002 | - |
| 14.0990 | 188800 | 0.0002 | - |
| 14.1028 | 188850 | 0.0 | - |
| 14.1065 | 188900 | 0.0004 | - |
| 14.1102 | 188950 | 0.0004 | - |
| 14.1140 | 189000 | 0.0007 | - |
| 14.1177 | 189050 | 0.0003 | - |
| 14.1214 | 189100 | 0.0002 | - |
| 14.1252 | 189150 | 0.0003 | - |
| 14.1289 | 189200 | 0.0004 | - |
| 14.1326 | 189250 | 0.0002 | - |
| 14.1364 | 189300 | 0.0003 | - |
| 14.1401 | 189350 | 0.0004 | - |
| 14.1438 | 189400 | 0.0001 | - |
| 14.1476 | 189450 | 0.0003 | - |
| 14.1513 | 189500 | 0.0001 | - |
| 14.1550 | 189550 | 0.0004 | - |
| 14.1588 | 189600 | 0.0008 | - |
| 14.1625 | 189650 | 0.0005 | - |
| 14.1662 | 189700 | 0.0006 | - |
| 14.1700 | 189750 | 0.0004 | - |
| 14.1737 | 189800 | 0.0005 | - |
| 14.1774 | 189850 | 0.0007 | - |
| 14.1812 | 189900 | 0.0009 | - |
| 14.1849 | 189950 | 0.001 | - |
| 14.1886 | 190000 | 0.0005 | - |
| 14.1924 | 190050 | 0.0007 | - |
| 14.1961 | 190100 | 0.0002 | - |
| 14.1998 | 190150 | 0.0002 | - |
| 14.2036 | 190200 | 0.0006 | - |
| 14.2073 | 190250 | 0.0003 | - |
| 14.2110 | 190300 | 0.0002 | - |
| 14.2148 | 190350 | 0.0004 | - |
| 14.2185 | 190400 | 0.0002 | - |
| 14.2222 | 190450 | 0.0002 | - |
| 14.2260 | 190500 | 0.0002 | - |
| 14.2297 | 190550 | 0.0 | - |
| 14.2334 | 190600 | 0.0002 | - |
| 14.2372 | 190650 | 0.0002 | - |
| 14.2409 | 190700 | 0.0008 | - |
| 14.2446 | 190750 | 0.0001 | - |
| 14.2484 | 190800 | 0.0004 | - |
| 14.2521 | 190850 | 0.0005 | - |
| 14.2558 | 190900 | 0.0001 | - |
| 14.2596 | 190950 | 0.0001 | - |
| 14.2633 | 191000 | 0.0005 | - |
| 14.2670 | 191050 | 0.0001 | - |
| 14.2708 | 191100 | 0.0004 | - |
| 14.2745 | 191150 | 0.0003 | - |
| 14.2782 | 191200 | 0.0004 | - |
| 14.2820 | 191250 | 0.0001 | - |
| 14.2857 | 191300 | 0.0002 | - |
| 14.2894 | 191350 | 0.0002 | - |
| 14.2932 | 191400 | 0.0002 | - |
| 14.2969 | 191450 | 0.0003 | - |
| 14.3006 | 191500 | 0.0002 | - |
| 14.3044 | 191550 | 0.0001 | - |
| 14.3081 | 191600 | 0.0001 | - |
| 14.3119 | 191650 | 0.0 | - |
| 14.3156 | 191700 | 0.0003 | - |
| 14.3193 | 191750 | 0.0005 | - |
| 14.3231 | 191800 | 0.0 | - |
| 14.3268 | 191850 | 0.0002 | - |
| 14.3305 | 191900 | 0.0002 | - |
| 14.3343 | 191950 | 0.0002 | - |
| 14.3380 | 192000 | 0.0 | - |
| 14.3417 | 192050 | 0.0 | - |
| 14.3455 | 192100 | 0.0 | - |
| 14.3492 | 192150 | 0.0 | - |
| 14.3529 | 192200 | 0.0003 | - |
| 14.3567 | 192250 | 0.0 | - |
| 14.3604 | 192300 | 0.0001 | - |
| 14.3641 | 192350 | 0.0 | - |
| 14.3679 | 192400 | 0.0 | - |
| 14.3716 | 192450 | 0.0002 | - |
| 14.3753 | 192500 | 0.0006 | - |
| 14.3791 | 192550 | 0.0 | - |
| 14.3828 | 192600 | 0.0002 | - |
| 14.3865 | 192650 | 0.0 | - |
| 14.3903 | 192700 | 0.0001 | - |
| 14.3940 | 192750 | 0.0003 | - |
| 14.3977 | 192800 | 0.0001 | - |
| 14.4015 | 192850 | 0.0001 | - |
| 14.4052 | 192900 | 0.0002 | - |
| 14.4089 | 192950 | 0.0003 | - |
| 14.4127 | 193000 | 0.0003 | - |
| 14.4164 | 193050 | 0.0002 | - |
| 14.4201 | 193100 | 0.0003 | - |
| 14.4239 | 193150 | 0.0005 | - |
| 14.4276 | 193200 | 0.0005 | - |
| 14.4313 | 193250 | 0.0002 | - |
| 14.4351 | 193300 | 0.0001 | - |
| 14.4388 | 193350 | 0.0003 | - |
| 14.4425 | 193400 | 0.0 | - |
| 14.4463 | 193450 | 0.0005 | - |
| 14.4500 | 193500 | 0.0002 | - |
| 14.4537 | 193550 | 0.0002 | - |
| 14.4575 | 193600 | 0.0007 | - |
| 14.4612 | 193650 | 0.0004 | - |
| 14.4649 | 193700 | 0.0002 | - |
| 14.4687 | 193750 | 0.0001 | - |
| 14.4724 | 193800 | 0.0002 | - |
| 14.4761 | 193850 | 0.0002 | - |
| 14.4799 | 193900 | 0.0009 | - |
| 14.4836 | 193950 | 0.0007 | - |
| 14.4873 | 194000 | 0.0006 | - |
| 14.4911 | 194050 | 0.0004 | - |
| 14.4948 | 194100 | 0.0001 | - |
| 14.4985 | 194150 | 0.0008 | - |
| 14.5023 | 194200 | 0.001 | - |
| 14.5060 | 194250 | 0.0006 | - |
| 14.5097 | 194300 | 0.0007 | - |
| 14.5135 | 194350 | 0.0007 | - |
| 14.5172 | 194400 | 0.0005 | - |
| 14.5209 | 194450 | 0.0007 | - |
| 14.5247 | 194500 | 0.0003 | - |
| 14.5284 | 194550 | 0.0009 | - |
| 14.5321 | 194600 | 0.0007 | - |
| 14.5359 | 194650 | 0.0007 | - |
| 14.5396 | 194700 | 0.0005 | - |
| 14.5434 | 194750 | 0.0004 | - |
| 14.5471 | 194800 | 0.0005 | - |
| 14.5508 | 194850 | 0.0007 | - |
| 14.5546 | 194900 | 0.0005 | - |
| 14.5583 | 194950 | 0.0005 | - |
| 14.5620 | 195000 | 0.0004 | - |
| 14.5658 | 195050 | 0.0003 | - |
| 14.5695 | 195100 | 0.0005 | - |
| 14.5732 | 195150 | 0.0006 | - |
| 14.5770 | 195200 | 0.0001 | - |
| 14.5807 | 195250 | 0.0002 | - |
| 14.5844 | 195300 | 0.0001 | - |
| 14.5882 | 195350 | 0.0005 | - |
| 14.5919 | 195400 | 0.0002 | - |
| 14.5956 | 195450 | 0.0004 | - |
| 14.5994 | 195500 | 0.0 | - |
| 14.6031 | 195550 | 0.0004 | - |
| 14.6068 | 195600 | 0.0004 | - |
| 14.6106 | 195650 | 0.0006 | - |
| 14.6143 | 195700 | 0.0004 | - |
| 14.6180 | 195750 | 0.0004 | - |
| 14.6218 | 195800 | 0.0003 | - |
| 14.6255 | 195850 | 0.0003 | - |
| 14.6292 | 195900 | 0.0002 | - |
| 14.6330 | 195950 | 0.0003 | - |
| 14.6367 | 196000 | 0.0005 | - |
| 14.6404 | 196050 | 0.0002 | - |
| 14.6442 | 196100 | 0.0001 | - |
| 14.6479 | 196150 | 0.0004 | - |
| 14.6516 | 196200 | 0.0008 | - |
| 14.6554 | 196250 | 0.0001 | - |
| 14.6591 | 196300 | 0.0005 | - |
| 14.6628 | 196350 | 0.0004 | - |
| 14.6666 | 196400 | 0.0008 | - |
| 14.6703 | 196450 | 0.0002 | - |
| 14.6740 | 196500 | 0.0001 | - |
| 14.6778 | 196550 | 0.0002 | - |
| 14.6815 | 196600 | 0.0002 | - |
| 14.6852 | 196650 | 0.0004 | - |
| 14.6890 | 196700 | 0.0002 | - |
| 14.6927 | 196750 | 0.0001 | - |
| 14.6964 | 196800 | 0.0003 | - |
| 14.7002 | 196850 | 0.0002 | - |
| 14.7039 | 196900 | 0.0002 | - |
| 14.7076 | 196950 | 0.0002 | - |
| 14.7114 | 197000 | 0.0 | - |
| 14.7151 | 197050 | 0.0006 | - |
| 14.7188 | 197100 | 0.0 | - |
| 14.7226 | 197150 | 0.0008 | - |
| 14.7263 | 197200 | 0.0001 | - |
| 14.7300 | 197250 | 0.0002 | - |
| 14.7338 | 197300 | 0.0 | - |
| 14.7375 | 197350 | 0.0001 | - |
| 14.7412 | 197400 | 0.0003 | - |
| 14.7450 | 197450 | 0.0006 | - |
| 14.7487 | 197500 | 0.0002 | - |
| 14.7524 | 197550 | 0.0003 | - |
| 14.7562 | 197600 | 0.0002 | - |
| 14.7599 | 197650 | 0.0001 | - |
| 14.7636 | 197700 | 0.0 | - |
| 14.7674 | 197750 | 0.0003 | - |
| 14.7711 | 197800 | 0.0 | - |
| 14.7748 | 197850 | 0.0002 | - |
| 14.7786 | 197900 | 0.0002 | - |
| 14.7823 | 197950 | 0.0 | - |
| 14.7861 | 198000 | 0.0005 | - |
| 14.7898 | 198050 | 0.0006 | - |
| 14.7935 | 198100 | 0.0001 | - |
| 14.7973 | 198150 | 0.0001 | - |
| 14.8010 | 198200 | 0.0003 | - |
| 14.8047 | 198250 | 0.0002 | - |
| 14.8085 | 198300 | 0.0003 | - |
| 14.8122 | 198350 | 0.0 | - |
| 14.8159 | 198400 | 0.0001 | - |
| 14.8197 | 198450 | 0.0 | - |
| 14.8234 | 198500 | 0.0 | - |
| 14.8271 | 198550 | 0.0002 | - |
| 14.8309 | 198600 | 0.0002 | - |
| 14.8346 | 198650 | 0.0 | - |
| 14.8383 | 198700 | 0.0003 | - |
| 14.8421 | 198750 | 0.0005 | - |
| 14.8458 | 198800 | 0.0002 | - |
| 14.8495 | 198850 | 0.0002 | - |
| 14.8533 | 198900 | 0.0001 | - |
| 14.8570 | 198950 | 0.0002 | - |
| 14.8607 | 199000 | 0.0003 | - |
| 14.8645 | 199050 | 0.0 | - |
| 14.8682 | 199100 | 0.0 | - |
| 14.8719 | 199150 | 0.0002 | - |
| 14.8757 | 199200 | 0.0006 | - |
| 14.8794 | 199250 | 0.0003 | - |
| 14.8831 | 199300 | 0.0 | - |
| 14.8869 | 199350 | 0.0 | - |
| 14.8906 | 199400 | 0.0003 | - |
| 14.8943 | 199450 | 0.0002 | - |
| 14.8981 | 199500 | 0.0003 | - |
| 14.9018 | 199550 | 0.0 | - |
| 14.9055 | 199600 | 0.0 | - |
| 14.9093 | 199650 | 0.0002 | - |
| 14.9130 | 199700 | 0.0003 | - |
| 14.9167 | 199750 | 0.0002 | - |
| 14.9205 | 199800 | 0.0 | - |
| 14.9242 | 199850 | 0.0001 | - |
| 14.9279 | 199900 | 0.0003 | - |
| 14.9317 | 199950 | 0.0005 | - |
| 14.9354 | 200000 | 0.0 | - |
| 14.9391 | 200050 | 0.0003 | - |
| 14.9429 | 200100 | 0.0 | - |
| 14.9466 | 200150 | 0.0001 | - |
| 14.9503 | 200200 | 0.0003 | - |
| 14.9541 | 200250 | 0.0005 | - |
| 14.9578 | 200300 | 0.0002 | - |
| 14.9615 | 200350 | 0.0003 | - |
| 14.9653 | 200400 | 0.0002 | - |
| 14.9690 | 200450 | 0.0 | - |
| 14.9727 | 200500 | 0.0003 | - |
| 14.9765 | 200550 | 0.0 | - |
| 14.9802 | 200600 | 0.0 | - |
| 14.9839 | 200650 | 0.0001 | - |
| 14.9877 | 200700 | 0.0003 | - |
| 14.9914 | 200750 | 0.0001 | - |
| 14.9951 | 200800 | 0.0003 | - |
| 14.9989 | 200850 | 0.0002 | - |
| 15.0026 | 200900 | 0.0001 | - |
| 15.0063 | 200950 | 0.0006 | - |
| 15.0101 | 201000 | 0.0001 | - |
| 15.0138 | 201050 | 0.0004 | - |
| 15.0175 | 201100 | 0.0 | - |
| 15.0213 | 201150 | 0.0003 | - |
| 15.0250 | 201200 | 0.0 | - |
| 15.0288 | 201250 | 0.0003 | - |
| 15.0325 | 201300 | 0.0002 | - |
| 15.0362 | 201350 | 0.0003 | - |
| 15.0400 | 201400 | 0.0002 | - |
| 15.0437 | 201450 | 0.0002 | - |
| 15.0474 | 201500 | 0.0002 | - |
| 15.0512 | 201550 | 0.0002 | - |
| 15.0549 | 201600 | 0.0001 | - |
| 15.0586 | 201650 | 0.0009 | - |
| 15.0624 | 201700 | 0.0 | - |
| 15.0661 | 201750 | 0.0002 | - |
| 15.0698 | 201800 | 0.0004 | - |
| 15.0736 | 201850 | 0.0005 | - |
| 15.0773 | 201900 | 0.0002 | - |
| 15.0810 | 201950 | 0.0002 | - |
| 15.0848 | 202000 | 0.0005 | - |
| 15.0885 | 202050 | 0.0002 | - |
| 15.0922 | 202100 | 0.0002 | - |
| 15.0960 | 202150 | 0.0003 | - |
| 15.0997 | 202200 | 0.0002 | - |
| 15.1034 | 202250 | 0.0002 | - |
| 15.1072 | 202300 | 0.0001 | - |
| 15.1109 | 202350 | 0.0005 | - |
| 15.1146 | 202400 | 0.0003 | - |
| 15.1184 | 202450 | 0.0002 | - |
| 15.1221 | 202500 | 0.0005 | - |
| 15.1258 | 202550 | 0.0 | - |
| 15.1296 | 202600 | 0.0002 | - |
| 15.1333 | 202650 | 0.0003 | - |
| 15.1370 | 202700 | 0.0002 | - |
| 15.1408 | 202750 | 0.0002 | - |
| 15.1445 | 202800 | 0.0003 | - |
| 15.1482 | 202850 | 0.0005 | - |
| 15.1520 | 202900 | 0.0002 | - |
| 15.1557 | 202950 | 0.0 | - |
| 15.1594 | 203000 | 0.0002 | - |
| 15.1632 | 203050 | 0.0 | - |
| 15.1669 | 203100 | 0.0 | - |
| 15.1706 | 203150 | 0.0 | - |
| 15.1744 | 203200 | 0.0 | - |
| 15.1781 | 203250 | 0.0 | - |
| 15.1818 | 203300 | 0.0 | - |
| 15.1856 | 203350 | 0.0002 | - |
| 15.1893 | 203400 | 0.0002 | - |
| 15.1930 | 203450 | 0.0 | - |
| 15.1968 | 203500 | 0.0002 | - |
| 15.2005 | 203550 | 0.0002 | - |
| 15.2042 | 203600 | 0.0003 | - |
| 15.2080 | 203650 | 0.0002 | - |
| 15.2117 | 203700 | 0.0004 | - |
| 15.2154 | 203750 | 0.0 | - |
| 15.2192 | 203800 | 0.0004 | - |
| 15.2229 | 203850 | 0.0003 | - |
| 15.2266 | 203900 | 0.0001 | - |
| 15.2304 | 203950 | 0.0002 | - |
| 15.2341 | 204000 | 0.0003 | - |
| 15.2378 | 204050 | 0.0001 | - |
| 15.2416 | 204100 | 0.0002 | - |
| 15.2453 | 204150 | 0.0003 | - |
| 15.2490 | 204200 | 0.0002 | - |
| 15.2528 | 204250 | 0.0 | - |
| 15.2565 | 204300 | 0.0002 | - |
| 15.2602 | 204350 | 0.0002 | - |
| 15.2640 | 204400 | 0.0 | - |
| 15.2677 | 204450 | 0.0002 | - |
| 15.2715 | 204500 | 0.0 | - |
| 15.2752 | 204550 | 0.0 | - |
| 15.2789 | 204600 | 0.0002 | - |
| 15.2827 | 204650 | 0.0002 | - |
| 15.2864 | 204700 | 0.0002 | - |
| 15.2901 | 204750 | 0.0005 | - |
| 15.2939 | 204800 | 0.0 | - |
| 15.2976 | 204850 | 0.0 | - |
| 15.3013 | 204900 | 0.0001 | - |
| 15.3051 | 204950 | 0.0 | - |
| 15.3088 | 205000 | 0.0003 | - |
| 15.3125 | 205050 | 0.0002 | - |
| 15.3163 | 205100 | 0.0002 | - |
| 15.3200 | 205150 | 0.0002 | - |
| 15.3237 | 205200 | 0.0 | - |
| 15.3275 | 205250 | 0.0002 | - |
| 15.3312 | 205300 | 0.0 | - |
| 15.3349 | 205350 | 0.0003 | - |
| 15.3387 | 205400 | 0.0001 | - |
| 15.3424 | 205450 | 0.0 | - |
| 15.3461 | 205500 | 0.0003 | - |
| 15.3499 | 205550 | 0.0 | - |
| 15.3536 | 205600 | 0.0002 | - |
| 15.3573 | 205650 | 0.0 | - |
| 15.3611 | 205700 | 0.0002 | - |
| 15.3648 | 205750 | 0.0001 | - |
| 15.3685 | 205800 | 0.0 | - |
| 15.3723 | 205850 | 0.0001 | - |
| 15.3760 | 205900 | 0.0 | - |
| 15.3797 | 205950 | 0.0 | - |
| 15.3835 | 206000 | 0.0 | - |
| 15.3872 | 206050 | 0.0002 | - |
| 15.3909 | 206100 | 0.0 | - |
| 15.3947 | 206150 | 0.0 | - |
| 15.3984 | 206200 | 0.0 | - |
| 15.4021 | 206250 | 0.0002 | - |
| 15.4059 | 206300 | 0.0 | - |
| 15.4096 | 206350 | 0.0002 | - |
| 15.4133 | 206400 | 0.0 | - |
| 15.4171 | 206450 | 0.0003 | - |
| 15.4208 | 206500 | 0.0001 | - |
| 15.4245 | 206550 | 0.0002 | - |
| 15.4283 | 206600 | 0.0004 | - |
| 15.4320 | 206650 | 0.0004 | - |
| 15.4357 | 206700 | 0.0 | - |
| 15.4395 | 206750 | 0.0002 | - |
| 15.4432 | 206800 | 0.0005 | - |
| 15.4469 | 206850 | 0.0004 | - |
| 15.4507 | 206900 | 0.0006 | - |
| 15.4544 | 206950 | 0.0004 | - |
| 15.4581 | 207000 | 0.0001 | - |
| 15.4619 | 207050 | 0.0001 | - |
| 15.4656 | 207100 | 0.0003 | - |
| 15.4693 | 207150 | 0.0001 | - |
| 15.4731 | 207200 | 0.0002 | - |
| 15.4768 | 207250 | 0.0001 | - |
| 15.4805 | 207300 | 0.0003 | - |
| 15.4843 | 207350 | 0.0001 | - |
| 15.4880 | 207400 | 0.0007 | - |
| 15.4917 | 207450 | 0.0002 | - |
| 15.4955 | 207500 | 0.0002 | - |
| 15.4992 | 207550 | 0.0001 | - |
| 15.5029 | 207600 | 0.0 | - |
| 15.5067 | 207650 | 0.0005 | - |
| 15.5104 | 207700 | 0.0002 | - |
| 15.5142 | 207750 | 0.0006 | - |
| 15.5179 | 207800 | 0.0001 | - |
| 15.5216 | 207850 | 0.0003 | - |
| 15.5254 | 207900 | 0.0004 | - |
| 15.5291 | 207950 | 0.0004 | - |
| 15.5328 | 208000 | 0.0002 | - |
| 15.5366 | 208050 | 0.0 | - |
| 15.5403 | 208100 | 0.0001 | - |
| 15.5440 | 208150 | 0.0006 | - |
| 15.5478 | 208200 | 0.0008 | - |
| 15.5515 | 208250 | 0.0002 | - |
| 15.5552 | 208300 | 0.0001 | - |
| 15.5590 | 208350 | 0.0007 | - |
| 15.5627 | 208400 | 0.0001 | - |
| 15.5664 | 208450 | 0.0002 | - |
| 15.5702 | 208500 | 0.0001 | - |
| 15.5739 | 208550 | 0.0004 | - |
| 15.5776 | 208600 | 0.0003 | - |
| 15.5814 | 208650 | 0.0003 | - |
| 15.5851 | 208700 | 0.0002 | - |
| 15.5888 | 208750 | 0.0004 | - |
| 15.5926 | 208800 | 0.0002 | - |
| 15.5963 | 208850 | 0.0001 | - |
| 15.6000 | 208900 | 0.0003 | - |
| 15.6038 | 208950 | 0.0002 | - |
| 15.6075 | 209000 | 0.0004 | - |
| 15.6112 | 209050 | 0.0002 | - |
| 15.6150 | 209100 | 0.0004 | - |
| 15.6187 | 209150 | 0.0004 | - |
| 15.6224 | 209200 | 0.0002 | - |
| 15.6262 | 209250 | 0.0005 | - |
| 15.6299 | 209300 | 0.0002 | - |
| 15.6336 | 209350 | 0.0003 | - |
| 15.6374 | 209400 | 0.0005 | - |
| 15.6411 | 209450 | 0.0007 | - |
| 15.6448 | 209500 | 0.0004 | - |
| 15.6486 | 209550 | 0.0003 | - |
| 15.6523 | 209600 | 0.0002 | - |
| 15.6560 | 209650 | 0.0 | - |
| 15.6598 | 209700 | 0.0006 | - |
| 15.6635 | 209750 | 0.0 | - |
| 15.6672 | 209800 | 0.0005 | - |
| 15.6710 | 209850 | 0.0002 | - |
| 15.6747 | 209900 | 0.0002 | - |
| 15.6784 | 209950 | 0.0 | - |
| 15.6822 | 210000 | 0.0003 | - |
| 15.6859 | 210050 | 0.0 | - |
| 15.6896 | 210100 | 0.0002 | - |
| 15.6934 | 210150 | 0.0 | - |
| 15.6971 | 210200 | 0.0003 | - |
| 15.7008 | 210250 | 0.0003 | - |
| 15.7046 | 210300 | 0.0003 | - |
| 15.7083 | 210350 | 0.0002 | - |
| 15.7120 | 210400 | 0.0002 | - |
| 15.7158 | 210450 | 0.0 | - |
| 15.7195 | 210500 | 0.0005 | - |
| 15.7232 | 210550 | 0.0002 | - |
| 15.7270 | 210600 | 0.0002 | - |
| 15.7307 | 210650 | 0.0005 | - |
| 15.7344 | 210700 | 0.0002 | - |
| 15.7382 | 210750 | 0.0002 | - |
| 15.7419 | 210800 | 0.0001 | - |
| 15.7457 | 210850 | 0.0002 | - |
| 15.7494 | 210900 | 0.0003 | - |
| 15.7531 | 210950 | 0.0002 | - |
| 15.7569 | 211000 | 0.0005 | - |
| 15.7606 | 211050 | 0.0002 | - |
| 15.7643 | 211100 | 0.0004 | - |
| 15.7681 | 211150 | 0.0001 | - |
| 15.7718 | 211200 | 0.0003 | - |
| 15.7755 | 211250 | 0.0 | - |
| 15.7793 | 211300 | 0.0002 | - |
| 15.7830 | 211350 | 0.0003 | - |
| 15.7867 | 211400 | 0.0003 | - |
| 15.7905 | 211450 | 0.0 | - |
| 15.7942 | 211500 | 0.0 | - |
| 15.7979 | 211550 | 0.0 | - |
| 15.8017 | 211600 | 0.0002 | - |
| 15.8054 | 211650 | 0.0 | - |
| 15.8091 | 211700 | 0.0 | - |
| 15.8129 | 211750 | 0.0001 | - |
| 15.8166 | 211800 | 0.0002 | - |
| 15.8203 | 211850 | 0.0003 | - |
| 15.8241 | 211900 | 0.0003 | - |
| 15.8278 | 211950 | 0.0003 | - |
| 15.8315 | 212000 | 0.0 | - |
| 15.8353 | 212050 | 0.0 | - |
| 15.8390 | 212100 | 0.0003 | - |
| 15.8427 | 212150 | 0.0 | - |
| 15.8465 | 212200 | 0.0 | - |
| 15.8502 | 212250 | 0.0002 | - |
| 15.8539 | 212300 | 0.0002 | - |
| 15.8577 | 212350 | 0.0002 | - |
| 15.8614 | 212400 | 0.0003 | - |
| 15.8651 | 212450 | 0.0003 | - |
| 15.8689 | 212500 | 0.0 | - |
| 15.8726 | 212550 | 0.0001 | - |
| 15.8763 | 212600 | 0.0002 | - |
| 15.8801 | 212650 | 0.0005 | - |
| 15.8838 | 212700 | 0.0002 | - |
| 15.8875 | 212750 | 0.0002 | - |
| 15.8913 | 212800 | 0.0002 | - |
| 15.8950 | 212850 | 0.0002 | - |
| 15.8987 | 212900 | 0.0 | - |
| 15.9025 | 212950 | 0.0003 | - |
| 15.9062 | 213000 | 0.0 | - |
| 15.9099 | 213050 | 0.0002 | - |
| 15.9137 | 213100 | 0.0002 | - |
| 15.9174 | 213150 | 0.0 | - |
| 15.9211 | 213200 | 0.0 | - |
| 15.9249 | 213250 | 0.0002 | - |
| 15.9286 | 213300 | 0.0002 | - |
| 15.9323 | 213350 | 0.0002 | - |
| 15.9361 | 213400 | 0.0003 | - |
| 15.9398 | 213450 | 0.0002 | - |
| 15.9435 | 213500 | 0.0 | - |
| 15.9473 | 213550 | 0.0002 | - |
| 15.9510 | 213600 | 0.0002 | - |
| 15.9547 | 213650 | 0.0003 | - |
| 15.9585 | 213700 | 0.0 | - |
| 15.9622 | 213750 | 0.0003 | - |
| 15.9659 | 213800 | 0.0003 | - |
| 15.9697 | 213850 | 0.0009 | - |
| 15.9734 | 213900 | 0.0004 | - |
| 15.9771 | 213950 | 0.0008 | - |
| 15.9809 | 214000 | 0.0007 | - |
| 15.9846 | 214050 | 0.0003 | - |
| 15.9884 | 214100 | 0.0004 | - |
| 15.9921 | 214150 | 0.0002 | - |
| 15.9958 | 214200 | 0.0 | - |
| 15.9996 | 214250 | 0.0003 | - |
| 16.0033 | 214300 | 0.0001 | - |
| 16.0070 | 214350 | 0.0004 | - |
| 16.0108 | 214400 | 0.0 | - |
| 16.0145 | 214450 | 0.0002 | - |
| 16.0182 | 214500 | 0.0 | - |
| 16.0220 | 214550 | 0.0005 | - |
| 16.0257 | 214600 | 0.0005 | - |
| 16.0294 | 214650 | 0.0002 | - |
| 16.0332 | 214700 | 0.0003 | - |
| 16.0369 | 214750 | 0.0 | - |
| 16.0406 | 214800 | 0.0002 | - |
| 16.0444 | 214850 | 0.0009 | - |
| 16.0481 | 214900 | 0.0 | - |
| 16.0518 | 214950 | 0.0002 | - |
| 16.0556 | 215000 | 0.0003 | - |
| 16.0593 | 215050 | 0.0004 | - |
| 16.0630 | 215100 | 0.0007 | - |
| 16.0668 | 215150 | 0.0002 | - |
| 16.0705 | 215200 | 0.0002 | - |
| 16.0742 | 215250 | 0.0 | - |
| 16.0780 | 215300 | 0.0001 | - |
| 16.0817 | 215350 | 0.0 | - |
| 16.0854 | 215400 | 0.0002 | - |
| 16.0892 | 215450 | 0.0 | - |
| 16.0929 | 215500 | 0.0 | - |
| 16.0966 | 215550 | 0.0001 | - |
| 16.1004 | 215600 | 0.0003 | - |
| 16.1041 | 215650 | 0.0003 | - |
| 16.1078 | 215700 | 0.0001 | - |
| 16.1116 | 215750 | 0.0 | - |
| 16.1153 | 215800 | 0.0002 | - |
| 16.1190 | 215850 | 0.0003 | - |
| 16.1228 | 215900 | 0.0002 | - |
| 16.1265 | 215950 | 0.0 | - |
| 16.1302 | 216000 | 0.0005 | - |
| 16.1340 | 216050 | 0.0002 | - |
| 16.1377 | 216100 | 0.0003 | - |
| 16.1414 | 216150 | 0.0002 | - |
| 16.1452 | 216200 | 0.0001 | - |
| 16.1489 | 216250 | 0.0 | - |
| 16.1526 | 216300 | 0.0004 | - |
| 16.1564 | 216350 | 0.0001 | - |
| 16.1601 | 216400 | 0.0 | - |
| 16.1638 | 216450 | 0.0002 | - |
| 16.1676 | 216500 | 0.0 | - |
| 16.1713 | 216550 | 0.0003 | - |
| 16.1750 | 216600 | 0.0002 | - |
| 16.1788 | 216650 | 0.0003 | - |
| 16.1825 | 216700 | 0.0 | - |
| 16.1862 | 216750 | 0.0003 | - |
| 16.1900 | 216800 | 0.0001 | - |
| 16.1937 | 216850 | 0.0 | - |
| 16.1974 | 216900 | 0.0002 | - |
| 16.2012 | 216950 | 0.0005 | - |
| 16.2049 | 217000 | 0.0 | - |
| 16.2086 | 217050 | 0.0002 | - |
| 16.2124 | 217100 | 0.0002 | - |
| 16.2161 | 217150 | 0.0 | - |
| 16.2198 | 217200 | 0.0002 | - |
| 16.2236 | 217250 | 0.0 | - |
| 16.2273 | 217300 | 0.0001 | - |
| 16.2311 | 217350 | 0.0 | - |
| 16.2348 | 217400 | 0.0005 | - |
| 16.2385 | 217450 | 0.0003 | - |
| 16.2423 | 217500 | 0.0 | - |
| 16.2460 | 217550 | 0.0002 | - |
| 16.2497 | 217600 | 0.0002 | - |
| 16.2535 | 217650 | 0.0 | - |
| 16.2572 | 217700 | 0.0003 | - |
| 16.2609 | 217750 | 0.0002 | - |
| 16.2647 | 217800 | 0.0002 | - |
| 16.2684 | 217850 | 0.0 | - |
| 16.2721 | 217900 | 0.0 | - |
| 16.2759 | 217950 | 0.0003 | - |
| 16.2796 | 218000 | 0.0003 | - |
| 16.2833 | 218050 | 0.0006 | - |
| 16.2871 | 218100 | 0.0004 | - |
| 16.2908 | 218150 | 0.0002 | - |
| 16.2945 | 218200 | 0.0004 | - |
| 16.2983 | 218250 | 0.0002 | - |
| 16.3020 | 218300 | 0.0004 | - |
| 16.3057 | 218350 | 0.0004 | - |
| 16.3095 | 218400 | 0.0003 | - |
| 16.3132 | 218450 | 0.0 | - |
| 16.3169 | 218500 | 0.0002 | - |
| 16.3207 | 218550 | 0.0005 | - |
| 16.3244 | 218600 | 0.0004 | - |
| 16.3281 | 218650 | 0.0004 | - |
| 16.3319 | 218700 | 0.0001 | - |
| 16.3356 | 218750 | 0.0002 | - |
| 16.3393 | 218800 | 0.0002 | - |
| 16.3431 | 218850 | 0.0002 | - |
| 16.3468 | 218900 | 0.0002 | - |
| 16.3505 | 218950 | 0.0002 | - |
| 16.3543 | 219000 | 0.0003 | - |
| 16.3580 | 219050 | 0.0007 | - |
| 16.3617 | 219100 | 0.0002 | - |
| 16.3655 | 219150 | 0.0001 | - |
| 16.3692 | 219200 | 0.0003 | - |
| 16.3729 | 219250 | 0.0002 | - |
| 16.3767 | 219300 | 0.0 | - |
| 16.3804 | 219350 | 0.0003 | - |
| 16.3841 | 219400 | 0.0002 | - |
| 16.3879 | 219450 | 0.0005 | - |
| 16.3916 | 219500 | 0.0 | - |
| 16.3953 | 219550 | 0.0003 | - |
| 16.3991 | 219600 | 0.0003 | - |
| 16.4028 | 219650 | 0.0 | - |
| 16.4065 | 219700 | 0.0003 | - |
| 16.4103 | 219750 | 0.0001 | - |
| 16.4140 | 219800 | 0.0 | - |
| 16.4177 | 219850 | 0.0002 | - |
| 16.4215 | 219900 | 0.0 | - |
| 16.4252 | 219950 | 0.0002 | - |
| 16.4289 | 220000 | 0.0001 | - |
| 16.4327 | 220050 | 0.0003 | - |
| 16.4364 | 220100 | 0.0002 | - |
| 16.4401 | 220150 | 0.0002 | - |
| 16.4439 | 220200 | 0.0 | - |
| 16.4476 | 220250 | 0.0006 | - |
| 16.4513 | 220300 | 0.0 | - |
| 16.4551 | 220350 | 0.0 | - |
| 16.4588 | 220400 | 0.0002 | - |
| 16.4625 | 220450 | 0.0004 | - |
| 16.4663 | 220500 | 0.0002 | - |
| 16.4700 | 220550 | 0.0 | - |
| 16.4738 | 220600 | 0.0002 | - |
| 16.4775 | 220650 | 0.0 | - |
| 16.4812 | 220700 | 0.0001 | - |
| 16.4850 | 220750 | 0.0002 | - |
| 16.4887 | 220800 | 0.0003 | - |
| 16.4924 | 220850 | 0.0002 | - |
| 16.4962 | 220900 | 0.0 | - |
| 16.4999 | 220950 | 0.0002 | - |
| 16.5036 | 221000 | 0.0 | - |
| 16.5074 | 221050 | 0.0002 | - |
| 16.5111 | 221100 | 0.0 | - |
| 16.5148 | 221150 | 0.0 | - |
| 16.5186 | 221200 | 0.0 | - |
| 16.5223 | 221250 | 0.0002 | - |
| 16.5260 | 221300 | 0.0 | - |
| 16.5298 | 221350 | 0.0 | - |
| 16.5335 | 221400 | 0.0001 | - |
| 16.5372 | 221450 | 0.0002 | - |
| 16.5410 | 221500 | 0.0 | - |
| 16.5447 | 221550 | 0.0001 | - |
| 16.5484 | 221600 | 0.0002 | - |
| 16.5522 | 221650 | 0.0003 | - |
| 16.5559 | 221700 | 0.0004 | - |
| 16.5596 | 221750 | 0.0 | - |
| 16.5634 | 221800 | 0.0002 | - |
| 16.5671 | 221850 | 0.0002 | - |
| 16.5708 | 221900 | 0.0 | - |
| 16.5746 | 221950 | 0.0002 | - |
| 16.5783 | 222000 | 0.0003 | - |
| 16.5820 | 222050 | 0.0002 | - |
| 16.5858 | 222100 | 0.0003 | - |
| 16.5895 | 222150 | 0.0003 | - |
| 16.5932 | 222200 | 0.0003 | - |
| 16.5970 | 222250 | 0.0004 | - |
| 16.6007 | 222300 | 0.0001 | - |
| 16.6044 | 222350 | 0.0 | - |
| 16.6082 | 222400 | 0.0005 | - |
| 16.6119 | 222450 | 0.0001 | - |
| 16.6156 | 222500 | 0.0002 | - |
| 16.6194 | 222550 | 0.0006 | - |
| 16.6231 | 222600 | 0.0003 | - |
| 16.6268 | 222650 | 0.0005 | - |
| 16.6306 | 222700 | 0.0 | - |
| 16.6343 | 222750 | 0.0001 | - |
| 16.6380 | 222800 | 0.0002 | - |
| 16.6418 | 222850 | 0.0002 | - |
| 16.6455 | 222900 | 0.0 | - |
| 16.6492 | 222950 | 0.0 | - |
| 16.6530 | 223000 | 0.0 | - |
| 16.6567 | 223050 | 0.0001 | - |
| 16.6604 | 223100 | 0.0004 | - |
| 16.6642 | 223150 | 0.0005 | - |
| 16.6679 | 223200 | 0.0002 | - |
| 16.6716 | 223250 | 0.0002 | - |
| 16.6754 | 223300 | 0.0 | - |
| 16.6791 | 223350 | 0.0 | - |
| 16.6828 | 223400 | 0.0 | - |
| 16.6866 | 223450 | 0.0005 | - |
| 16.6903 | 223500 | 0.0 | - |
| 16.6940 | 223550 | 0.0 | - |
| 16.6978 | 223600 | 0.0002 | - |
| 16.7015 | 223650 | 0.0 | - |
| 16.7052 | 223700 | 0.0002 | - |
| 16.7090 | 223750 | 0.0 | - |
| 16.7127 | 223800 | 0.0003 | - |
| 16.7165 | 223850 | 0.0007 | - |
| 16.7202 | 223900 | 0.0 | - |
| 16.7239 | 223950 | 0.0001 | - |
| 16.7277 | 224000 | 0.0002 | - |
| 16.7314 | 224050 | 0.0003 | - |
| 16.7351 | 224100 | 0.0003 | - |
| 16.7389 | 224150 | 0.0 | - |
| 16.7426 | 224200 | 0.0 | - |
| 16.7463 | 224250 | 0.0004 | - |
| 16.7501 | 224300 | 0.0002 | - |
| 16.7538 | 224350 | 0.0002 | - |
| 16.7575 | 224400 | 0.0 | - |
| 16.7613 | 224450 | 0.0 | - |
| 16.7650 | 224500 | 0.0 | - |
| 16.7687 | 224550 | 0.0002 | - |
| 16.7725 | 224600 | 0.0002 | - |
| 16.7762 | 224650 | 0.0004 | - |
| 16.7799 | 224700 | 0.0005 | - |
| 16.7837 | 224750 | 0.0003 | - |
| 16.7874 | 224800 | 0.0 | - |
| 16.7911 | 224850 | 0.0002 | - |
| 16.7949 | 224900 | 0.0002 | - |
| 16.7986 | 224950 | 0.0001 | - |
| 16.8023 | 225000 | 0.0005 | - |
| 16.8061 | 225050 | 0.0005 | - |
| 16.8098 | 225100 | 0.0 | - |
| 16.8135 | 225150 | 0.0004 | - |
| 16.8173 | 225200 | 0.0 | - |
| 16.8210 | 225250 | 0.0004 | - |
| 16.8247 | 225300 | 0.0002 | - |
| 16.8285 | 225350 | 0.0 | - |
| 16.8322 | 225400 | 0.0 | - |
| 16.8359 | 225450 | 0.0002 | - |
| 16.8397 | 225500 | 0.0002 | - |
| 16.8434 | 225550 | 0.0003 | - |
| 16.8471 | 225600 | 0.0003 | - |
| 16.8509 | 225650 | 0.0004 | - |
| 16.8546 | 225700 | 0.0 | - |
| 16.8583 | 225750 | 0.0 | - |
| 16.8621 | 225800 | 0.0004 | - |
| 16.8658 | 225850 | 0.0003 | - |
| 16.8695 | 225900 | 0.0 | - |
| 16.8733 | 225950 | 0.0001 | - |
| 16.8770 | 226000 | 0.0 | - |
| 16.8807 | 226050 | 0.0001 | - |
| 16.8845 | 226100 | 0.0 | - |
| 16.8882 | 226150 | 0.0 | - |
| 16.8919 | 226200 | 0.0001 | - |
| 16.8957 | 226250 | 0.0 | - |
| 16.8994 | 226300 | 0.0002 | - |
| 16.9031 | 226350 | 0.0 | - |
| 16.9069 | 226400 | 0.0002 | - |
| 16.9106 | 226450 | 0.0002 | - |
| 16.9143 | 226500 | 0.0001 | - |
| 16.9181 | 226550 | 0.0002 | - |
| 16.9218 | 226600 | 0.0005 | - |
| 16.9255 | 226650 | 0.0 | - |
| 16.9293 | 226700 | 0.0002 | - |
| 16.9330 | 226750 | 0.0001 | - |
| 16.9367 | 226800 | 0.0002 | - |
| 16.9405 | 226850 | 0.0003 | - |
| 16.9442 | 226900 | 0.0 | - |
| 16.9480 | 226950 | 0.0003 | - |
| 16.9517 | 227000 | 0.0001 | - |
| 16.9554 | 227050 | 0.0 | - |
| 16.9592 | 227100 | 0.0 | - |
| 16.9629 | 227150 | 0.0 | - |
| 16.9666 | 227200 | 0.0 | - |
| 16.9704 | 227250 | 0.0002 | - |
| 16.9741 | 227300 | 0.0004 | - |
| 16.9778 | 227350 | 0.0002 | - |
| 16.9816 | 227400 | 0.0 | - |
| 16.9853 | 227450 | 0.0 | - |
| 16.9890 | 227500 | 0.0 | - |
| 16.9928 | 227550 | 0.0002 | - |
| 16.9965 | 227600 | 0.0 | - |
| 17.0002 | 227650 | 0.0003 | - |
| 17.0040 | 227700 | 0.0005 | - |
| 17.0077 | 227750 | 0.0 | - |
| 17.0114 | 227800 | 0.0 | - |
| 17.0152 | 227850 | 0.0003 | - |
| 17.0189 | 227900 | 0.0003 | - |
| 17.0226 | 227950 | 0.0002 | - |
| 17.0264 | 228000 | 0.0002 | - |
| 17.0301 | 228050 | 0.0002 | - |
| 17.0338 | 228100 | 0.0003 | - |
| 17.0376 | 228150 | 0.0002 | - |
| 17.0413 | 228200 | 0.0002 | - |
| 17.0450 | 228250 | 0.0 | - |
| 17.0488 | 228300 | 0.0001 | - |
| 17.0525 | 228350 | 0.0001 | - |
| 17.0562 | 228400 | 0.0 | - |
| 17.0600 | 228450 | 0.0 | - |
| 17.0637 | 228500 | 0.0002 | - |
| 17.0674 | 228550 | 0.0 | - |
| 17.0712 | 228600 | 0.0 | - |
| 17.0749 | 228650 | 0.0002 | - |
| 17.0786 | 228700 | 0.0002 | - |
| 17.0824 | 228750 | 0.0002 | - |
| 17.0861 | 228800 | 0.0002 | - |
| 17.0898 | 228850 | 0.0002 | - |
| 17.0936 | 228900 | 0.0004 | - |
| 17.0973 | 228950 | 0.0002 | - |
| 17.1010 | 229000 | 0.0001 | - |
| 17.1048 | 229050 | 0.0001 | - |
| 17.1085 | 229100 | 0.0001 | - |
| 17.1122 | 229150 | 0.0 | - |
| 17.1160 | 229200 | 0.0002 | - |
| 17.1197 | 229250 | 0.0002 | - |
| 17.1234 | 229300 | 0.0 | - |
| 17.1272 | 229350 | 0.0 | - |
| 17.1309 | 229400 | 0.0 | - |
| 17.1346 | 229450 | 0.0001 | - |
| 17.1384 | 229500 | 0.0003 | - |
| 17.1421 | 229550 | 0.0003 | - |
| 17.1458 | 229600 | 0.0 | - |
| 17.1496 | 229650 | 0.0002 | - |
| 17.1533 | 229700 | 0.0001 | - |
| 17.1570 | 229750 | 0.0 | - |
| 17.1608 | 229800 | 0.0006 | - |
| 17.1645 | 229850 | 0.0 | - |
| 17.1682 | 229900 | 0.0 | - |
| 17.1720 | 229950 | 0.0002 | - |
| 17.1757 | 230000 | 0.0002 | - |
| 17.1794 | 230050 | 0.0 | - |
| 17.1832 | 230100 | 0.0002 | - |
| 17.1869 | 230150 | 0.0002 | - |
| 17.1907 | 230200 | 0.0002 | - |
| 17.1944 | 230250 | 0.0 | - |
| 17.1981 | 230300 | 0.0 | - |
| 17.2019 | 230350 | 0.0001 | - |
| 17.2056 | 230400 | 0.0002 | - |
| 17.2093 | 230450 | 0.0 | - |
| 17.2131 | 230500 | 0.0003 | - |
| 17.2168 | 230550 | 0.0002 | - |
| 17.2205 | 230600 | 0.0002 | - |
| 17.2243 | 230650 | 0.0 | - |
| 17.2280 | 230700 | 0.0003 | - |
| 17.2317 | 230750 | 0.0 | - |
| 17.2355 | 230800 | 0.0002 | - |
| 17.2392 | 230850 | 0.0002 | - |
| 17.2429 | 230900 | 0.0002 | - |
| 17.2467 | 230950 | 0.0002 | - |
| 17.2504 | 231000 | 0.0005 | - |
| 17.2541 | 231050 | 0.0006 | - |
| 17.2579 | 231100 | 0.0003 | - |
| 17.2616 | 231150 | 0.0002 | - |
| 17.2653 | 231200 | 0.0002 | - |
| 17.2691 | 231250 | 0.0001 | - |
| 17.2728 | 231300 | 0.0002 | - |
| 17.2765 | 231350 | 0.0003 | - |
| 17.2803 | 231400 | 0.0 | - |
| 17.2840 | 231450 | 0.0002 | - |
| 17.2877 | 231500 | 0.0 | - |
| 17.2915 | 231550 | 0.0 | - |
| 17.2952 | 231600 | 0.0 | - |
| 17.2989 | 231650 | 0.0005 | - |
| 17.3027 | 231700 | 0.0002 | - |
| 17.3064 | 231750 | 0.0 | - |
| 17.3101 | 231800 | 0.0003 | - |
| 17.3139 | 231850 | 0.0002 | - |
| 17.3176 | 231900 | 0.0002 | - |
| 17.3213 | 231950 | 0.0001 | - |
| 17.3251 | 232000 | 0.0002 | - |
| 17.3288 | 232050 | 0.0 | - |
| 17.3325 | 232100 | 0.0003 | - |
| 17.3363 | 232150 | 0.0 | - |
| 17.3400 | 232200 | 0.0 | - |
| 17.3437 | 232250 | 0.0002 | - |
| 17.3475 | 232300 | 0.0002 | - |
| 17.3512 | 232350 | 0.0 | - |
| 17.3549 | 232400 | 0.0001 | - |
| 17.3587 | 232450 | 0.0001 | - |
| 17.3624 | 232500 | 0.0003 | - |
| 17.3661 | 232550 | 0.0005 | - |
| 17.3699 | 232600 | 0.0 | - |
| 17.3736 | 232650 | 0.0002 | - |
| 17.3773 | 232700 | 0.0001 | - |
| 17.3811 | 232750 | 0.0001 | - |
| 17.3848 | 232800 | 0.0002 | - |
| 17.3885 | 232850 | 0.0003 | - |
| 17.3923 | 232900 | 0.0 | - |
| 17.3960 | 232950 | 0.0002 | - |
| 17.3997 | 233000 | 0.0001 | - |
| 17.4035 | 233050 | 0.0001 | - |
| 17.4072 | 233100 | 0.0003 | - |
| 17.4109 | 233150 | 0.0005 | - |
| 17.4147 | 233200 | 0.0 | - |
| 17.4184 | 233250 | 0.0001 | - |
| 17.4221 | 233300 | 0.0001 | - |
| 17.4259 | 233350 | 0.0002 | - |
| 17.4296 | 233400 | 0.0002 | - |
| 17.4334 | 233450 | 0.0002 | - |
| 17.4371 | 233500 | 0.0004 | - |
| 17.4408 | 233550 | 0.0003 | - |
| 17.4446 | 233600 | 0.0003 | - |
| 17.4483 | 233650 | 0.0011 | - |
| 17.4520 | 233700 | 0.0002 | - |
| 17.4558 | 233750 | 0.0 | - |
| 17.4595 | 233800 | 0.0 | - |
| 17.4632 | 233850 | 0.0002 | - |
| 17.4670 | 233900 | 0.0003 | - |
| 17.4707 | 233950 | 0.0001 | - |
| 17.4744 | 234000 | 0.0001 | - |
| 17.4782 | 234050 | 0.0005 | - |
| 17.4819 | 234100 | 0.0003 | - |
| 17.4856 | 234150 | 0.0002 | - |
| 17.4894 | 234200 | 0.0 | - |
| 17.4931 | 234250 | 0.0003 | - |
| 17.4968 | 234300 | 0.0001 | - |
| 17.5006 | 234350 | 0.0001 | - |
| 17.5043 | 234400 | 0.0002 | - |
| 17.5080 | 234450 | 0.0002 | - |
| 17.5118 | 234500 | 0.0003 | - |
| 17.5155 | 234550 | 0.0004 | - |
| 17.5192 | 234600 | 0.0001 | - |
| 17.5230 | 234650 | 0.0003 | - |
| 17.5267 | 234700 | 0.0003 | - |
| 17.5304 | 234750 | 0.0003 | - |
| 17.5342 | 234800 | 0.0 | - |
| 17.5379 | 234850 | 0.0 | - |
| 17.5416 | 234900 | 0.0004 | - |
| 17.5454 | 234950 | 0.0003 | - |
| 17.5491 | 235000 | 0.0 | - |
| 17.5528 | 235050 | 0.0 | - |
| 17.5566 | 235100 | 0.0 | - |
| 17.5603 | 235150 | 0.0003 | - |
| 17.5640 | 235200 | 0.0 | - |
| 17.5678 | 235250 | 0.0002 | - |
| 17.5715 | 235300 | 0.0 | - |
| 17.5752 | 235350 | 0.0002 | - |
| 17.5790 | 235400 | 0.0002 | - |
| 17.5827 | 235450 | 0.0 | - |
| 17.5864 | 235500 | 0.0 | - |
| 17.5902 | 235550 | 0.0 | - |
| 17.5939 | 235600 | 0.0002 | - |
| 17.5976 | 235650 | 0.0001 | - |
| 17.6014 | 235700 | 0.0002 | - |
| 17.6051 | 235750 | 0.0003 | - |
| 17.6088 | 235800 | 0.0002 | - |
| 17.6126 | 235850 | 0.0003 | - |
| 17.6163 | 235900 | 0.0005 | - |
| 17.6200 | 235950 | 0.0003 | - |
| 17.6238 | 236000 | 0.0 | - |
| 17.6275 | 236050 | 0.0002 | - |
| 17.6312 | 236100 | 0.0002 | - |
| 17.6350 | 236150 | 0.0002 | - |
| 17.6387 | 236200 | 0.0002 | - |
| 17.6424 | 236250 | 0.0 | - |
| 17.6462 | 236300 | 0.0002 | - |
| 17.6499 | 236350 | 0.0 | - |
| 17.6536 | 236400 | 0.0002 | - |
| 17.6574 | 236450 | 0.0002 | - |
| 17.6611 | 236500 | 0.0004 | - |
| 17.6648 | 236550 | 0.0001 | - |
| 17.6686 | 236600 | 0.0003 | - |
| 17.6723 | 236650 | 0.0 | - |
| 17.6761 | 236700 | 0.0003 | - |
| 17.6798 | 236750 | 0.0001 | - |
| 17.6835 | 236800 | 0.0003 | - |
| 17.6873 | 236850 | 0.0002 | - |
| 17.6910 | 236900 | 0.0002 | - |
| 17.6947 | 236950 | 0.0004 | - |
| 17.6985 | 237000 | 0.0002 | - |
| 17.7022 | 237050 | 0.0 | - |
| 17.7059 | 237100 | 0.0002 | - |
| 17.7097 | 237150 | 0.0001 | - |
| 17.7134 | 237200 | 0.0002 | - |
| 17.7171 | 237250 | 0.0003 | - |
| 17.7209 | 237300 | 0.0002 | - |
| 17.7246 | 237350 | 0.0002 | - |
| 17.7283 | 237400 | 0.0003 | - |
| 17.7321 | 237450 | 0.0002 | - |
| 17.7358 | 237500 | 0.0 | - |
| 17.7395 | 237550 | 0.0002 | - |
| 17.7433 | 237600 | 0.0 | - |
| 17.7470 | 237650 | 0.0001 | - |
| 17.7507 | 237700 | 0.0 | - |
| 17.7545 | 237750 | 0.0 | - |
| 17.7582 | 237800 | 0.0002 | - |
| 17.7619 | 237850 | 0.0 | - |
| 17.7657 | 237900 | 0.0003 | - |
| 17.7694 | 237950 | 0.0 | - |
| 17.7731 | 238000 | 0.0002 | - |
| 17.7769 | 238050 | 0.0003 | - |
| 17.7806 | 238100 | 0.0001 | - |
| 17.7843 | 238150 | 0.0002 | - |
| 17.7881 | 238200 | 0.0 | - |
| 17.7918 | 238250 | 0.0002 | - |
| 17.7955 | 238300 | 0.0 | - |
| 17.7993 | 238350 | 0.0 | - |
| 17.8030 | 238400 | 0.0 | - |
| 17.8067 | 238450 | 0.0 | - |
| 17.8105 | 238500 | 0.0 | - |
| 17.8142 | 238550 | 0.0002 | - |
| 17.8179 | 238600 | 0.0002 | - |
| 17.8217 | 238650 | 0.0004 | - |
| 17.8254 | 238700 | 0.0006 | - |
| 17.8291 | 238750 | 0.0002 | - |
| 17.8329 | 238800 | 0.0004 | - |
| 17.8366 | 238850 | 0.0004 | - |
| 17.8403 | 238900 | 0.0002 | - |
| 17.8441 | 238950 | 0.0002 | - |
| 17.8478 | 239000 | 0.0002 | - |
| 17.8515 | 239050 | 0.0 | - |
| 17.8553 | 239100 | 0.0001 | - |
| 17.8590 | 239150 | 0.0 | - |
| 17.8627 | 239200 | 0.0001 | - |
| 17.8665 | 239250 | 0.0001 | - |
| 17.8702 | 239300 | 0.0002 | - |
| 17.8739 | 239350 | 0.0005 | - |
| 17.8777 | 239400 | 0.0006 | - |
| 17.8814 | 239450 | 0.0002 | - |
| 17.8851 | 239500 | 0.0002 | - |
| 17.8889 | 239550 | 0.0 | - |
| 17.8926 | 239600 | 0.0004 | - |
| 17.8963 | 239650 | 0.0003 | - |
| 17.9001 | 239700 | 0.0002 | - |
| 17.9038 | 239750 | 0.0003 | - |
| 17.9075 | 239800 | 0.0 | - |
| 17.9113 | 239850 | 0.0004 | - |
| 17.9150 | 239900 | 0.0 | - |
| 17.9188 | 239950 | 0.0001 | - |
| 17.9225 | 240000 | 0.0 | - |
| 17.9262 | 240050 | 0.0004 | - |
| 17.9300 | 240100 | 0.0002 | - |
| 17.9337 | 240150 | 0.0 | - |
| 17.9374 | 240200 | 0.0 | - |
| 17.9412 | 240250 | 0.0 | - |
| 17.9449 | 240300 | 0.0005 | - |
| 17.9486 | 240350 | 0.0 | - |
| 17.9524 | 240400 | 0.0002 | - |
| 17.9561 | 240450 | 0.0002 | - |
| 17.9598 | 240500 | 0.0003 | - |
| 17.9636 | 240550 | 0.0005 | - |
| 17.9673 | 240600 | 0.0002 | - |
| 17.9710 | 240650 | 0.0006 | - |
| 17.9748 | 240700 | 0.0002 | - |
| 17.9785 | 240750 | 0.0006 | - |
| 17.9822 | 240800 | 0.0 | - |
| 17.9860 | 240850 | 0.0 | - |
| 17.9897 | 240900 | 0.0 | - |
| 17.9934 | 240950 | 0.0002 | - |
| 17.9972 | 241000 | 0.0 | - |
| 18.0009 | 241050 | 0.0002 | - |
| 18.0046 | 241100 | 0.0002 | - |
| 18.0084 | 241150 | 0.0004 | - |
| 18.0121 | 241200 | 0.0004 | - |
| 18.0158 | 241250 | 0.0004 | - |
| 18.0196 | 241300 | 0.0003 | - |
| 18.0233 | 241350 | 0.0001 | - |
| 18.0270 | 241400 | 0.0001 | - |
| 18.0308 | 241450 | 0.0002 | - |
| 18.0345 | 241500 | 0.0002 | - |
| 18.0382 | 241550 | 0.0004 | - |
| 18.0420 | 241600 | 0.0002 | - |
| 18.0457 | 241650 | 0.0002 | - |
| 18.0494 | 241700 | 0.0002 | - |
| 18.0532 | 241750 | 0.0008 | - |
| 18.0569 | 241800 | 0.0 | - |
| 18.0606 | 241850 | 0.0006 | - |
| 18.0644 | 241900 | 0.0004 | - |
| 18.0681 | 241950 | 0.0 | - |
| 18.0718 | 242000 | 0.0003 | - |
| 18.0756 | 242050 | 0.0004 | - |
| 18.0793 | 242100 | 0.0003 | - |
| 18.0830 | 242150 | 0.0005 | - |
| 18.0868 | 242200 | 0.0 | - |
| 18.0905 | 242250 | 0.0002 | - |
| 18.0942 | 242300 | 0.0002 | - |
| 18.0980 | 242350 | 0.0 | - |
| 18.1017 | 242400 | 0.0002 | - |
| 18.1054 | 242450 | 0.0004 | - |
| 18.1092 | 242500 | 0.0001 | - |
| 18.1129 | 242550 | 0.0003 | - |
| 18.1166 | 242600 | 0.0002 | - |
| 18.1204 | 242650 | 0.0002 | - |
| 18.1241 | 242700 | 0.0001 | - |
| 18.1278 | 242750 | 0.0002 | - |
| 18.1316 | 242800 | 0.0002 | - |
| 18.1353 | 242850 | 0.0002 | - |
| 18.1390 | 242900 | 0.0003 | - |
| 18.1428 | 242950 | 0.0002 | - |
| 18.1465 | 243000 | 0.0005 | - |
| 18.1503 | 243050 | 0.0 | - |
| 18.1540 | 243100 | 0.0002 | - |
| 18.1577 | 243150 | 0.0003 | - |
| 18.1615 | 243200 | 0.0003 | - |
| 18.1652 | 243250 | 0.0 | - |
| 18.1689 | 243300 | 0.0006 | - |
| 18.1727 | 243350 | 0.0007 | - |
| 18.1764 | 243400 | 0.0 | - |
| 18.1801 | 243450 | 0.0005 | - |
| 18.1839 | 243500 | 0.0003 | - |
| 18.1876 | 243550 | 0.0001 | - |
| 18.1913 | 243600 | 0.0001 | - |
| 18.1951 | 243650 | 0.0002 | - |
| 18.1988 | 243700 | 0.0003 | - |
| 18.2025 | 243750 | 0.0 | - |
| 18.2063 | 243800 | 0.0002 | - |
| 18.2100 | 243850 | 0.0002 | - |
| 18.2137 | 243900 | 0.0 | - |
| 18.2175 | 243950 | 0.0002 | - |
| 18.2212 | 244000 | 0.0002 | - |
| 18.2249 | 244050 | 0.0001 | - |
| 18.2287 | 244100 | 0.0005 | - |
| 18.2324 | 244150 | 0.0001 | - |
| 18.2361 | 244200 | 0.0002 | - |
| 18.2399 | 244250 | 0.0 | - |
| 18.2436 | 244300 | 0.0 | - |
| 18.2473 | 244350 | 0.0007 | - |
| 18.2511 | 244400 | 0.0001 | - |
| 18.2548 | 244450 | 0.0001 | - |
| 18.2585 | 244500 | 0.0001 | - |
| 18.2623 | 244550 | 0.0006 | - |
| 18.2660 | 244600 | 0.0 | - |
| 18.2697 | 244650 | 0.0003 | - |
| 18.2735 | 244700 | 0.0003 | - |
| 18.2772 | 244750 | 0.0 | - |
| 18.2809 | 244800 | 0.0002 | - |
| 18.2847 | 244850 | 0.0001 | - |
| 18.2884 | 244900 | 0.0002 | - |
| 18.2921 | 244950 | 0.0 | - |
| 18.2959 | 245000 | 0.0003 | - |
| 18.2996 | 245050 | 0.0 | - |
| 18.3033 | 245100 | 0.0003 | - |
| 18.3071 | 245150 | 0.0 | - |
| 18.3108 | 245200 | 0.0 | - |
| 18.3145 | 245250 | 0.0002 | - |
| 18.3183 | 245300 | 0.0003 | - |
| 18.3220 | 245350 | 0.0002 | - |
| 18.3257 | 245400 | 0.0002 | - |
| 18.3295 | 245450 | 0.0001 | - |
| 18.3332 | 245500 | 0.0003 | - |
| 18.3369 | 245550 | 0.0 | - |
| 18.3407 | 245600 | 0.0002 | - |
| 18.3444 | 245650 | 0.0002 | - |
| 18.3481 | 245700 | 0.0004 | - |
| 18.3519 | 245750 | 0.0002 | - |
| 18.3556 | 245800 | 0.0 | - |
| 18.3593 | 245850 | 0.0 | - |
| 18.3631 | 245900 | 0.0 | - |
| 18.3668 | 245950 | 0.0002 | - |
| 18.3705 | 246000 | 0.0001 | - |
| 18.3743 | 246050 | 0.0002 | - |
| 18.3780 | 246100 | 0.0002 | - |
| 18.3817 | 246150 | 0.0002 | - |
| 18.3855 | 246200 | 0.0 | - |
| 18.3892 | 246250 | 0.0002 | - |
| 18.3930 | 246300 | 0.0001 | - |
| 18.3967 | 246350 | 0.0 | - |
| 18.4004 | 246400 | 0.0 | - |
| 18.4042 | 246450 | 0.0 | - |
| 18.4079 | 246500 | 0.0002 | - |
| 18.4116 | 246550 | 0.0 | - |
| 18.4154 | 246600 | 0.0002 | - |
| 18.4191 | 246650 | 0.0 | - |
| 18.4228 | 246700 | 0.0 | - |
| 18.4266 | 246750 | 0.0 | - |
| 18.4303 | 246800 | 0.0 | - |
| 18.4340 | 246850 | 0.0 | - |
| 18.4378 | 246900 | 0.0 | - |
| 18.4415 | 246950 | 0.0 | - |
| 18.4452 | 247000 | 0.0 | - |
| 18.4490 | 247050 | 0.0 | - |
| 18.4527 | 247100 | 0.0002 | - |
| 18.4564 | 247150 | 0.0 | - |
| 18.4602 | 247200 | 0.0 | - |
| 18.4639 | 247250 | 0.0002 | - |
| 18.4676 | 247300 | 0.0002 | - |
| 18.4714 | 247350 | 0.0001 | - |
| 18.4751 | 247400 | 0.0002 | - |
| 18.4788 | 247450 | 0.0002 | - |
| 18.4826 | 247500 | 0.0003 | - |
| 18.4863 | 247550 | 0.0 | - |
| 18.4900 | 247600 | 0.0002 | - |
| 18.4938 | 247650 | 0.0 | - |
| 18.4975 | 247700 | 0.0 | - |
| 18.5012 | 247750 | 0.0 | - |
| 18.5050 | 247800 | 0.0 | - |
| 18.5087 | 247850 | 0.0002 | - |
| 18.5124 | 247900 | 0.0002 | - |
| 18.5162 | 247950 | 0.0 | - |
| 18.5199 | 248000 | 0.0003 | - |
| 18.5236 | 248050 | 0.0003 | - |
| 18.5274 | 248100 | 0.0001 | - |
| 18.5311 | 248150 | 0.0 | - |
| 18.5348 | 248200 | 0.0 | - |
| 18.5386 | 248250 | 0.0 | - |
| 18.5423 | 248300 | 0.0 | - |
| 18.5460 | 248350 | 0.0 | - |
| 18.5498 | 248400 | 0.0003 | - |
| 18.5535 | 248450 | 0.0002 | - |
| 18.5572 | 248500 | 0.0001 | - |
| 18.5610 | 248550 | 0.0 | - |
| 18.5647 | 248600 | 0.0 | - |
| 18.5684 | 248650 | 0.0 | - |
| 18.5722 | 248700 | 0.0003 | - |
| 18.5759 | 248750 | 0.0002 | - |
| 18.5796 | 248800 | 0.0003 | - |
| 18.5834 | 248850 | 0.0006 | - |
| 18.5871 | 248900 | 0.0003 | - |
| 18.5908 | 248950 | 0.0003 | - |
| 18.5946 | 249000 | 0.0 | - |
| 18.5983 | 249050 | 0.0 | - |
| 18.6020 | 249100 | 0.0001 | - |
| 18.6058 | 249150 | 0.0005 | - |
| 18.6095 | 249200 | 0.0 | - |
| 18.6132 | 249250 | 0.0 | - |
| 18.6170 | 249300 | 0.0 | - |
| 18.6207 | 249350 | 0.0001 | - |
| 18.6244 | 249400 | 0.0 | - |
| 18.6282 | 249450 | 0.0 | - |
| 18.6319 | 249500 | 0.0003 | - |
| 18.6357 | 249550 | 0.0003 | - |
| 18.6394 | 249600 | 0.0002 | - |
| 18.6431 | 249650 | 0.0001 | - |
| 18.6469 | 249700 | 0.0002 | - |
| 18.6506 | 249750 | 0.0 | - |
| 18.6543 | 249800 | 0.0006 | - |
| 18.6581 | 249850 | 0.0001 | - |
| 18.6618 | 249900 | 0.0002 | - |
| 18.6655 | 249950 | 0.0 | - |
| 18.6693 | 250000 | 0.0001 | - |
| 18.6730 | 250050 | 0.0001 | - |
| 18.6767 | 250100 | 0.0002 | - |
| 18.6805 | 250150 | 0.0001 | - |
| 18.6842 | 250200 | 0.0004 | - |
| 18.6879 | 250250 | 0.0 | - |
| 18.6917 | 250300 | 0.0002 | - |
| 18.6954 | 250350 | 0.0 | - |
| 18.6991 | 250400 | 0.0002 | - |
| 18.7029 | 250450 | 0.0002 | - |
| 18.7066 | 250500 | 0.0001 | - |
| 18.7103 | 250550 | 0.0 | - |
| 18.7141 | 250600 | 0.0002 | - |
| 18.7178 | 250650 | 0.0001 | - |
| 18.7215 | 250700 | 0.0003 | - |
| 18.7253 | 250750 | 0.0002 | - |
| 18.7290 | 250800 | 0.0 | - |
| 18.7327 | 250850 | 0.0 | - |
| 18.7365 | 250900 | 0.0001 | - |
| 18.7402 | 250950 | 0.0001 | - |
| 18.7439 | 251000 | 0.0 | - |
| 18.7477 | 251050 | 0.0002 | - |
| 18.7514 | 251100 | 0.0007 | - |
| 18.7551 | 251150 | 0.0002 | - |
| 18.7589 | 251200 | 0.0003 | - |
| 18.7626 | 251250 | 0.0005 | - |
| 18.7663 | 251300 | 0.0001 | - |
| 18.7701 | 251350 | 0.0003 | - |
| 18.7738 | 251400 | 0.0 | - |
| 18.7775 | 251450 | 0.0001 | - |
| 18.7813 | 251500 | 0.0 | - |
| 18.7850 | 251550 | 0.0002 | - |
| 18.7887 | 251600 | 0.0 | - |
| 18.7925 | 251650 | 0.0002 | - |
| 18.7962 | 251700 | 0.0 | - |
| 18.7999 | 251750 | 0.0003 | - |
| 18.8037 | 251800 | 0.0003 | - |
| 18.8074 | 251850 | 0.0 | - |
| 18.8111 | 251900 | 0.0 | - |
| 18.8149 | 251950 | 0.0002 | - |
| 18.8186 | 252000 | 0.0003 | - |
| 18.8223 | 252050 | 0.0005 | - |
| 18.8261 | 252100 | 0.0005 | - |
| 18.8298 | 252150 | 0.0009 | - |
| 18.8335 | 252200 | 0.0006 | - |
| 18.8373 | 252250 | 0.0001 | - |
| 18.8410 | 252300 | 0.0003 | - |
| 18.8447 | 252350 | 0.0001 | - |
| 18.8485 | 252400 | 0.0 | - |
| 18.8522 | 252450 | 0.0001 | - |
| 18.8559 | 252500 | 0.0005 | - |
| 18.8597 | 252550 | 0.0007 | - |
| 18.8634 | 252600 | 0.0003 | - |
| 18.8671 | 252650 | 0.0 | - |
| 18.8709 | 252700 | 0.0001 | - |
| 18.8746 | 252750 | 0.0001 | - |
| 18.8784 | 252800 | 0.0004 | - |
| 18.8821 | 252850 | 0.0002 | - |
| 18.8858 | 252900 | 0.0003 | - |
| 18.8896 | 252950 | 0.0 | - |
| 18.8933 | 253000 | 0.0002 | - |
| 18.8970 | 253050 | 0.0003 | - |
| 18.9008 | 253100 | 0.0 | - |
| 18.9045 | 253150 | 0.0 | - |
| 18.9082 | 253200 | 0.0002 | - |
| 18.9120 | 253250 | 0.0002 | - |
| 18.9157 | 253300 | 0.0003 | - |
| 18.9194 | 253350 | 0.0003 | - |
| 18.9232 | 253400 | 0.0 | - |
| 18.9269 | 253450 | 0.0 | - |
| 18.9306 | 253500 | 0.0003 | - |
| 18.9344 | 253550 | 0.0 | - |
| 18.9381 | 253600 | 0.0 | - |
| 18.9418 | 253650 | 0.0004 | - |
| 18.9456 | 253700 | 0.0 | - |
| 18.9493 | 253750 | 0.0 | - |
| 18.9530 | 253800 | 0.0002 | - |
| 18.9568 | 253850 | 0.0002 | - |
| 18.9605 | 253900 | 0.0 | - |
| 18.9642 | 253950 | 0.0002 | - |
| 18.9680 | 254000 | 0.0 | - |
| 18.9717 | 254050 | 0.0002 | - |
| 18.9754 | 254100 | 0.0003 | - |
| 18.9792 | 254150 | 0.0002 | - |
| 18.9829 | 254200 | 0.0001 | - |
| 18.9866 | 254250 | 0.0002 | - |
| 18.9904 | 254300 | 0.0 | - |
| 18.9941 | 254350 | 0.0 | - |
| 18.9978 | 254400 | 0.0001 | - |
| 19.0016 | 254450 | 0.0002 | - |
| 19.0053 | 254500 | 0.0002 | - |
| 19.0090 | 254550 | 0.0002 | - |
| 19.0128 | 254600 | 0.0002 | - |
| 19.0165 | 254650 | 0.0002 | - |
| 19.0202 | 254700 | 0.0001 | - |
| 19.0240 | 254750 | 0.0 | - |
| 19.0277 | 254800 | 0.0003 | - |
| 19.0314 | 254850 | 0.0001 | - |
| 19.0352 | 254900 | 0.0001 | - |
| 19.0389 | 254950 | 0.0001 | - |
| 19.0426 | 255000 | 0.0002 | - |
| 19.0464 | 255050 | 0.0002 | - |
| 19.0501 | 255100 | 0.0 | - |
| 19.0538 | 255150 | 0.0 | - |
| 19.0576 | 255200 | 0.0003 | - |
| 19.0613 | 255250 | 0.0002 | - |
| 19.0650 | 255300 | 0.0002 | - |
| 19.0688 | 255350 | 0.0 | - |
| 19.0725 | 255400 | 0.0 | - |
| 19.0762 | 255450 | 0.0 | - |
| 19.0800 | 255500 | 0.0002 | - |
| 19.0837 | 255550 | 0.0 | - |
| 19.0874 | 255600 | 0.0 | - |
| 19.0912 | 255650 | 0.0 | - |
| 19.0949 | 255700 | 0.0 | - |
| 19.0986 | 255750 | 0.0002 | - |
| 19.1024 | 255800 | 0.0002 | - |
| 19.1061 | 255850 | 0.0003 | - |
| 19.1098 | 255900 | 0.0002 | - |
| 19.1136 | 255950 | 0.0 | - |
| 19.1173 | 256000 | 0.0003 | - |
| 19.1211 | 256050 | 0.0 | - |
| 19.1248 | 256100 | 0.0002 | - |
| 19.1285 | 256150 | 0.0002 | - |
| 19.1323 | 256200 | 0.0 | - |
| 19.1360 | 256250 | 0.0002 | - |
| 19.1397 | 256300 | 0.0002 | - |
| 19.1435 | 256350 | 0.0 | - |
| 19.1472 | 256400 | 0.0002 | - |
| 19.1509 | 256450 | 0.0 | - |
| 19.1547 | 256500 | 0.0 | - |
| 19.1584 | 256550 | 0.0002 | - |
| 19.1621 | 256600 | 0.0 | - |
| 19.1659 | 256650 | 0.0004 | - |
| 19.1696 | 256700 | 0.0003 | - |
| 19.1733 | 256750 | 0.0 | - |
| 19.1771 | 256800 | 0.0 | - |
| 19.1808 | 256850 | 0.0006 | - |
| 19.1845 | 256900 | 0.0002 | - |
| 19.1883 | 256950 | 0.0003 | - |
| 19.1920 | 257000 | 0.0002 | - |
| 19.1957 | 257050 | 0.0004 | - |
| 19.1995 | 257100 | 0.0 | - |
| 19.2032 | 257150 | 0.0002 | - |
| 19.2069 | 257200 | 0.0 | - |
| 19.2107 | 257250 | 0.0 | - |
| 19.2144 | 257300 | 0.0 | - |
| 19.2181 | 257350 | 0.0 | - |
| 19.2219 | 257400 | 0.0002 | - |
| 19.2256 | 257450 | 0.0002 | - |
| 19.2293 | 257500 | 0.0005 | - |
| 19.2331 | 257550 | 0.0 | - |
| 19.2368 | 257600 | 0.0 | - |
| 19.2405 | 257650 | 0.0003 | - |
| 19.2443 | 257700 | 0.0003 | - |
| 19.2480 | 257750 | 0.0004 | - |
| 19.2517 | 257800 | 0.0005 | - |
| 19.2555 | 257850 | 0.0 | - |
| 19.2592 | 257900 | 0.0 | - |
| 19.2629 | 257950 | 0.0 | - |
| 19.2667 | 258000 | 0.0002 | - |
| 19.2704 | 258050 | 0.0 | - |
| 19.2741 | 258100 | 0.0001 | - |
| 19.2779 | 258150 | 0.0002 | - |
| 19.2816 | 258200 | 0.0004 | - |
| 19.2853 | 258250 | 0.0003 | - |
| 19.2891 | 258300 | 0.0001 | - |
| 19.2928 | 258350 | 0.0 | - |
| 19.2965 | 258400 | 0.0002 | - |
| 19.3003 | 258450 | 0.0 | - |
| 19.3040 | 258500 | 0.0 | - |
| 19.3077 | 258550 | 0.0 | - |
| 19.3115 | 258600 | 0.0 | - |
| 19.3152 | 258650 | 0.0 | - |
| 19.3189 | 258700 | 0.0002 | - |
| 19.3227 | 258750 | 0.0001 | - |
| 19.3264 | 258800 | 0.0005 | - |
| 19.3301 | 258850 | 0.0 | - |
| 19.3339 | 258900 | 0.0 | - |
| 19.3376 | 258950 | 0.0002 | - |
| 19.3413 | 259000 | 0.0002 | - |
| 19.3451 | 259050 | 0.0 | - |
| 19.3488 | 259100 | 0.0 | - |
| 19.3526 | 259150 | 0.0002 | - |
| 19.3563 | 259200 | 0.0 | - |
| 19.3600 | 259250 | 0.0 | - |
| 19.3638 | 259300 | 0.0002 | - |
| 19.3675 | 259350 | 0.0 | - |
| 19.3712 | 259400 | 0.0006 | - |
| 19.3750 | 259450 | 0.0002 | - |
| 19.3787 | 259500 | 0.0001 | - |
| 19.3824 | 259550 | 0.0002 | - |
| 19.3862 | 259600 | 0.0002 | - |
| 19.3899 | 259650 | 0.0003 | - |
| 19.3936 | 259700 | 0.0004 | - |
| 19.3974 | 259750 | 0.0 | - |
| 19.4011 | 259800 | 0.0002 | - |
| 19.4048 | 259850 | 0.0002 | - |
| 19.4086 | 259900 | 0.0 | - |
| 19.4123 | 259950 | 0.0 | - |
| 19.4160 | 260000 | 0.0002 | - |
| 19.4198 | 260050 | 0.0002 | - |
| 19.4235 | 260100 | 0.0003 | - |
| 19.4272 | 260150 | 0.0001 | - |
| 19.4310 | 260200 | 0.0 | - |
| 19.4347 | 260250 | 0.0002 | - |
| 19.4384 | 260300 | 0.0001 | - |
| 19.4422 | 260350 | 0.0002 | - |
| 19.4459 | 260400 | 0.0 | - |
| 19.4496 | 260450 | 0.0005 | - |
| 19.4534 | 260500 | 0.0 | - |
| 19.4571 | 260550 | 0.0001 | - |
| 19.4608 | 260600 | 0.0001 | - |
| 19.4646 | 260650 | 0.0 | - |
| 19.4683 | 260700 | 0.0 | - |
| 19.4720 | 260750 | 0.0 | - |
| 19.4758 | 260800 | 0.0 | - |
| 19.4795 | 260850 | 0.0 | - |
| 19.4832 | 260900 | 0.0 | - |
| 19.4870 | 260950 | 0.0002 | - |
| 19.4907 | 261000 | 0.0 | - |
| 19.4944 | 261050 | 0.0001 | - |
| 19.4982 | 261100 | 0.0002 | - |
| 19.5019 | 261150 | 0.0 | - |
| 19.5056 | 261200 | 0.0001 | - |
| 19.5094 | 261250 | 0.0002 | - |
| 19.5131 | 261300 | 0.0002 | - |
| 19.5168 | 261350 | 0.0 | - |
| 19.5206 | 261400 | 0.0002 | - |
| 19.5243 | 261450 | 0.0 | - |
| 19.5280 | 261500 | 0.0001 | - |
| 19.5318 | 261550 | 0.0001 | - |
| 19.5355 | 261600 | 0.0004 | - |
| 19.5392 | 261650 | 0.0004 | - |
| 19.5430 | 261700 | 0.0002 | - |
| 19.5467 | 261750 | 0.0007 | - |
| 19.5504 | 261800 | 0.0002 | - |
| 19.5542 | 261850 | 0.0 | - |
| 19.5579 | 261900 | 0.0 | - |
| 19.5616 | 261950 | 0.0 | - |
| 19.5654 | 262000 | 0.0006 | - |
| 19.5691 | 262050 | 0.0004 | - |
| 19.5728 | 262100 | 0.0004 | - |
| 19.5766 | 262150 | 0.0003 | - |
| 19.5803 | 262200 | 0.0002 | - |
| 19.5840 | 262250 | 0.0 | - |
| 19.5878 | 262300 | 0.0 | - |
| 19.5915 | 262350 | 0.0002 | - |
| 19.5953 | 262400 | 0.0004 | - |
| 19.5990 | 262450 | 0.0 | - |
| 19.6027 | 262500 | 0.0002 | - |
| 19.6065 | 262550 | 0.0002 | - |
| 19.6102 | 262600 | 0.0002 | - |
| 19.6139 | 262650 | 0.0 | - |
| 19.6177 | 262700 | 0.0 | - |
| 19.6214 | 262750 | 0.0002 | - |
| 19.6251 | 262800 | 0.0001 | - |
| 19.6289 | 262850 | 0.0003 | - |
| 19.6326 | 262900 | 0.0 | - |
| 19.6363 | 262950 | 0.0002 | - |
| 19.6401 | 263000 | 0.0001 | - |
| 19.6438 | 263050 | 0.0002 | - |
| 19.6475 | 263100 | 0.0 | - |
| 19.6513 | 263150 | 0.0002 | - |
| 19.6550 | 263200 | 0.0002 | - |
| 19.6587 | 263250 | 0.0 | - |
| 19.6625 | 263300 | 0.0002 | - |
| 19.6662 | 263350 | 0.0 | - |
| 19.6699 | 263400 | 0.0 | - |
| 19.6737 | 263450 | 0.0 | - |
| 19.6774 | 263500 | 0.0003 | - |
| 19.6811 | 263550 | 0.0004 | - |
| 19.6849 | 263600 | 0.0002 | - |
| 19.6886 | 263650 | 0.0001 | - |
| 19.6923 | 263700 | 0.0003 | - |
| 19.6961 | 263750 | 0.0002 | - |
| 19.6998 | 263800 | 0.0 | - |
| 19.7035 | 263850 | 0.0 | - |
| 19.7073 | 263900 | 0.0002 | - |
| 19.7110 | 263950 | 0.0 | - |
| 19.7147 | 264000 | 0.0 | - |
| 19.7185 | 264050 | 0.0 | - |
| 19.7222 | 264100 | 0.0001 | - |
| 19.7259 | 264150 | 0.0 | - |
| 19.7297 | 264200 | 0.0002 | - |
| 19.7334 | 264250 | 0.0 | - |
| 19.7371 | 264300 | 0.0001 | - |
| 19.7409 | 264350 | 0.0003 | - |
| 19.7446 | 264400 | 0.0 | - |
| 19.7483 | 264450 | 0.0 | - |
| 19.7521 | 264500 | 0.0 | - |
| 19.7558 | 264550 | 0.0002 | - |
| 19.7595 | 264600 | 0.0002 | - |
| 19.7633 | 264650 | 0.0 | - |
| 19.7670 | 264700 | 0.0002 | - |
| 19.7707 | 264750 | 0.0 | - |
| 19.7745 | 264800 | 0.0003 | - |
| 19.7782 | 264850 | 0.0 | - |
| 19.7819 | 264900 | 0.0001 | - |
| 19.7857 | 264950 | 0.0 | - |
| 19.7894 | 265000 | 0.0 | - |
| 19.7931 | 265050 | 0.0003 | - |
| 19.7969 | 265100 | 0.0002 | - |
| 19.8006 | 265150 | 0.0 | - |
| 19.8043 | 265200 | 0.0002 | - |
| 19.8081 | 265250 | 0.0 | - |
| 19.8118 | 265300 | 0.0 | - |
| 19.8155 | 265350 | 0.0 | - |
| 19.8193 | 265400 | 0.0 | - |
| 19.8230 | 265450 | 0.0002 | - |
| 19.8267 | 265500 | 0.0 | - |
| 19.8305 | 265550 | 0.0001 | - |
| 19.8342 | 265600 | 0.0002 | - |
| 19.8380 | 265650 | 0.0002 | - |
| 19.8417 | 265700 | 0.0 | - |
| 19.8454 | 265750 | 0.0 | - |
| 19.8492 | 265800 | 0.0002 | - |
| 19.8529 | 265850 | 0.0002 | - |
| 19.8566 | 265900 | 0.0 | - |
| 19.8604 | 265950 | 0.0 | - |
| 19.8641 | 266000 | 0.0 | - |
| 19.8678 | 266050 | 0.0 | - |
| 19.8716 | 266100 | 0.0 | - |
| 19.8753 | 266150 | 0.0002 | - |
| 19.8790 | 266200 | 0.0 | - |
| 19.8828 | 266250 | 0.0 | - |
| 19.8865 | 266300 | 0.0 | - |
| 19.8902 | 266350 | 0.0002 | - |
| 19.8940 | 266400 | 0.0 | - |
| 19.8977 | 266450 | 0.0002 | - |
| 19.9014 | 266500 | 0.0 | - |
| 19.9052 | 266550 | 0.0004 | - |
| 19.9089 | 266600 | 0.0 | - |
| 19.9126 | 266650 | 0.0002 | - |
| 19.9164 | 266700 | 0.0002 | - |
| 19.9201 | 266750 | 0.0002 | - |
| 19.9238 | 266800 | 0.0002 | - |
| 19.9276 | 266850 | 0.0 | - |
| 19.9313 | 266900 | 0.0002 | - |
| 19.9350 | 266950 | 0.0003 | - |
| 19.9388 | 267000 | 0.0003 | - |
| 19.9425 | 267050 | 0.0002 | - |
| 19.9462 | 267100 | 0.0001 | - |
| 19.9500 | 267150 | 0.0003 | - |
| 19.9537 | 267200 | 0.0003 | - |
| 19.9574 | 267250 | 0.0004 | - |
| 19.9612 | 267300 | 0.0004 | - |
| 19.9649 | 267350 | 0.0 | - |
| 19.9686 | 267400 | 0.0002 | - |
| 19.9724 | 267450 | 0.0002 | - |
| 19.9761 | 267500 | 0.0001 | - |
| 19.9798 | 267550 | 0.0 | - |
| 19.9836 | 267600 | 0.0 | - |
| 19.9873 | 267650 | 0.0002 | - |
| 19.9910 | 267700 | 0.0 | - |
| 19.9948 | 267750 | 0.0001 | - |
| 19.9985 | 267800 | 0.0 | - |
| 20.0022 | 267850 | 0.0 | - |
| 20.0060 | 267900 | 0.0002 | - |
| 20.0097 | 267950 | 0.0002 | - |
| 20.0134 | 268000 | 0.0 | - |
| 20.0172 | 268050 | 0.0 | - |
| 20.0209 | 268100 | 0.0002 | - |
| 20.0246 | 268150 | 0.0002 | - |
| 20.0284 | 268200 | 0.0 | - |
| 20.0321 | 268250 | 0.0002 | - |
| 20.0358 | 268300 | 0.0 | - |
| 20.0396 | 268350 | 0.0 | - |
| 20.0433 | 268400 | 0.0002 | - |
| 20.0470 | 268450 | 0.0001 | - |
| 20.0508 | 268500 | 0.0002 | - |
| 20.0545 | 268550 | 0.0002 | - |
| 20.0582 | 268600 | 0.0002 | - |
| 20.0620 | 268650 | 0.0001 | - |
| 20.0657 | 268700 | 0.0001 | - |
| 20.0694 | 268750 | 0.0002 | - |
| 20.0732 | 268800 | 0.0 | - |
| 20.0769 | 268850 | 0.0002 | - |
| 20.0807 | 268900 | 0.0001 | - |
| 20.0844 | 268950 | 0.0 | - |
| 20.0881 | 269000 | 0.0 | - |
| 20.0919 | 269050 | 0.0003 | - |
| 20.0956 | 269100 | 0.0 | - |
| 20.0993 | 269150 | 0.0 | - |
| 20.1031 | 269200 | 0.0002 | - |
| 20.1068 | 269250 | 0.0002 | - |
| 20.1105 | 269300 | 0.0001 | - |
| 20.1143 | 269350 | 0.0 | - |
| 20.1180 | 269400 | 0.0 | - |
| 20.1217 | 269450 | 0.0002 | - |
| 20.1255 | 269500 | 0.0002 | - |
| 20.1292 | 269550 | 0.0002 | - |
| 20.1329 | 269600 | 0.0 | - |
| 20.1367 | 269650 | 0.0001 | - |
| 20.1404 | 269700 | 0.0 | - |
| 20.1441 | 269750 | 0.0003 | - |
| 20.1479 | 269800 | 0.0 | - |
| 20.1516 | 269850 | 0.0002 | - |
| 20.1553 | 269900 | 0.0 | - |
| 20.1591 | 269950 | 0.0002 | - |
| 20.1628 | 270000 | 0.0 | - |
| 20.1665 | 270050 | 0.0 | - |
| 20.1703 | 270100 | 0.0002 | - |
| 20.1740 | 270150 | 0.0002 | - |
| 20.1777 | 270200 | 0.0002 | - |
| 20.1815 | 270250 | 0.0002 | - |
| 20.1852 | 270300 | 0.0002 | - |
| 20.1889 | 270350 | 0.0 | - |
| 20.1927 | 270400 | 0.0 | - |
| 20.1964 | 270450 | 0.0 | - |
| 20.2001 | 270500 | 0.0 | - |
| 20.2039 | 270550 | 0.0 | - |
| 20.2076 | 270600 | 0.0 | - |
| 20.2113 | 270650 | 0.0002 | - |
| 20.2151 | 270700 | 0.0 | - |
| 20.2188 | 270750 | 0.0001 | - |
| 20.2225 | 270800 | 0.0002 | - |
| 20.2263 | 270850 | 0.0 | - |
| 20.2300 | 270900 | 0.0005 | - |
| 20.2337 | 270950 | 0.0002 | - |
| 20.2375 | 271000 | 0.0002 | - |
| 20.2412 | 271050 | 0.0 | - |
| 20.2449 | 271100 | 0.0 | - |
| 20.2487 | 271150 | 0.0002 | - |
| 20.2524 | 271200 | 0.0004 | - |
| 20.2561 | 271250 | 0.0 | - |
| 20.2599 | 271300 | 0.0 | - |
| 20.2636 | 271350 | 0.0 | - |
| 20.2673 | 271400 | 0.0 | - |
| 20.2711 | 271450 | 0.0 | - |
| 20.2748 | 271500 | 0.0002 | - |
| 20.2785 | 271550 | 0.0002 | - |
| 20.2823 | 271600 | 0.0002 | - |
| 20.2860 | 271650 | 0.0001 | - |
| 20.2897 | 271700 | 0.0 | - |
| 20.2935 | 271750 | 0.0002 | - |
| 20.2972 | 271800 | 0.0001 | - |
| 20.3009 | 271850 | 0.0 | - |
| 20.3047 | 271900 | 0.0 | - |
| 20.3084 | 271950 | 0.0 | - |
| 20.3121 | 272000 | 0.0 | - |
| 20.3159 | 272050 | 0.0003 | - |
| 20.3196 | 272100 | 0.0003 | - |
| 20.3234 | 272150 | 0.0002 | - |
| 20.3271 | 272200 | 0.0001 | - |
| 20.3308 | 272250 | 0.0002 | - |
| 20.3346 | 272300 | 0.0001 | - |
| 20.3383 | 272350 | 0.0 | - |
| 20.3420 | 272400 | 0.0002 | - |
| 20.3458 | 272450 | 0.0004 | - |
| 20.3495 | 272500 | 0.0002 | - |
| 20.3532 | 272550 | 0.0003 | - |
| 20.3570 | 272600 | 0.0 | - |
| 20.3607 | 272650 | 0.0001 | - |
| 20.3644 | 272700 | 0.0 | - |
| 20.3682 | 272750 | 0.0 | - |
| 20.3719 | 272800 | 0.0 | - |
| 20.3756 | 272850 | 0.0001 | - |
| 20.3794 | 272900 | 0.0001 | - |
| 20.3831 | 272950 | 0.0003 | - |
| 20.3868 | 273000 | 0.0001 | - |
| 20.3906 | 273050 | 0.0002 | - |
| 20.3943 | 273100 | 0.0 | - |
| 20.3980 | 273150 | 0.0 | - |
| 20.4018 | 273200 | 0.0001 | - |
| 20.4055 | 273250 | 0.0001 | - |
| 20.4092 | 273300 | 0.0002 | - |
| 20.4130 | 273350 | 0.0001 | - |
| 20.4167 | 273400 | 0.0002 | - |
| 20.4204 | 273450 | 0.0002 | - |
| 20.4242 | 273500 | 0.0001 | - |
| 20.4279 | 273550 | 0.0002 | - |
| 20.4316 | 273600 | 0.0001 | - |
| 20.4354 | 273650 | 0.0001 | - |
| 20.4391 | 273700 | 0.0 | - |
| 20.4428 | 273750 | 0.0 | - |
| 20.4466 | 273800 | 0.0002 | - |
| 20.4503 | 273850 | 0.0 | - |
| 20.4540 | 273900 | 0.0002 | - |
| 20.4578 | 273950 | 0.0002 | - |
| 20.4615 | 274000 | 0.0 | - |
| 20.4652 | 274050 | 0.0003 | - |
| 20.4690 | 274100 | 0.0 | - |
| 20.4727 | 274150 | 0.0002 | - |
| 20.4764 | 274200 | 0.0 | - |
| 20.4802 | 274250 | 0.0002 | - |
| 20.4839 | 274300 | 0.0002 | - |
| 20.4876 | 274350 | 0.0 | - |
| 20.4914 | 274400 | 0.0 | - |
| 20.4951 | 274450 | 0.0005 | - |
| 20.4988 | 274500 | 0.0 | - |
| 20.5026 | 274550 | 0.0 | - |
| 20.5063 | 274600 | 0.0 | - |
| 20.5100 | 274650 | 0.0 | - |
| 20.5138 | 274700 | 0.0 | - |
| 20.5175 | 274750 | 0.0 | - |
| 20.5212 | 274800 | 0.0 | - |
| 20.5250 | 274850 | 0.0 | - |
| 20.5287 | 274900 | 0.0 | - |
| 20.5324 | 274950 | 0.0001 | - |
| 20.5362 | 275000 | 0.0002 | - |
| 20.5399 | 275050 | 0.0 | - |
| 20.5436 | 275100 | 0.0 | - |
| 20.5474 | 275150 | 0.0 | - |
| 20.5511 | 275200 | 0.0002 | - |
| 20.5549 | 275250 | 0.0 | - |
| 20.5586 | 275300 | 0.0002 | - |
| 20.5623 | 275350 | 0.0001 | - |
| 20.5661 | 275400 | 0.0 | - |
| 20.5698 | 275450 | 0.0001 | - |
| 20.5735 | 275500 | 0.0001 | - |
| 20.5773 | 275550 | 0.0 | - |
| 20.5810 | 275600 | 0.0 | - |
| 20.5847 | 275650 | 0.0 | - |
| 20.5885 | 275700 | 0.0002 | - |
| 20.5922 | 275750 | 0.0 | - |
| 20.5959 | 275800 | 0.0002 | - |
| 20.5997 | 275850 | 0.0002 | - |
| 20.6034 | 275900 | 0.0002 | - |
| 20.6071 | 275950 | 0.0 | - |
| 20.6109 | 276000 | 0.0 | - |
| 20.6146 | 276050 | 0.0001 | - |
| 20.6183 | 276100 | 0.0002 | - |
| 20.6221 | 276150 | 0.0 | - |
| 20.6258 | 276200 | 0.0 | - |
| 20.6295 | 276250 | 0.0003 | - |
| 20.6333 | 276300 | 0.0 | - |
| 20.6370 | 276350 | 0.0003 | - |
| 20.6407 | 276400 | 0.0002 | - |
| 20.6445 | 276450 | 0.0003 | - |
| 20.6482 | 276500 | 0.0002 | - |
| 20.6519 | 276550 | 0.0001 | - |
| 20.6557 | 276600 | 0.0 | - |
| 20.6594 | 276650 | 0.0 | - |
| 20.6631 | 276700 | 0.0 | - |
| 20.6669 | 276750 | 0.0 | - |
| 20.6706 | 276800 | 0.0 | - |
| 20.6743 | 276850 | 0.0 | - |
| 20.6781 | 276900 | 0.0001 | - |
| 20.6818 | 276950 | 0.0 | - |
| 20.6855 | 277000 | 0.0003 | - |
| 20.6893 | 277050 | 0.0 | - |
| 20.6930 | 277100 | 0.0002 | - |
| 20.6967 | 277150 | 0.0 | - |
| 20.7005 | 277200 | 0.0 | - |
| 20.7042 | 277250 | 0.0002 | - |
| 20.7079 | 277300 | 0.0 | - |
| 20.7117 | 277350 | 0.0001 | - |
| 20.7154 | 277400 | 0.0002 | - |
| 20.7191 | 277450 | 0.0 | - |
| 20.7229 | 277500 | 0.0003 | - |
| 20.7266 | 277550 | 0.0001 | - |
| 20.7303 | 277600 | 0.0002 | - |
| 20.7341 | 277650 | 0.0003 | - |
| 20.7378 | 277700 | 0.0 | - |
| 20.7415 | 277750 | 0.0 | - |
| 20.7453 | 277800 | 0.0003 | - |
| 20.7490 | 277850 | 0.0 | - |
| 20.7527 | 277900 | 0.0002 | - |
| 20.7565 | 277950 | 0.0 | - |
| 20.7602 | 278000 | 0.0002 | - |
| 20.7639 | 278050 | 0.0001 | - |
| 20.7677 | 278100 | 0.0002 | - |
| 20.7714 | 278150 | 0.0003 | - |
| 20.7751 | 278200 | 0.0 | - |
| 20.7789 | 278250 | 0.0 | - |
| 20.7826 | 278300 | 0.0002 | - |
| 20.7863 | 278350 | 0.0002 | - |
| 20.7901 | 278400 | 0.0002 | - |
| 20.7938 | 278450 | 0.0002 | - |
| 20.7976 | 278500 | 0.0 | - |
| 20.8013 | 278550 | 0.0 | - |
| 20.8050 | 278600 | 0.0002 | - |
| 20.8088 | 278650 | 0.0004 | - |
| 20.8125 | 278700 | 0.0001 | - |
| 20.8162 | 278750 | 0.0002 | - |
| 20.8200 | 278800 | 0.0 | - |
| 20.8237 | 278850 | 0.0 | - |
| 20.8274 | 278900 | 0.0002 | - |
| 20.8312 | 278950 | 0.0 | - |
| 20.8349 | 279000 | 0.0 | - |
| 20.8386 | 279050 | 0.0 | - |
| 20.8424 | 279100 | 0.0 | - |
| 20.8461 | 279150 | 0.0002 | - |
| 20.8498 | 279200 | 0.0003 | - |
| 20.8536 | 279250 | 0.0 | - |
| 20.8573 | 279300 | 0.0005 | - |
| 20.8610 | 279350 | 0.0 | - |
| 20.8648 | 279400 | 0.0002 | - |
| 20.8685 | 279450 | 0.0002 | - |
| 20.8722 | 279500 | 0.0002 | - |
| 20.8760 | 279550 | 0.0001 | - |
| 20.8797 | 279600 | 0.0002 | - |
| 20.8834 | 279650 | 0.0 | - |
| 20.8872 | 279700 | 0.0001 | - |
| 20.8909 | 279750 | 0.0001 | - |
| 20.8946 | 279800 | 0.0001 | - |
| 20.8984 | 279850 | 0.0001 | - |
| 20.9021 | 279900 | 0.0 | - |
| 20.9058 | 279950 | 0.0 | - |
| 20.9096 | 280000 | 0.0 | - |
| 20.9133 | 280050 | 0.0 | - |
| 20.9170 | 280100 | 0.0 | - |
| 20.9208 | 280150 | 0.0001 | - |
| 20.9245 | 280200 | 0.0002 | - |
| 20.9282 | 280250 | 0.0 | - |
| 20.9320 | 280300 | 0.0002 | - |
| 20.9357 | 280350 | 0.0 | - |
| 20.9394 | 280400 | 0.0001 | - |
| 20.9432 | 280450 | 0.0002 | - |
| 20.9469 | 280500 | 0.0 | - |
| 20.9506 | 280550 | 0.0003 | - |
| 20.9544 | 280600 | 0.0 | - |
| 20.9581 | 280650 | 0.0 | - |
| 20.9618 | 280700 | 0.0 | - |
| 20.9656 | 280750 | 0.0 | - |
| 20.9693 | 280800 | 0.0 | - |
| 20.9730 | 280850 | 0.0004 | - |
| 20.9768 | 280900 | 0.0002 | - |
| 20.9805 | 280950 | 0.0 | - |
| 20.9842 | 281000 | 0.0 | - |
| 20.9880 | 281050 | 0.0 | - |
| 20.9917 | 281100 | 0.0 | - |
| 20.9954 | 281150 | 0.0002 | - |
| 20.9992 | 281200 | 0.0 | - |
| 21.0029 | 281250 | 0.0002 | - |
| 21.0066 | 281300 | 0.0 | - |
| 21.0104 | 281350 | 0.0 | - |
| 21.0141 | 281400 | 0.0 | - |
| 21.0178 | 281450 | 0.0 | - |
| 21.0216 | 281500 | 0.0002 | - |
| 21.0253 | 281550 | 0.0002 | - |
| 21.0290 | 281600 | 0.0 | - |
| 21.0328 | 281650 | 0.0 | - |
| 21.0365 | 281700 | 0.0 | - |
| 21.0403 | 281750 | 0.0002 | - |
| 21.0440 | 281800 | 0.0 | - |
| 21.0477 | 281850 | 0.0 | - |
| 21.0515 | 281900 | 0.0 | - |
| 21.0552 | 281950 | 0.0002 | - |
| 21.0589 | 282000 | 0.0 | - |
| 21.0627 | 282050 | 0.0 | - |
| 21.0664 | 282100 | 0.0 | - |
| 21.0701 | 282150 | 0.0 | - |
| 21.0739 | 282200 | 0.0 | - |
| 21.0776 | 282250 | 0.0002 | - |
| 21.0813 | 282300 | 0.0 | - |
| 21.0851 | 282350 | 0.0002 | - |
| 21.0888 | 282400 | 0.0 | - |
| 21.0925 | 282450 | 0.0002 | - |
| 21.0963 | 282500 | 0.0002 | - |
| 21.1000 | 282550 | 0.0 | - |
| 21.1037 | 282600 | 0.0002 | - |
| 21.1075 | 282650 | 0.0001 | - |
| 21.1112 | 282700 | 0.0 | - |
| 21.1149 | 282750 | 0.0002 | - |
| 21.1187 | 282800 | 0.0 | - |
| 21.1224 | 282850 | 0.0 | - |
| 21.1261 | 282900 | 0.0 | - |
| 21.1299 | 282950 | 0.0002 | - |
| 21.1336 | 283000 | 0.0 | - |
| 21.1373 | 283050 | 0.0002 | - |
| 21.1411 | 283100 | 0.0 | - |
| 21.1448 | 283150 | 0.0 | - |
| 21.1485 | 283200 | 0.0 | - |
| 21.1523 | 283250 | 0.0002 | - |
| 21.1560 | 283300 | 0.0002 | - |
| 21.1597 | 283350 | 0.0002 | - |
| 21.1635 | 283400 | 0.0002 | - |
| 21.1672 | 283450 | 0.0 | - |
| 21.1709 | 283500 | 0.0 | - |
| 21.1747 | 283550 | 0.0 | - |
| 21.1784 | 283600 | 0.0002 | - |
| 21.1821 | 283650 | 0.0003 | - |
| 21.1859 | 283700 | 0.0002 | - |
| 21.1896 | 283750 | 0.0 | - |
| 21.1933 | 283800 | 0.0 | - |
| 21.1971 | 283850 | 0.0002 | - |
| 21.2008 | 283900 | 0.0002 | - |
| 21.2045 | 283950 | 0.0001 | - |
| 21.2083 | 284000 | 0.0003 | - |
| 21.2120 | 284050 | 0.0001 | - |
| 21.2157 | 284100 | 0.0 | - |
| 21.2195 | 284150 | 0.0 | - |
| 21.2232 | 284200 | 0.0003 | - |
| 21.2269 | 284250 | 0.0 | - |
| 21.2307 | 284300 | 0.0 | - |
| 21.2344 | 284350 | 0.0002 | - |
| 21.2381 | 284400 | 0.0002 | - |
| 21.2419 | 284450 | 0.0 | - |
| 21.2456 | 284500 | 0.0 | - |
| 21.2493 | 284550 | 0.0002 | - |
| 21.2531 | 284600 | 0.0 | - |
| 21.2568 | 284650 | 0.0 | - |
| 21.2605 | 284700 | 0.0 | - |
| 21.2643 | 284750 | 0.0 | - |
| 21.2680 | 284800 | 0.0001 | - |
| 21.2717 | 284850 | 0.0 | - |
| 21.2755 | 284900 | 0.0005 | - |
| 21.2792 | 284950 | 0.0001 | - |
| 21.2830 | 285000 | 0.0001 | - |
| 21.2867 | 285050 | 0.0003 | - |
| 21.2904 | 285100 | 0.0002 | - |
| 21.2942 | 285150 | 0.0 | - |
| 21.2979 | 285200 | 0.0002 | - |
| 21.3016 | 285250 | 0.0002 | - |
| 21.3054 | 285300 | 0.0 | - |
| 21.3091 | 285350 | 0.0 | - |
| 21.3128 | 285400 | 0.0005 | - |
| 21.3166 | 285450 | 0.0001 | - |
| 21.3203 | 285500 | 0.0 | - |
| 21.3240 | 285550 | 0.0 | - |
| 21.3278 | 285600 | 0.0003 | - |
| 21.3315 | 285650 | 0.0 | - |
| 21.3352 | 285700 | 0.0001 | - |
| 21.3390 | 285750 | 0.0 | - |
| 21.3427 | 285800 | 0.0002 | - |
| 21.3464 | 285850 | 0.0 | - |
| 21.3502 | 285900 | 0.0001 | - |
| 21.3539 | 285950 | 0.0 | - |
| 21.3576 | 286000 | 0.0 | - |
| 21.3614 | 286050 | 0.0 | - |
| 21.3651 | 286100 | 0.0 | - |
| 21.3688 | 286150 | 0.0002 | - |
| 21.3726 | 286200 | 0.0 | - |
| 21.3763 | 286250 | 0.0 | - |
| 21.3800 | 286300 | 0.0 | - |
| 21.3838 | 286350 | 0.0001 | - |
| 21.3875 | 286400 | 0.0002 | - |
| 21.3912 | 286450 | 0.0 | - |
| 21.3950 | 286500 | 0.0 | - |
| 21.3987 | 286550 | 0.0002 | - |
| 21.4024 | 286600 | 0.0002 | - |
| 21.4062 | 286650 | 0.0002 | - |
| 21.4099 | 286700 | 0.0 | - |
| 21.4136 | 286750 | 0.0002 | - |
| 21.4174 | 286800 | 0.0 | - |
| 21.4211 | 286850 | 0.0002 | - |
| 21.4248 | 286900 | 0.0 | - |
| 21.4286 | 286950 | 0.0 | - |
| 21.4323 | 287000 | 0.0003 | - |
| 21.4360 | 287050 | 0.0 | - |
| 21.4398 | 287100 | 0.0003 | - |
| 21.4435 | 287150 | 0.0002 | - |
| 21.4472 | 287200 | 0.0 | - |
| 21.4510 | 287250 | 0.0002 | - |
| 21.4547 | 287300 | 0.0001 | - |
| 21.4584 | 287350 | 0.0002 | - |
| 21.4622 | 287400 | 0.0 | - |
| 21.4659 | 287450 | 0.0 | - |
| 21.4696 | 287500 | 0.0 | - |
| 21.4734 | 287550 | 0.0 | - |
| 21.4771 | 287600 | 0.0001 | - |
| 21.4808 | 287650 | 0.0 | - |
| 21.4846 | 287700 | 0.0 | - |
| 21.4883 | 287750 | 0.0 | - |
| 21.4920 | 287800 | 0.0 | - |
| 21.4958 | 287850 | 0.0002 | - |
| 21.4995 | 287900 | 0.0 | - |
| 21.5032 | 287950 | 0.0 | - |
| 21.5070 | 288000 | 0.0 | - |
| 21.5107 | 288050 | 0.0003 | - |
| 21.5145 | 288100 | 0.0 | - |
| 21.5182 | 288150 | 0.0001 | - |
| 21.5219 | 288200 | 0.0002 | - |
| 21.5257 | 288250 | 0.0 | - |
| 21.5294 | 288300 | 0.0 | - |
| 21.5331 | 288350 | 0.0 | - |
| 21.5369 | 288400 | 0.0001 | - |
| 21.5406 | 288450 | 0.0002 | - |
| 21.5443 | 288500 | 0.0002 | - |
| 21.5481 | 288550 | 0.0 | - |
| 21.5518 | 288600 | 0.0 | - |
| 21.5555 | 288650 | 0.0002 | - |
| 21.5593 | 288700 | 0.0002 | - |
| 21.5630 | 288750 | 0.0 | - |
| 21.5667 | 288800 | 0.0005 | - |
| 21.5705 | 288850 | 0.0 | - |
| 21.5742 | 288900 | 0.0002 | - |
| 21.5779 | 288950 | 0.0 | - |
| 21.5817 | 289000 | 0.0 | - |
| 21.5854 | 289050 | 0.0002 | - |
| 21.5891 | 289100 | 0.0 | - |
| 21.5929 | 289150 | 0.0002 | - |
| 21.5966 | 289200 | 0.0001 | - |
| 21.6003 | 289250 | 0.0 | - |
| 21.6041 | 289300 | 0.0 | - |
| 21.6078 | 289350 | 0.0 | - |
| 21.6115 | 289400 | 0.0001 | - |
| 21.6153 | 289450 | 0.0002 | - |
| 21.6190 | 289500 | 0.0002 | - |
| 21.6227 | 289550 | 0.0002 | - |
| 21.6265 | 289600 | 0.0 | - |
| 21.6302 | 289650 | 0.0 | - |
| 21.6339 | 289700 | 0.0 | - |
| 21.6377 | 289750 | 0.0002 | - |
| 21.6414 | 289800 | 0.0 | - |
| 21.6451 | 289850 | 0.0 | - |
| 21.6489 | 289900 | 0.0 | - |
| 21.6526 | 289950 | 0.0 | - |
| 21.6563 | 290000 | 0.0 | - |
| 21.6601 | 290050 | 0.0002 | - |
| 21.6638 | 290100 | 0.0004 | - |
| 21.6675 | 290150 | 0.0 | - |
| 21.6713 | 290200 | 0.0001 | - |
| 21.6750 | 290250 | 0.0 | - |
| 21.6787 | 290300 | 0.0005 | - |
| 21.6825 | 290350 | 0.0002 | - |
| 21.6862 | 290400 | 0.0002 | - |
| 21.6899 | 290450 | 0.0 | - |
| 21.6937 | 290500 | 0.0 | - |
| 21.6974 | 290550 | 0.0 | - |
| 21.7011 | 290600 | 0.0002 | - |
| 21.7049 | 290650 | 0.0001 | - |
| 21.7086 | 290700 | 0.0 | - |
| 21.7123 | 290750 | 0.0 | - |
| 21.7161 | 290800 | 0.0003 | - |
| 21.7198 | 290850 | 0.0 | - |
| 21.7235 | 290900 | 0.0 | - |
| 21.7273 | 290950 | 0.0002 | - |
| 21.7310 | 291000 | 0.0 | - |
| 21.7347 | 291050 | 0.0 | - |
| 21.7385 | 291100 | 0.0 | - |
| 21.7422 | 291150 | 0.0 | - |
| 21.7459 | 291200 | 0.0 | - |
| 21.7497 | 291250 | 0.0001 | - |
| 21.7534 | 291300 | 0.0 | - |
| 21.7572 | 291350 | 0.0 | - |
| 21.7609 | 291400 | 0.0 | - |
| 21.7646 | 291450 | 0.0 | - |
| 21.7684 | 291500 | 0.0002 | - |
| 21.7721 | 291550 | 0.0002 | - |
| 21.7758 | 291600 | 0.0 | - |
| 21.7796 | 291650 | 0.0002 | - |
| 21.7833 | 291700 | 0.0 | - |
| 21.7870 | 291750 | 0.0 | - |
| 21.7908 | 291800 | 0.0 | - |
| 21.7945 | 291850 | 0.0 | - |
| 21.7982 | 291900 | 0.0002 | - |
| 21.8020 | 291950 | 0.0 | - |
| 21.8057 | 292000 | 0.0002 | - |
| 21.8094 | 292050 | 0.0 | - |
| 21.8132 | 292100 | 0.0002 | - |
| 21.8169 | 292150 | 0.0 | - |
| 21.8206 | 292200 | 0.0 | - |
| 21.8244 | 292250 | 0.0001 | - |
| 21.8281 | 292300 | 0.0 | - |
| 21.8318 | 292350 | 0.0004 | - |
| 21.8356 | 292400 | 0.0002 | - |
| 21.8393 | 292450 | 0.0 | - |
| 21.8430 | 292500 | 0.0002 | - |
| 21.8468 | 292550 | 0.0002 | - |
| 21.8505 | 292600 | 0.0 | - |
| 21.8542 | 292650 | 0.0 | - |
| 21.8580 | 292700 | 0.0002 | - |
| 21.8617 | 292750 | 0.0 | - |
| 21.8654 | 292800 | 0.0 | - |
| 21.8692 | 292850 | 0.0 | - |
| 21.8729 | 292900 | 0.0002 | - |
| 21.8766 | 292950 | 0.0 | - |
| 21.8804 | 293000 | 0.0 | - |
| 21.8841 | 293050 | 0.0 | - |
| 21.8878 | 293100 | 0.0001 | - |
| 21.8916 | 293150 | 0.0 | - |
| 21.8953 | 293200 | 0.0002 | - |
| 21.8990 | 293250 | 0.0 | - |
| 21.9028 | 293300 | 0.0 | - |
| 21.9065 | 293350 | 0.0001 | - |
| 21.9102 | 293400 | 0.0002 | - |
| 21.9140 | 293450 | 0.0002 | - |
| 21.9177 | 293500 | 0.0001 | - |
| 21.9214 | 293550 | 0.0002 | - |
| 21.9252 | 293600 | 0.0 | - |
| 21.9289 | 293650 | 0.0001 | - |
| 21.9326 | 293700 | 0.0002 | - |
| 21.9364 | 293750 | 0.0 | - |
| 21.9401 | 293800 | 0.0 | - |
| 21.9438 | 293850 | 0.0001 | - |
| 21.9476 | 293900 | 0.0 | - |
| 21.9513 | 293950 | 0.0 | - |
| 21.9550 | 294000 | 0.0 | - |
| 21.9588 | 294050 | 0.0 | - |
| 21.9625 | 294100 | 0.0 | - |
| 21.9662 | 294150 | 0.0 | - |
| 21.9700 | 294200 | 0.0 | - |
| 21.9737 | 294250 | 0.0001 | - |
| 21.9774 | 294300 | 0.0002 | - |
| 21.9812 | 294350 | 0.0001 | - |
| 21.9849 | 294400 | 0.0 | - |
| 21.9886 | 294450 | 0.0002 | - |
| 21.9924 | 294500 | 0.0 | - |
| 21.9961 | 294550 | 0.0 | - |
| 21.9999 | 294600 | 0.0 | - |
| 22.0036 | 294650 | 0.0 | - |
| 22.0073 | 294700 | 0.0 | - |
| 22.0111 | 294750 | 0.0 | - |
| 22.0148 | 294800 | 0.0 | - |
| 22.0185 | 294850 | 0.0 | - |
| 22.0223 | 294900 | 0.0 | - |
| 22.0260 | 294950 | 0.0003 | - |
| 22.0297 | 295000 | 0.0 | - |
| 22.0335 | 295050 | 0.0 | - |
| 22.0372 | 295100 | 0.0 | - |
| 22.0409 | 295150 | 0.0002 | - |
| 22.0447 | 295200 | 0.0001 | - |
| 22.0484 | 295250 | 0.0003 | - |
| 22.0521 | 295300 | 0.0 | - |
| 22.0559 | 295350 | 0.0001 | - |
| 22.0596 | 295400 | 0.0 | - |
| 22.0633 | 295450 | 0.0001 | - |
| 22.0671 | 295500 | 0.0 | - |
| 22.0708 | 295550 | 0.0 | - |
| 22.0745 | 295600 | 0.0002 | - |
| 22.0783 | 295650 | 0.0 | - |
| 22.0820 | 295700 | 0.0 | - |
| 22.0857 | 295750 | 0.0001 | - |
| 22.0895 | 295800 | 0.0 | - |
| 22.0932 | 295850 | 0.0 | - |
| 22.0969 | 295900 | 0.0002 | - |
| 22.1007 | 295950 | 0.0 | - |
| 22.1044 | 296000 | 0.0002 | - |
| 22.1081 | 296050 | 0.0 | - |
| 22.1119 | 296100 | 0.0 | - |
| 22.1156 | 296150 | 0.0002 | - |
| 22.1193 | 296200 | 0.0002 | - |
| 22.1231 | 296250 | 0.0002 | - |
| 22.1268 | 296300 | 0.0 | - |
| 22.1305 | 296350 | 0.0 | - |
| 22.1343 | 296400 | 0.0 | - |
| 22.1380 | 296450 | 0.0 | - |
| 22.1417 | 296500 | 0.0001 | - |
| 22.1455 | 296550 | 0.0 | - |
| 22.1492 | 296600 | 0.0 | - |
| 22.1529 | 296650 | 0.0002 | - |
| 22.1567 | 296700 | 0.0002 | - |
| 22.1604 | 296750 | 0.0 | - |
| 22.1641 | 296800 | 0.0 | - |
| 22.1679 | 296850 | 0.0002 | - |
| 22.1716 | 296900 | 0.0002 | - |
| 22.1753 | 296950 | 0.0001 | - |
| 22.1791 | 297000 | 0.0 | - |
| 22.1828 | 297050 | 0.0 | - |
| 22.1865 | 297100 | 0.0002 | - |
| 22.1903 | 297150 | 0.0 | - |
| 22.1940 | 297200 | 0.0 | - |
| 22.1977 | 297250 | 0.0 | - |
| 22.2015 | 297300 | 0.0 | - |
| 22.2052 | 297350 | 0.0002 | - |
| 22.2089 | 297400 | 0.0002 | - |
| 22.2127 | 297450 | 0.0 | - |
| 22.2164 | 297500 | 0.0002 | - |
| 22.2201 | 297550 | 0.0 | - |
| 22.2239 | 297600 | 0.0 | - |
| 22.2276 | 297650 | 0.0 | - |
| 22.2313 | 297700 | 0.0 | - |
| 22.2351 | 297750 | 0.0001 | - |
| 22.2388 | 297800 | 0.0 | - |
| 22.2426 | 297850 | 0.0 | - |
| 22.2463 | 297900 | 0.0 | - |
| 22.2500 | 297950 | 0.0 | - |
| 22.2538 | 298000 | 0.0 | - |
| 22.2575 | 298050 | 0.0 | - |
| 22.2612 | 298100 | 0.0 | - |
| 22.2650 | 298150 | 0.0002 | - |
| 22.2687 | 298200 | 0.0 | - |
| 22.2724 | 298250 | 0.0 | - |
| 22.2762 | 298300 | 0.0 | - |
| 22.2799 | 298350 | 0.0002 | - |
| 22.2836 | 298400 | 0.0 | - |
| 22.2874 | 298450 | 0.0 | - |
| 22.2911 | 298500 | 0.0002 | - |
| 22.2948 | 298550 | 0.0 | - |
| 22.2986 | 298600 | 0.0 | - |
| 22.3023 | 298650 | 0.0002 | - |
| 22.3060 | 298700 | 0.0 | - |
| 22.3098 | 298750 | 0.0 | - |
| 22.3135 | 298800 | 0.0 | - |
| 22.3172 | 298850 | 0.0001 | - |
| 22.3210 | 298900 | 0.0 | - |
| 22.3247 | 298950 | 0.0002 | - |
| 22.3284 | 299000 | 0.0002 | - |
| 22.3322 | 299050 | 0.0 | - |
| 22.3359 | 299100 | 0.0 | - |
| 22.3396 | 299150 | 0.0 | - |
| 22.3434 | 299200 | 0.0001 | - |
| 22.3471 | 299250 | 0.0003 | - |
| 22.3508 | 299300 | 0.0 | - |
| 22.3546 | 299350 | 0.0 | - |
| 22.3583 | 299400 | 0.0 | - |
| 22.3620 | 299450 | 0.0002 | - |
| 22.3658 | 299500 | 0.0001 | - |
| 22.3695 | 299550 | 0.0002 | - |
| 22.3732 | 299600 | 0.0 | - |
| 22.3770 | 299650 | 0.0003 | - |
| 22.3807 | 299700 | 0.0003 | - |
| 22.3844 | 299750 | 0.0 | - |
| 22.3882 | 299800 | 0.0 | - |
| 22.3919 | 299850 | 0.0002 | - |
| 22.3956 | 299900 | 0.0002 | - |
| 22.3994 | 299950 | 0.0003 | - |
| 22.4031 | 300000 | 0.0 | - |
| 22.4068 | 300050 | 0.0 | - |
| 22.4106 | 300100 | 0.0 | - |
| 22.4143 | 300150 | 0.0005 | - |
| 22.4180 | 300200 | 0.0 | - |
| 22.4218 | 300250 | 0.0002 | - |
| 22.4255 | 300300 | 0.0 | - |
| 22.4292 | 300350 | 0.0003 | - |
| 22.4330 | 300400 | 0.0 | - |
| 22.4367 | 300450 | 0.0 | - |
| 22.4404 | 300500 | 0.0 | - |
| 22.4442 | 300550 | 0.0002 | - |
| 22.4479 | 300600 | 0.0 | - |
| 22.4516 | 300650 | 0.0002 | - |
| 22.4554 | 300700 | 0.0 | - |
| 22.4591 | 300750 | 0.0 | - |
| 22.4628 | 300800 | 0.0002 | - |
| 22.4666 | 300850 | 0.0003 | - |
| 22.4703 | 300900 | 0.0 | - |
| 22.4740 | 300950 | 0.0 | - |
| 22.4778 | 301000 | 0.0 | - |
| 22.4815 | 301050 | 0.0005 | - |
| 22.4853 | 301100 | 0.0004 | - |
| 22.4890 | 301150 | 0.0 | - |
| 22.4927 | 301200 | 0.0 | - |
| 22.4965 | 301250 | 0.0 | - |
| 22.5002 | 301300 | 0.0002 | - |
| 22.5039 | 301350 | 0.0002 | - |
| 22.5077 | 301400 | 0.0 | - |
| 22.5114 | 301450 | 0.0001 | - |
| 22.5151 | 301500 | 0.0 | - |
| 22.5189 | 301550 | 0.0 | - |
| 22.5226 | 301600 | 0.0001 | - |
| 22.5263 | 301650 | 0.0 | - |
| 22.5301 | 301700 | 0.0 | - |
| 22.5338 | 301750 | 0.0 | - |
| 22.5375 | 301800 | 0.0001 | - |
| 22.5413 | 301850 | 0.0 | - |
| 22.5450 | 301900 | 0.0 | - |
| 22.5487 | 301950 | 0.0 | - |
| 22.5525 | 302000 | 0.0 | - |
| 22.5562 | 302050 | 0.0 | - |
| 22.5599 | 302100 | 0.0001 | - |
| 22.5637 | 302150 | 0.0 | - |
| 22.5674 | 302200 | 0.0 | - |
| 22.5711 | 302250 | 0.0002 | - |
| 22.5749 | 302300 | 0.0001 | - |
| 22.5786 | 302350 | 0.0 | - |
| 22.5823 | 302400 | 0.0002 | - |
| 22.5861 | 302450 | 0.0002 | - |
| 22.5898 | 302500 | 0.0 | - |
| 22.5935 | 302550 | 0.0002 | - |
| 22.5973 | 302600 | 0.0003 | - |
| 22.6010 | 302650 | 0.0002 | - |
| 22.6047 | 302700 | 0.0004 | - |
| 22.6085 | 302750 | 0.0002 | - |
| 22.6122 | 302800 | 0.0 | - |
| 22.6159 | 302850 | 0.0002 | - |
| 22.6197 | 302900 | 0.0003 | - |
| 22.6234 | 302950 | 0.0 | - |
| 22.6271 | 303000 | 0.0001 | - |
| 22.6309 | 303050 | 0.0 | - |
| 22.6346 | 303100 | 0.0 | - |
| 22.6383 | 303150 | 0.0002 | - |
| 22.6421 | 303200 | 0.0001 | - |
| 22.6458 | 303250 | 0.0 | - |
| 22.6495 | 303300 | 0.0 | - |
| 22.6533 | 303350 | 0.0 | - |
| 22.6570 | 303400 | 0.0003 | - |
| 22.6607 | 303450 | 0.0 | - |
| 22.6645 | 303500 | 0.0 | - |
| 22.6682 | 303550 | 0.0 | - |
| 22.6719 | 303600 | 0.0 | - |
| 22.6757 | 303650 | 0.0 | - |
| 22.6794 | 303700 | 0.0 | - |
| 22.6831 | 303750 | 0.0 | - |
| 22.6869 | 303800 | 0.0002 | - |
| 22.6906 | 303850 | 0.0 | - |
| 22.6943 | 303900 | 0.0 | - |
| 22.6981 | 303950 | 0.0003 | - |
| 22.7018 | 304000 | 0.0 | - |
| 22.7055 | 304050 | 0.0 | - |
| 22.7093 | 304100 | 0.0 | - |
| 22.7130 | 304150 | 0.0002 | - |
| 22.7168 | 304200 | 0.0 | - |
| 22.7205 | 304250 | 0.0 | - |
| 22.7242 | 304300 | 0.0 | - |
| 22.7280 | 304350 | 0.0 | - |
| 22.7317 | 304400 | 0.0 | - |
| 22.7354 | 304450 | 0.0 | - |
| 22.7392 | 304500 | 0.0003 | - |
| 22.7429 | 304550 | 0.0 | - |
| 22.7466 | 304600 | 0.0002 | - |
| 22.7504 | 304650 | 0.0002 | - |
| 22.7541 | 304700 | 0.0 | - |
| 22.7578 | 304750 | 0.0 | - |
| 22.7616 | 304800 | 0.0002 | - |
| 22.7653 | 304850 | 0.0003 | - |
| 22.7690 | 304900 | 0.0002 | - |
| 22.7728 | 304950 | 0.0 | - |
| 22.7765 | 305000 | 0.0002 | - |
| 22.7802 | 305050 | 0.0 | - |
| 22.7840 | 305100 | 0.0 | - |
| 22.7877 | 305150 | 0.0 | - |
| 22.7914 | 305200 | 0.0002 | - |
| 22.7952 | 305250 | 0.0 | - |
| 22.7989 | 305300 | 0.0 | - |
| 22.8026 | 305350 | 0.0002 | - |
| 22.8064 | 305400 | 0.0005 | - |
| 22.8101 | 305450 | 0.0 | - |
| 22.8138 | 305500 | 0.0002 | - |
| 22.8176 | 305550 | 0.0 | - |
| 22.8213 | 305600 | 0.0 | - |
| 22.8250 | 305650 | 0.0002 | - |
| 22.8288 | 305700 | 0.0 | - |
| 22.8325 | 305750 | 0.0002 | - |
| 22.8362 | 305800 | 0.0 | - |
| 22.8400 | 305850 | 0.0002 | - |
| 22.8437 | 305900 | 0.0 | - |
| 22.8474 | 305950 | 0.0002 | - |
| 22.8512 | 306000 | 0.0001 | - |
| 22.8549 | 306050 | 0.0 | - |
| 22.8586 | 306100 | 0.0002 | - |
| 22.8624 | 306150 | 0.0002 | - |
| 22.8661 | 306200 | 0.0 | - |
| 22.8698 | 306250 | 0.0002 | - |
| 22.8736 | 306300 | 0.0 | - |
| 22.8773 | 306350 | 0.0002 | - |
| 22.8810 | 306400 | 0.0 | - |
| 22.8848 | 306450 | 0.0002 | - |
| 22.8885 | 306500 | 0.0 | - |
| 22.8922 | 306550 | 0.0 | - |
| 22.8960 | 306600 | 0.0 | - |
| 22.8997 | 306650 | 0.0002 | - |
| 22.9034 | 306700 | 0.0 | - |
| 22.9072 | 306750 | 0.0 | - |
| 22.9109 | 306800 | 0.0 | - |
| 22.9146 | 306850 | 0.0 | - |
| 22.9184 | 306900 | 0.0 | - |
| 22.9221 | 306950 | 0.0003 | - |
| 22.9258 | 307000 | 0.0002 | - |
| 22.9296 | 307050 | 0.0002 | - |
| 22.9333 | 307100 | 0.0 | - |
| 22.9370 | 307150 | 0.0001 | - |
| 22.9408 | 307200 | 0.0 | - |
| 22.9445 | 307250 | 0.0 | - |
| 22.9482 | 307300 | 0.0 | - |
| 22.9520 | 307350 | 0.0002 | - |
| 22.9557 | 307400 | 0.0002 | - |
| 22.9595 | 307450 | 0.0 | - |
| 22.9632 | 307500 | 0.0 | - |
| 22.9669 | 307550 | 0.0002 | - |
| 22.9707 | 307600 | 0.0 | - |
| 22.9744 | 307650 | 0.0 | - |
| 22.9781 | 307700 | 0.0002 | - |
| 22.9819 | 307750 | 0.0 | - |
| 22.9856 | 307800 | 0.0 | - |
| 22.9893 | 307850 | 0.0002 | - |
| 22.9931 | 307900 | 0.0 | - |
| 22.9968 | 307950 | 0.0 | - |
| 23.0005 | 308000 | 0.0002 | - |
| 23.0043 | 308050 | 0.0 | - |
| 23.0080 | 308100 | 0.0 | - |
| 23.0117 | 308150 | 0.0 | - |
| 23.0155 | 308200 | 0.0 | - |
| 23.0192 | 308250 | 0.0001 | - |
| 23.0229 | 308300 | 0.0 | - |
| 23.0267 | 308350 | 0.0 | - |
| 23.0304 | 308400 | 0.0 | - |
| 23.0341 | 308450 | 0.0002 | - |
| 23.0379 | 308500 | 0.0002 | - |
| 23.0416 | 308550 | 0.0 | - |
| 23.0453 | 308600 | 0.0002 | - |
| 23.0491 | 308650 | 0.0 | - |
| 23.0528 | 308700 | 0.0 | - |
| 23.0565 | 308750 | 0.0 | - |
| 23.0603 | 308800 | 0.0 | - |
| 23.0640 | 308850 | 0.0 | - |
| 23.0677 | 308900 | 0.0002 | - |
| 23.0715 | 308950 | 0.0 | - |
| 23.0752 | 309000 | 0.0 | - |
| 23.0789 | 309050 | 0.0002 | - |
| 23.0827 | 309100 | 0.0001 | - |
| 23.0864 | 309150 | 0.0001 | - |
| 23.0901 | 309200 | 0.0 | - |
| 23.0939 | 309250 | 0.0002 | - |
| 23.0976 | 309300 | 0.0 | - |
| 23.1013 | 309350 | 0.0 | - |
| 23.1051 | 309400 | 0.0 | - |
| 23.1088 | 309450 | 0.0 | - |
| 23.1125 | 309500 | 0.0002 | - |
| 23.1163 | 309550 | 0.0 | - |
| 23.1200 | 309600 | 0.0 | - |
| 23.1237 | 309650 | 0.0 | - |
| 23.1275 | 309700 | 0.0 | - |
| 23.1312 | 309750 | 0.0003 | - |
| 23.1349 | 309800 | 0.0 | - |
| 23.1387 | 309850 | 0.0 | - |
| 23.1424 | 309900 | 0.0002 | - |
| 23.1461 | 309950 | 0.0002 | - |
| 23.1499 | 310000 | 0.0 | - |
| 23.1536 | 310050 | 0.0 | - |
| 23.1573 | 310100 | 0.0 | - |
| 23.1611 | 310150 | 0.0 | - |
| 23.1648 | 310200 | 0.0003 | - |
| 23.1685 | 310250 | 0.0 | - |
| 23.1723 | 310300 | 0.0 | - |
| 23.1760 | 310350 | 0.0 | - |
| 23.1797 | 310400 | 0.0 | - |
| 23.1835 | 310450 | 0.0 | - |
| 23.1872 | 310500 | 0.0001 | - |
| 23.1909 | 310550 | 0.0002 | - |
| 23.1947 | 310600 | 0.0 | - |
| 23.1984 | 310650 | 0.0 | - |
| 23.2022 | 310700 | 0.0002 | - |
| 23.2059 | 310750 | 0.0002 | - |
| 23.2096 | 310800 | 0.0002 | - |
| 23.2134 | 310850 | 0.0002 | - |
| 23.2171 | 310900 | 0.0 | - |
| 23.2208 | 310950 | 0.0 | - |
| 23.2246 | 311000 | 0.0002 | - |
| 23.2283 | 311050 | 0.0 | - |
| 23.2320 | 311100 | 0.0001 | - |
| 23.2358 | 311150 | 0.0 | - |
| 23.2395 | 311200 | 0.0002 | - |
| 23.2432 | 311250 | 0.0 | - |
| 23.2470 | 311300 | 0.0 | - |
| 23.2507 | 311350 | 0.0004 | - |
| 23.2544 | 311400 | 0.0004 | - |
| 23.2582 | 311450 | 0.0 | - |
| 23.2619 | 311500 | 0.0002 | - |
| 23.2656 | 311550 | 0.0002 | - |
| 23.2694 | 311600 | 0.0002 | - |
| 23.2731 | 311650 | 0.0 | - |
| 23.2768 | 311700 | 0.0 | - |
| 23.2806 | 311750 | 0.0 | - |
| 23.2843 | 311800 | 0.0 | - |
| 23.2880 | 311850 | 0.0002 | - |
| 23.2918 | 311900 | 0.0 | - |
| 23.2955 | 311950 | 0.0 | - |
| 23.2992 | 312000 | 0.0 | - |
| 23.3030 | 312050 | 0.0001 | - |
| 23.3067 | 312100 | 0.0 | - |
| 23.3104 | 312150 | 0.0 | - |
| 23.3142 | 312200 | 0.0 | - |
| 23.3179 | 312250 | 0.0 | - |
| 23.3216 | 312300 | 0.0 | - |
| 23.3254 | 312350 | 0.0 | - |
| 23.3291 | 312400 | 0.0001 | - |
| 23.3328 | 312450 | 0.0002 | - |
| 23.3366 | 312500 | 0.0 | - |
| 23.3403 | 312550 | 0.0 | - |
| 23.3440 | 312600 | 0.0 | - |
| 23.3478 | 312650 | 0.0 | - |
| 23.3515 | 312700 | 0.0001 | - |
| 23.3552 | 312750 | 0.0 | - |
| 23.3590 | 312800 | 0.0 | - |
| 23.3627 | 312850 | 0.0002 | - |
| 23.3664 | 312900 | 0.0 | - |
| 23.3702 | 312950 | 0.0002 | - |
| 23.3739 | 313000 | 0.0002 | - |
| 23.3776 | 313050 | 0.0 | - |
| 23.3814 | 313100 | 0.0 | - |
| 23.3851 | 313150 | 0.0 | - |
| 23.3888 | 313200 | 0.0001 | - |
| 23.3926 | 313250 | 0.0 | - |
| 23.3963 | 313300 | 0.0 | - |
| 23.4000 | 313350 | 0.0002 | - |
| 23.4038 | 313400 | 0.0001 | - |
| 23.4075 | 313450 | 0.0005 | - |
| 23.4112 | 313500 | 0.0 | - |
| 23.4150 | 313550 | 0.0003 | - |
| 23.4187 | 313600 | 0.0 | - |
| 23.4224 | 313650 | 0.0 | - |
| 23.4262 | 313700 | 0.0 | - |
| 23.4299 | 313750 | 0.0002 | - |
| 23.4336 | 313800 | 0.0002 | - |
| 23.4374 | 313850 | 0.0002 | - |
| 23.4411 | 313900 | 0.0003 | - |
| 23.4449 | 313950 | 0.0002 | - |
| 23.4486 | 314000 | 0.0002 | - |
| 23.4523 | 314050 | 0.0 | - |
| 23.4561 | 314100 | 0.0 | - |
| 23.4598 | 314150 | 0.0 | - |
| 23.4635 | 314200 | 0.0 | - |
| 23.4673 | 314250 | 0.0 | - |
| 23.4710 | 314300 | 0.0002 | - |
| 23.4747 | 314350 | 0.0 | - |
| 23.4785 | 314400 | 0.0001 | - |
| 23.4822 | 314450 | 0.0 | - |
| 23.4859 | 314500 | 0.0 | - |
| 23.4897 | 314550 | 0.0 | - |
| 23.4934 | 314600 | 0.0002 | - |
| 23.4971 | 314650 | 0.0 | - |
| 23.5009 | 314700 | 0.0 | - |
| 23.5046 | 314750 | 0.0 | - |
| 23.5083 | 314800 | 0.0 | - |
| 23.5121 | 314850 | 0.0002 | - |
| 23.5158 | 314900 | 0.0002 | - |
| 23.5195 | 314950 | 0.0001 | - |
| 23.5233 | 315000 | 0.0 | - |
| 23.5270 | 315050 | 0.0002 | - |
| 23.5307 | 315100 | 0.0 | - |
| 23.5345 | 315150 | 0.0 | - |
| 23.5382 | 315200 | 0.0 | - |
| 23.5419 | 315250 | 0.0001 | - |
| 23.5457 | 315300 | 0.0002 | - |
| 23.5494 | 315350 | 0.0002 | - |
| 23.5531 | 315400 | 0.0 | - |
| 23.5569 | 315450 | 0.0005 | - |
| 23.5606 | 315500 | 0.0005 | - |
| 23.5643 | 315550 | 0.0 | - |
| 23.5681 | 315600 | 0.0003 | - |
| 23.5718 | 315650 | 0.0001 | - |
| 23.5755 | 315700 | 0.0 | - |
| 23.5793 | 315750 | 0.0 | - |
| 23.5830 | 315800 | 0.0 | - |
| 23.5867 | 315850 | 0.0 | - |
| 23.5905 | 315900 | 0.0002 | - |
| 23.5942 | 315950 | 0.0002 | - |
| 23.5979 | 316000 | 0.0 | - |
| 23.6017 | 316050 | 0.0 | - |
| 23.6054 | 316100 | 0.0002 | - |
| 23.6091 | 316150 | 0.0002 | - |
| 23.6129 | 316200 | 0.0002 | - |
| 23.6166 | 316250 | 0.0 | - |
| 23.6203 | 316300 | 0.0 | - |
| 23.6241 | 316350 | 0.0002 | - |
| 23.6278 | 316400 | 0.0 | - |
| 23.6315 | 316450 | 0.0 | - |
| 23.6353 | 316500 | 0.0 | - |
| 23.6390 | 316550 | 0.0003 | - |
| 23.6427 | 316600 | 0.0001 | - |
| 23.6465 | 316650 | 0.0 | - |
| 23.6502 | 316700 | 0.0003 | - |
| 23.6539 | 316750 | 0.0001 | - |
| 23.6577 | 316800 | 0.0002 | - |
| 23.6614 | 316850 | 0.0002 | - |
| 23.6651 | 316900 | 0.0005 | - |
| 23.6689 | 316950 | 0.0002 | - |
| 23.6726 | 317000 | 0.0 | - |
| 23.6763 | 317050 | 0.0002 | - |
| 23.6801 | 317100 | 0.0003 | - |
| 23.6838 | 317150 | 0.0 | - |
| 23.6876 | 317200 | 0.0 | - |
| 23.6913 | 317250 | 0.0002 | - |
| 23.6950 | 317300 | 0.0 | - |
| 23.6988 | 317350 | 0.0002 | - |
| 23.7025 | 317400 | 0.0 | - |
| 23.7062 | 317450 | 0.0002 | - |
| 23.7100 | 317500 | 0.0 | - |
| 23.7137 | 317550 | 0.0 | - |
| 23.7174 | 317600 | 0.0 | - |
| 23.7212 | 317650 | 0.0002 | - |
| 23.7249 | 317700 | 0.0 | - |
| 23.7286 | 317750 | 0.0 | - |
| 23.7324 | 317800 | 0.0002 | - |
| 23.7361 | 317850 | 0.0 | - |
| 23.7398 | 317900 | 0.0 | - |
| 23.7436 | 317950 | 0.0002 | - |
| 23.7473 | 318000 | 0.0002 | - |
| 23.7510 | 318050 | 0.0002 | - |
| 23.7548 | 318100 | 0.0 | - |
| 23.7585 | 318150 | 0.0 | - |
| 23.7622 | 318200 | 0.0 | - |
| 23.7660 | 318250 | 0.0 | - |
| 23.7697 | 318300 | 0.0001 | - |
| 23.7734 | 318350 | 0.0 | - |
| 23.7772 | 318400 | 0.0 | - |
| 23.7809 | 318450 | 0.0 | - |
| 23.7846 | 318500 | 0.0 | - |
| 23.7884 | 318550 | 0.0002 | - |
| 23.7921 | 318600 | 0.0002 | - |
| 23.7958 | 318650 | 0.0 | - |
| 23.7996 | 318700 | 0.0 | - |
| 23.8033 | 318750 | 0.0 | - |
| 23.8070 | 318800 | 0.0 | - |
| 23.8108 | 318850 | 0.0 | - |
| 23.8145 | 318900 | 0.0 | - |
| 23.8182 | 318950 | 0.0 | - |
| 23.8220 | 319000 | 0.0 | - |
| 23.8257 | 319050 | 0.0 | - |
| 23.8294 | 319100 | 0.0003 | - |
| 23.8332 | 319150 | 0.0 | - |
| 23.8369 | 319200 | 0.0 | - |
| 23.8406 | 319250 | 0.0 | - |
| 23.8444 | 319300 | 0.0 | - |
| 23.8481 | 319350 | 0.0 | - |
| 23.8518 | 319400 | 0.0002 | - |
| 23.8556 | 319450 | 0.0 | - |
| 23.8593 | 319500 | 0.0 | - |
| 23.8630 | 319550 | 0.0002 | - |
| 23.8668 | 319600 | 0.0 | - |
| 23.8705 | 319650 | 0.0003 | - |
| 23.8742 | 319700 | 0.0 | - |
| 23.8780 | 319750 | 0.0002 | - |
| 23.8817 | 319800 | 0.0001 | - |
| 23.8854 | 319850 | 0.0 | - |
| 23.8892 | 319900 | 0.0002 | - |
| 23.8929 | 319950 | 0.0 | - |
| 23.8966 | 320000 | 0.0001 | - |
| 23.9004 | 320050 | 0.0 | - |
| 23.9041 | 320100 | 0.0 | - |
| 23.9078 | 320150 | 0.0002 | - |
| 23.9116 | 320200 | 0.0 | - |
| 23.9153 | 320250 | 0.0 | - |
| 23.9191 | 320300 | 0.0 | - |
| 23.9228 | 320350 | 0.0 | - |
| 23.9265 | 320400 | 0.0 | - |
| 23.9303 | 320450 | 0.0002 | - |
| 23.9340 | 320500 | 0.0002 | - |
| 23.9377 | 320550 | 0.0 | - |
| 23.9415 | 320600 | 0.0002 | - |
| 23.9452 | 320650 | 0.0 | - |
| 23.9489 | 320700 | 0.0 | - |
| 23.9527 | 320750 | 0.0 | - |
| 23.9564 | 320800 | 0.0 | - |
| 23.9601 | 320850 | 0.0002 | - |
| 23.9639 | 320900 | 0.0 | - |
| 23.9676 | 320950 | 0.0002 | - |
| 23.9713 | 321000 | 0.0002 | - |
| 23.9751 | 321050 | 0.0002 | - |
| 23.9788 | 321100 | 0.0 | - |
| 23.9825 | 321150 | 0.0 | - |
| 23.9863 | 321200 | 0.0001 | - |
| 23.9900 | 321250 | 0.0002 | - |
| 23.9937 | 321300 | 0.0 | - |
| 23.9975 | 321350 | 0.0002 | - |
| 24.0012 | 321400 | 0.0002 | - |
| 24.0049 | 321450 | 0.0 | - |
| 24.0087 | 321500 | 0.0002 | - |
| 24.0124 | 321550 | 0.0 | - |
| 24.0161 | 321600 | 0.0002 | - |
| 24.0199 | 321650 | 0.0 | - |
| 24.0236 | 321700 | 0.0 | - |
| 24.0273 | 321750 | 0.0 | - |
| 24.0311 | 321800 | 0.0 | - |
| 24.0348 | 321850 | 0.0 | - |
| 24.0385 | 321900 | 0.0 | - |
| 24.0423 | 321950 | 0.0001 | - |
| 24.0460 | 322000 | 0.0 | - |
| 24.0497 | 322050 | 0.0 | - |
| 24.0535 | 322100 | 0.0001 | - |
| 24.0572 | 322150 | 0.0 | - |
| 24.0609 | 322200 | 0.0 | - |
| 24.0647 | 322250 | 0.0003 | - |
| 24.0684 | 322300 | 0.0 | - |
| 24.0721 | 322350 | 0.0 | - |
| 24.0759 | 322400 | 0.0002 | - |
| 24.0796 | 322450 | 0.0 | - |
| 24.0833 | 322500 | 0.0 | - |
| 24.0871 | 322550 | 0.0 | - |
| 24.0908 | 322600 | 0.0 | - |
| 24.0945 | 322650 | 0.0 | - |
| 24.0983 | 322700 | 0.0 | - |
| 24.1020 | 322750 | 0.0002 | - |
| 24.1057 | 322800 | 0.0 | - |
| 24.1095 | 322850 | 0.0002 | - |
| 24.1132 | 322900 | 0.0 | - |
| 24.1169 | 322950 | 0.0002 | - |
| 24.1207 | 323000 | 0.0 | - |
| 24.1244 | 323050 | 0.0 | - |
| 24.1281 | 323100 | 0.0 | - |
| 24.1319 | 323150 | 0.0 | - |
| 24.1356 | 323200 | 0.0002 | - |
| 24.1393 | 323250 | 0.0003 | - |
| 24.1431 | 323300 | 0.0003 | - |
| 24.1468 | 323350 | 0.0002 | - |
| 24.1505 | 323400 | 0.0 | - |
| 24.1543 | 323450 | 0.0 | - |
| 24.1580 | 323500 | 0.0001 | - |
| 24.1618 | 323550 | 0.0004 | - |
| 24.1655 | 323600 | 0.0 | - |
| 24.1692 | 323650 | 0.0002 | - |
| 24.1730 | 323700 | 0.0 | - |
| 24.1767 | 323750 | 0.0002 | - |
| 24.1804 | 323800 | 0.0 | - |
| 24.1842 | 323850 | 0.0 | - |
| 24.1879 | 323900 | 0.0 | - |
| 24.1916 | 323950 | 0.0 | - |
| 24.1954 | 324000 | 0.0 | - |
| 24.1991 | 324050 | 0.0 | - |
| 24.2028 | 324100 | 0.0002 | - |
| 24.2066 | 324150 | 0.0003 | - |
| 24.2103 | 324200 | 0.0 | - |
| 24.2140 | 324250 | 0.0001 | - |
| 24.2178 | 324300 | 0.0 | - |
| 24.2215 | 324350 | 0.0002 | - |
| 24.2252 | 324400 | 0.0 | - |
| 24.2290 | 324450 | 0.0002 | - |
| 24.2327 | 324500 | 0.0002 | - |
| 24.2364 | 324550 | 0.0 | - |
| 24.2402 | 324600 | 0.0001 | - |
| 24.2439 | 324650 | 0.0002 | - |
| 24.2476 | 324700 | 0.0002 | - |
| 24.2514 | 324750 | 0.0 | - |
| 24.2551 | 324800 | 0.0002 | - |
| 24.2588 | 324850 | 0.0 | - |
| 24.2626 | 324900 | 0.0 | - |
| 24.2663 | 324950 | 0.0002 | - |
| 24.2700 | 325000 | 0.0 | - |
| 24.2738 | 325050 | 0.0001 | - |
| 24.2775 | 325100 | 0.0002 | - |
| 24.2812 | 325150 | 0.0 | - |
| 24.2850 | 325200 | 0.0 | - |
| 24.2887 | 325250 | 0.0002 | - |
| 24.2924 | 325300 | 0.0 | - |
| 24.2962 | 325350 | 0.0002 | - |
| 24.2999 | 325400 | 0.0 | - |
| 24.3036 | 325450 | 0.0 | - |
| 24.3074 | 325500 | 0.0 | - |
| 24.3111 | 325550 | 0.0 | - |
| 24.3148 | 325600 | 0.0 | - |
| 24.3186 | 325650 | 0.0 | - |
| 24.3223 | 325700 | 0.0 | - |
| 24.3260 | 325750 | 0.0003 | - |
| 24.3298 | 325800 | 0.0001 | - |
| 24.3335 | 325850 | 0.0002 | - |
| 24.3372 | 325900 | 0.0 | - |
| 24.3410 | 325950 | 0.0 | - |
| 24.3447 | 326000 | 0.0 | - |
| 24.3484 | 326050 | 0.0 | - |
| 24.3522 | 326100 | 0.0 | - |
| 24.3559 | 326150 | 0.0 | - |
| 24.3596 | 326200 | 0.0 | - |
| 24.3634 | 326250 | 0.0 | - |
| 24.3671 | 326300 | 0.0 | - |
| 24.3708 | 326350 | 0.0001 | - |
| 24.3746 | 326400 | 0.0002 | - |
| 24.3783 | 326450 | 0.0 | - |
| 24.3820 | 326500 | 0.0 | - |
| 24.3858 | 326550 | 0.0001 | - |
| 24.3895 | 326600 | 0.0 | - |
| 24.3932 | 326650 | 0.0 | - |
| 24.3970 | 326700 | 0.0 | - |
| 24.4007 | 326750 | 0.0 | - |
| 24.4045 | 326800 | 0.0002 | - |
| 24.4082 | 326850 | 0.0 | - |
| 24.4119 | 326900 | 0.0 | - |
| 24.4157 | 326950 | 0.0 | - |
| 24.4194 | 327000 | 0.0 | - |
| 24.4231 | 327050 | 0.0002 | - |
| 24.4269 | 327100 | 0.0002 | - |
| 24.4306 | 327150 | 0.0002 | - |
| 24.4343 | 327200 | 0.0 | - |
| 24.4381 | 327250 | 0.0 | - |
| 24.4418 | 327300 | 0.0 | - |
| 24.4455 | 327350 | 0.0 | - |
| 24.4493 | 327400 | 0.0 | - |
| 24.4530 | 327450 | 0.0002 | - |
| 24.4567 | 327500 | 0.0 | - |
| 24.4605 | 327550 | 0.0 | - |
| 24.4642 | 327600 | 0.0 | - |
| 24.4679 | 327650 | 0.0 | - |
| 24.4717 | 327700 | 0.0001 | - |
| 24.4754 | 327750 | 0.0002 | - |
| 24.4791 | 327800 | 0.0 | - |
| 24.4829 | 327850 | 0.0 | - |
| 24.4866 | 327900 | 0.0 | - |
| 24.4903 | 327950 | 0.0 | - |
| 24.4941 | 328000 | 0.0 | - |
| 24.4978 | 328050 | 0.0 | - |
| 24.5015 | 328100 | 0.0003 | - |
| 24.5053 | 328150 | 0.0 | - |
| 24.5090 | 328200 | 0.0002 | - |
| 24.5127 | 328250 | 0.0 | - |
| 24.5165 | 328300 | 0.0 | - |
| 24.5202 | 328350 | 0.0002 | - |
| 24.5239 | 328400 | 0.0 | - |
| 24.5277 | 328450 | 0.0 | - |
| 24.5314 | 328500 | 0.0 | - |
| 24.5351 | 328550 | 0.0 | - |
| 24.5389 | 328600 | 0.0 | - |
| 24.5426 | 328650 | 0.0 | - |
| 24.5463 | 328700 | 0.0 | - |
| 24.5501 | 328750 | 0.0 | - |
| 24.5538 | 328800 | 0.0 | - |
| 24.5575 | 328850 | 0.0 | - |
| 24.5613 | 328900 | 0.0 | - |
| 24.5650 | 328950 | 0.0 | - |
| 24.5687 | 329000 | 0.0 | - |
| 24.5725 | 329050 | 0.0 | - |
| 24.5762 | 329100 | 0.0 | - |
| 24.5799 | 329150 | 0.0002 | - |
| 24.5837 | 329200 | 0.0 | - |
| 24.5874 | 329250 | 0.0 | - |
| 24.5911 | 329300 | 0.0 | - |
| 24.5949 | 329350 | 0.0 | - |
| 24.5986 | 329400 | 0.0004 | - |
| 24.6023 | 329450 | 0.0002 | - |
| 24.6061 | 329500 | 0.0002 | - |
| 24.6098 | 329550 | 0.0002 | - |
| 24.6135 | 329600 | 0.0 | - |
| 24.6173 | 329650 | 0.0 | - |
| 24.6210 | 329700 | 0.0 | - |
| 24.6247 | 329750 | 0.0 | - |
| 24.6285 | 329800 | 0.0002 | - |
| 24.6322 | 329850 | 0.0 | - |
| 24.6359 | 329900 | 0.0 | - |
| 24.6397 | 329950 | 0.0002 | - |
| 24.6434 | 330000 | 0.0 | - |
| 24.6472 | 330050 | 0.0 | - |
| 24.6509 | 330100 | 0.0002 | - |
| 24.6546 | 330150 | 0.0 | - |
| 24.6584 | 330200 | 0.0002 | - |
| 24.6621 | 330250 | 0.0 | - |
| 24.6658 | 330300 | 0.0003 | - |
| 24.6696 | 330350 | 0.0 | - |
| 24.6733 | 330400 | 0.0 | - |
| 24.6770 | 330450 | 0.0 | - |
| 24.6808 | 330500 | 0.0 | - |
| 24.6845 | 330550 | 0.0 | - |
| 24.6882 | 330600 | 0.0 | - |
| 24.6920 | 330650 | 0.0001 | - |
| 24.6957 | 330700 | 0.0 | - |
| 24.6994 | 330750 | 0.0 | - |
| 24.7032 | 330800 | 0.0001 | - |
| 24.7069 | 330850 | 0.0 | - |
| 24.7106 | 330900 | 0.0 | - |
| 24.7144 | 330950 | 0.0 | - |
| 24.7181 | 331000 | 0.0 | - |
| 24.7218 | 331050 | 0.0 | - |
| 24.7256 | 331100 | 0.0 | - |
| 24.7293 | 331150 | 0.0003 | - |
| 24.7330 | 331200 | 0.0 | - |
| 24.7368 | 331250 | 0.0002 | - |
| 24.7405 | 331300 | 0.0 | - |
| 24.7442 | 331350 | 0.0 | - |
| 24.7480 | 331400 | 0.0 | - |
| 24.7517 | 331450 | 0.0 | - |
| 24.7554 | 331500 | 0.0001 | - |
| 24.7592 | 331550 | 0.0002 | - |
| 24.7629 | 331600 | 0.0 | - |
| 24.7666 | 331650 | 0.0002 | - |
| 24.7704 | 331700 | 0.0002 | - |
| 24.7741 | 331750 | 0.0 | - |
| 24.7778 | 331800 | 0.0 | - |
| 24.7816 | 331850 | 0.0002 | - |
| 24.7853 | 331900 | 0.0 | - |
| 24.7890 | 331950 | 0.0 | - |
| 24.7928 | 332000 | 0.0 | - |
| 24.7965 | 332050 | 0.0 | - |
| 24.8002 | 332100 | 0.0 | - |
| 24.8040 | 332150 | 0.0 | - |
| 24.8077 | 332200 | 0.0 | - |
| 24.8114 | 332250 | 0.0002 | - |
| 24.8152 | 332300 | 0.0 | - |
| 24.8189 | 332350 | 0.0 | - |
| 24.8226 | 332400 | 0.0 | - |
| 24.8264 | 332450 | 0.0 | - |
| 24.8301 | 332500 | 0.0 | - |
| 24.8338 | 332550 | 0.0 | - |
| 24.8376 | 332600 | 0.0002 | - |
| 24.8413 | 332650 | 0.0001 | - |
| 24.8450 | 332700 | 0.0 | - |
| 24.8488 | 332750 | 0.0001 | - |
| 24.8525 | 332800 | 0.0 | - |
| 24.8562 | 332850 | 0.0 | - |
| 24.8600 | 332900 | 0.0002 | - |
| 24.8637 | 332950 | 0.0002 | - |
| 24.8674 | 333000 | 0.0 | - |
| 24.8712 | 333050 | 0.0003 | - |
| 24.8749 | 333100 | 0.0 | - |
| 24.8786 | 333150 | 0.0003 | - |
| 24.8824 | 333200 | 0.0002 | - |
| 24.8861 | 333250 | 0.0 | - |
| 24.8899 | 333300 | 0.0 | - |
| 24.8936 | 333350 | 0.0 | - |
| 24.8973 | 333400 | 0.0 | - |
| 24.9011 | 333450 | 0.0002 | - |
| 24.9048 | 333500 | 0.0 | - |
| 24.9085 | 333550 | 0.0 | - |
| 24.9123 | 333600 | 0.0 | - |
| 24.9160 | 333650 | 0.0 | - |
| 24.9197 | 333700 | 0.0002 | - |
| 24.9235 | 333750 | 0.0002 | - |
| 24.9272 | 333800 | 0.0002 | - |
| 24.9309 | 333850 | 0.0003 | - |
| 24.9347 | 333900 | 0.0002 | - |
| 24.9384 | 333950 | 0.0001 | - |
| 24.9421 | 334000 | 0.0001 | - |
| 24.9459 | 334050 | 0.0004 | - |
| 24.9496 | 334100 | 0.0001 | - |
| 24.9533 | 334150 | 0.0 | - |
| 24.9571 | 334200 | 0.0 | - |
| 24.9608 | 334250 | 0.0 | - |
| 24.9645 | 334300 | 0.0 | - |
| 24.9683 | 334350 | 0.0 | - |
| 24.9720 | 334400 | 0.0002 | - |
| 24.9757 | 334450 | 0.0 | - |
| 24.9795 | 334500 | 0.0 | - |
| 24.9832 | 334550 | 0.0002 | - |
| 24.9869 | 334600 | 0.0 | - |
| 24.9907 | 334650 | 0.0 | - |
| 24.9944 | 334700 | 0.0 | - |
| 24.9981 | 334750 | 0.0 | - |
| 25.0019 | 334800 | 0.0 | - |
| 25.0056 | 334850 | 0.0 | - |
| 25.0093 | 334900 | 0.0 | - |
| 25.0131 | 334950 | 0.0001 | - |
| 25.0168 | 335000 | 0.0 | - |
| 25.0205 | 335050 | 0.0 | - |
| 25.0243 | 335100 | 0.0002 | - |
| 25.0280 | 335150 | 0.0 | - |
| 25.0317 | 335200 | 0.0003 | - |
| 25.0355 | 335250 | 0.0 | - |
| 25.0392 | 335300 | 0.0002 | - |
| 25.0429 | 335350 | 0.0 | - |
| 25.0467 | 335400 | 0.0 | - |
| 25.0504 | 335450 | 0.0 | - |
| 25.0541 | 335500 | 0.0002 | - |
| 25.0579 | 335550 | 0.0 | - |
| 25.0616 | 335600 | 0.0 | - |
| 25.0653 | 335650 | 0.0 | - |
| 25.0691 | 335700 | 0.0 | - |
| 25.0728 | 335750 | 0.0 | - |
| 25.0765 | 335800 | 0.0 | - |
| 25.0803 | 335850 | 0.0002 | - |
| 25.0840 | 335900 | 0.0002 | - |
| 25.0877 | 335950 | 0.0 | - |
| 25.0915 | 336000 | 0.0 | - |
| 25.0952 | 336050 | 0.0 | - |
| 25.0989 | 336100 | 0.0002 | - |
| 25.1027 | 336150 | 0.0 | - |
| 25.1064 | 336200 | 0.0 | - |
| 25.1101 | 336250 | 0.0 | - |
| 25.1139 | 336300 | 0.0001 | - |
| 25.1176 | 336350 | 0.0001 | - |
| 25.1214 | 336400 | 0.0 | - |
| 25.1251 | 336450 | 0.0 | - |
| 25.1288 | 336500 | 0.0 | - |
| 25.1326 | 336550 | 0.0 | - |
| 25.1363 | 336600 | 0.0 | - |
| 25.1400 | 336650 | 0.0002 | - |
| 25.1438 | 336700 | 0.0001 | - |
| 25.1475 | 336750 | 0.0 | - |
| 25.1512 | 336800 | 0.0 | - |
| 25.1550 | 336850 | 0.0 | - |
| 25.1587 | 336900 | 0.0001 | - |
| 25.1624 | 336950 | 0.0002 | - |
| 25.1662 | 337000 | 0.0 | - |
| 25.1699 | 337050 | 0.0001 | - |
| 25.1736 | 337100 | 0.0 | - |
| 25.1774 | 337150 | 0.0 | - |
| 25.1811 | 337200 | 0.0002 | - |
| 25.1848 | 337250 | 0.0 | - |
| 25.1886 | 337300 | 0.0002 | - |
| 25.1923 | 337350 | 0.0002 | - |
| 25.1960 | 337400 | 0.0 | - |
| 25.1998 | 337450 | 0.0 | - |
| 25.2035 | 337500 | 0.0 | - |
| 25.2072 | 337550 | 0.0 | - |
| 25.2110 | 337600 | 0.0002 | - |
| 25.2147 | 337650 | 0.0 | - |
| 25.2184 | 337700 | 0.0002 | - |
| 25.2222 | 337750 | 0.0 | - |
| 25.2259 | 337800 | 0.0 | - |
| 25.2296 | 337850 | 0.0 | - |
| 25.2334 | 337900 | 0.0 | - |
| 25.2371 | 337950 | 0.0 | - |
| 25.2408 | 338000 | 0.0 | - |
| 25.2446 | 338050 | 0.0002 | - |
| 25.2483 | 338100 | 0.0 | - |
| 25.2520 | 338150 | 0.0002 | - |
| 25.2558 | 338200 | 0.0 | - |
| 25.2595 | 338250 | 0.0002 | - |
| 25.2632 | 338300 | 0.0 | - |
| 25.2670 | 338350 | 0.0 | - |
| 25.2707 | 338400 | 0.0 | - |
| 25.2744 | 338450 | 0.0 | - |
| 25.2782 | 338500 | 0.0002 | - |
| 25.2819 | 338550 | 0.0 | - |
| 25.2856 | 338600 | 0.0 | - |
| 25.2894 | 338650 | 0.0001 | - |
| 25.2931 | 338700 | 0.0 | - |
| 25.2968 | 338750 | 0.0 | - |
| 25.3006 | 338800 | 0.0 | - |
| 25.3043 | 338850 | 0.0 | - |
| 25.3080 | 338900 | 0.0 | - |
| 25.3118 | 338950 | 0.0 | - |
| 25.3155 | 339000 | 0.0001 | - |
| 25.3192 | 339050 | 0.0 | - |
| 25.3230 | 339100 | 0.0 | - |
| 25.3267 | 339150 | 0.0002 | - |
| 25.3304 | 339200 | 0.0 | - |
| 25.3342 | 339250 | 0.0 | - |
| 25.3379 | 339300 | 0.0002 | - |
| 25.3416 | 339350 | 0.0002 | - |
| 25.3454 | 339400 | 0.0 | - |
| 25.3491 | 339450 | 0.0 | - |
| 25.3528 | 339500 | 0.0 | - |
| 25.3566 | 339550 | 0.0 | - |
| 25.3603 | 339600 | 0.0 | - |
| 25.3641 | 339650 | 0.0 | - |
| 25.3678 | 339700 | 0.0002 | - |
| 25.3715 | 339750 | 0.0 | - |
| 25.3753 | 339800 | 0.0002 | - |
| 25.3790 | 339850 | 0.0 | - |
| 25.3827 | 339900 | 0.0002 | - |
| 25.3865 | 339950 | 0.0002 | - |
| 25.3902 | 340000 | 0.0 | - |
| 25.3939 | 340050 | 0.0002 | - |
| 25.3977 | 340100 | 0.0 | - |
| 25.4014 | 340150 | 0.0001 | - |
| 25.4051 | 340200 | 0.0001 | - |
| 25.4089 | 340250 | 0.0 | - |
| 25.4126 | 340300 | 0.0 | - |
| 25.4163 | 340350 | 0.0 | - |
| 25.4201 | 340400 | 0.0002 | - |
| 25.4238 | 340450 | 0.0002 | - |
| 25.4275 | 340500 | 0.0 | - |
| 25.4313 | 340550 | 0.0002 | - |
| 25.4350 | 340600 | 0.0 | - |
| 25.4387 | 340650 | 0.0 | - |
| 25.4425 | 340700 | 0.0002 | - |
| 25.4462 | 340750 | 0.0 | - |
| 25.4499 | 340800 | 0.0 | - |
| 25.4537 | 340850 | 0.0 | - |
| 25.4574 | 340900 | 0.0 | - |
| 25.4611 | 340950 | 0.0 | - |
| 25.4649 | 341000 | 0.0 | - |
| 25.4686 | 341050 | 0.0002 | - |
| 25.4723 | 341100 | 0.0 | - |
| 25.4761 | 341150 | 0.0 | - |
| 25.4798 | 341200 | 0.0002 | - |
| 25.4835 | 341250 | 0.0 | - |
| 25.4873 | 341300 | 0.0 | - |
| 25.4910 | 341350 | 0.0 | - |
| 25.4947 | 341400 | 0.0 | - |
| 25.4985 | 341450 | 0.0 | - |
| 25.5022 | 341500 | 0.0 | - |
| 25.5059 | 341550 | 0.0 | - |
| 25.5097 | 341600 | 0.0 | - |
| 25.5134 | 341650 | 0.0 | - |
| 25.5171 | 341700 | 0.0 | - |
| 25.5209 | 341750 | 0.0005 | - |
| 25.5246 | 341800 | 0.0 | - |
| 25.5283 | 341850 | 0.0 | - |
| 25.5321 | 341900 | 0.0 | - |
| 25.5358 | 341950 | 0.0 | - |
| 25.5395 | 342000 | 0.0003 | - |
| 25.5433 | 342050 | 0.0 | - |
| 25.5470 | 342100 | 0.0 | - |
| 25.5507 | 342150 | 0.0 | - |
| 25.5545 | 342200 | 0.0 | - |
| 25.5582 | 342250 | 0.0 | - |
| 25.5619 | 342300 | 0.0 | - |
| 25.5657 | 342350 | 0.0 | - |
| 25.5694 | 342400 | 0.0002 | - |
| 25.5731 | 342450 | 0.0 | - |
| 25.5769 | 342500 | 0.0002 | - |
| 25.5806 | 342550 | 0.0 | - |
| 25.5843 | 342600 | 0.0 | - |
| 25.5881 | 342650 | 0.0 | - |
| 25.5918 | 342700 | 0.0 | - |
| 25.5955 | 342750 | 0.0 | - |
| 25.5993 | 342800 | 0.0002 | - |
| 25.6030 | 342850 | 0.0 | - |
| 25.6068 | 342900 | 0.0002 | - |
| 25.6105 | 342950 | 0.0 | - |
| 25.6142 | 343000 | 0.0 | - |
| 25.6180 | 343050 | 0.0 | - |
| 25.6217 | 343100 | 0.0 | - |
| 25.6254 | 343150 | 0.0002 | - |
| 25.6292 | 343200 | 0.0 | - |
| 25.6329 | 343250 | 0.0 | - |
| 25.6366 | 343300 | 0.0 | - |
| 25.6404 | 343350 | 0.0002 | - |
| 25.6441 | 343400 | 0.0 | - |
| 25.6478 | 343450 | 0.0 | - |
| 25.6516 | 343500 | 0.0 | - |
| 25.6553 | 343550 | 0.0 | - |
| 25.6590 | 343600 | 0.0002 | - |
| 25.6628 | 343650 | 0.0002 | - |
| 25.6665 | 343700 | 0.0 | - |
| 25.6702 | 343750 | 0.0002 | - |
| 25.6740 | 343800 | 0.0001 | - |
| 25.6777 | 343850 | 0.0002 | - |
| 25.6814 | 343900 | 0.0 | - |
| 25.6852 | 343950 | 0.0 | - |
| 25.6889 | 344000 | 0.0002 | - |
| 25.6926 | 344050 | 0.0 | - |
| 25.6964 | 344100 | 0.0 | - |
| 25.7001 | 344150 | 0.0003 | - |
| 25.7038 | 344200 | 0.0004 | - |
| 25.7076 | 344250 | 0.0003 | - |
| 25.7113 | 344300 | 0.0 | - |
| 25.7150 | 344350 | 0.0 | - |
| 25.7188 | 344400 | 0.0 | - |
| 25.7225 | 344450 | 0.0 | - |
| 25.7262 | 344500 | 0.0 | - |
| 25.7300 | 344550 | 0.0002 | - |
| 25.7337 | 344600 | 0.0 | - |
| 25.7374 | 344650 | 0.0 | - |
| 25.7412 | 344700 | 0.0 | - |
| 25.7449 | 344750 | 0.0 | - |
| 25.7486 | 344800 | 0.0002 | - |
| 25.7524 | 344850 | 0.0 | - |
| 25.7561 | 344900 | 0.0003 | - |
| 25.7598 | 344950 | 0.0 | - |
| 25.7636 | 345000 | 0.0 | - |
| 25.7673 | 345050 | 0.0 | - |
| 25.7710 | 345100 | 0.0002 | - |
| 25.7748 | 345150 | 0.0 | - |
| 25.7785 | 345200 | 0.0002 | - |
| 25.7822 | 345250 | 0.0 | - |
| 25.7860 | 345300 | 0.0 | - |
| 25.7897 | 345350 | 0.0 | - |
| 25.7934 | 345400 | 0.0 | - |
| 25.7972 | 345450 | 0.0004 | - |
| 25.8009 | 345500 | 0.0001 | - |
| 25.8046 | 345550 | 0.0002 | - |
| 25.8084 | 345600 | 0.0003 | - |
| 25.8121 | 345650 | 0.0 | - |
| 25.8158 | 345700 | 0.0002 | - |
| 25.8196 | 345750 | 0.0 | - |
| 25.8233 | 345800 | 0.0 | - |
| 25.8270 | 345850 | 0.0 | - |
| 25.8308 | 345900 | 0.0002 | - |
| 25.8345 | 345950 | 0.0 | - |
| 25.8382 | 346000 | 0.0 | - |
| 25.8420 | 346050 | 0.0002 | - |
| 25.8457 | 346100 | 0.0 | - |
| 25.8495 | 346150 | 0.0 | - |
| 25.8532 | 346200 | 0.0 | - |
| 25.8569 | 346250 | 0.0 | - |
| 25.8607 | 346300 | 0.0 | - |
| 25.8644 | 346350 | 0.0002 | - |
| 25.8681 | 346400 | 0.0 | - |
| 25.8719 | 346450 | 0.0 | - |
| 25.8756 | 346500 | 0.0 | - |
| 25.8793 | 346550 | 0.0 | - |
| 25.8831 | 346600 | 0.0 | - |
| 25.8868 | 346650 | 0.0002 | - |
| 25.8905 | 346700 | 0.0 | - |
| 25.8943 | 346750 | 0.0002 | - |
| 25.8980 | 346800 | 0.0 | - |
| 25.9017 | 346850 | 0.0 | - |
| 25.9055 | 346900 | 0.0003 | - |
| 25.9092 | 346950 | 0.0 | - |
| 25.9129 | 347000 | 0.0 | - |
| 25.9167 | 347050 | 0.0 | - |
| 25.9204 | 347100 | 0.0 | - |
| 25.9241 | 347150 | 0.0 | - |
| 25.9279 | 347200 | 0.0 | - |
| 25.9316 | 347250 | 0.0 | - |
| 25.9353 | 347300 | 0.0 | - |
| 25.9391 | 347350 | 0.0001 | - |
| 25.9428 | 347400 | 0.0 | - |
| 25.9465 | 347450 | 0.0 | - |
| 25.9503 | 347500 | 0.0 | - |
| 25.9540 | 347550 | 0.0 | - |
| 25.9577 | 347600 | 0.0 | - |
| 25.9615 | 347650 | 0.0002 | - |
| 25.9652 | 347700 | 0.0 | - |
| 25.9689 | 347750 | 0.0 | - |
| 25.9727 | 347800 | 0.0 | - |
| 25.9764 | 347850 | 0.0 | - |
| 25.9801 | 347900 | 0.0 | - |
| 25.9839 | 347950 | 0.0 | - |
| 25.9876 | 348000 | 0.0 | - |
| 25.9913 | 348050 | 0.0 | - |
| 25.9951 | 348100 | 0.0002 | - |
| 25.9988 | 348150 | 0.0 | - |
| 26.0025 | 348200 | 0.0 | - |
| 26.0063 | 348250 | 0.0 | - |
| 26.0100 | 348300 | 0.0002 | - |
| 26.0137 | 348350 | 0.0002 | - |
| 26.0175 | 348400 | 0.0 | - |
| 26.0212 | 348450 | 0.0002 | - |
| 26.0249 | 348500 | 0.0003 | - |
| 26.0287 | 348550 | 0.0001 | - |
| 26.0324 | 348600 | 0.0002 | - |
| 26.0361 | 348650 | 0.0 | - |
| 26.0399 | 348700 | 0.0002 | - |
| 26.0436 | 348750 | 0.0 | - |
| 26.0473 | 348800 | 0.0 | - |
| 26.0511 | 348850 | 0.0 | - |
| 26.0548 | 348900 | 0.0 | - |
| 26.0585 | 348950 | 0.0002 | - |
| 26.0623 | 349000 | 0.0002 | - |
| 26.0660 | 349050 | 0.0002 | - |
| 26.0697 | 349100 | 0.0 | - |
| 26.0735 | 349150 | 0.0003 | - |
| 26.0772 | 349200 | 0.0 | - |
| 26.0809 | 349250 | 0.0 | - |
| 26.0847 | 349300 | 0.0 | - |
| 26.0884 | 349350 | 0.0 | - |
| 26.0922 | 349400 | 0.0002 | - |
| 26.0959 | 349450 | 0.0 | - |
| 26.0996 | 349500 | 0.0002 | - |
| 26.1034 | 349550 | 0.0002 | - |
| 26.1071 | 349600 | 0.0002 | - |
| 26.1108 | 349650 | 0.0 | - |
| 26.1146 | 349700 | 0.0002 | - |
| 26.1183 | 349750 | 0.0002 | - |
| 26.1220 | 349800 | 0.0002 | - |
| 26.1258 | 349850 | 0.0 | - |
| 26.1295 | 349900 | 0.0 | - |
| 26.1332 | 349950 | 0.0002 | - |
| 26.1370 | 350000 | 0.0002 | - |
| 26.1407 | 350050 | 0.0 | - |
| 26.1444 | 350100 | 0.0 | - |
| 26.1482 | 350150 | 0.0002 | - |
| 26.1519 | 350200 | 0.0 | - |
| 26.1556 | 350250 | 0.0 | - |
| 26.1594 | 350300 | 0.0 | - |
| 26.1631 | 350350 | 0.0 | - |
| 26.1668 | 350400 | 0.0002 | - |
| 26.1706 | 350450 | 0.0002 | - |
| 26.1743 | 350500 | 0.0 | - |
| 26.1780 | 350550 | 0.0002 | - |
| 26.1818 | 350600 | 0.0002 | - |
| 26.1855 | 350650 | 0.0 | - |
| 26.1892 | 350700 | 0.0 | - |
| 26.1930 | 350750 | 0.0 | - |
| 26.1967 | 350800 | 0.0 | - |
| 26.2004 | 350850 | 0.0 | - |
| 26.2042 | 350900 | 0.0003 | - |
| 26.2079 | 350950 | 0.0 | - |
| 26.2116 | 351000 | 0.0 | - |
| 26.2154 | 351050 | 0.0 | - |
| 26.2191 | 351100 | 0.0 | - |
| 26.2228 | 351150 | 0.0 | - |
| 26.2266 | 351200 | 0.0 | - |
| 26.2303 | 351250 | 0.0002 | - |
| 26.2340 | 351300 | 0.0 | - |
| 26.2378 | 351350 | 0.0 | - |
| 26.2415 | 351400 | 0.0003 | - |
| 26.2452 | 351450 | 0.0005 | - |
| 26.2490 | 351500 | 0.0002 | - |
| 26.2527 | 351550 | 0.0002 | - |
| 26.2564 | 351600 | 0.0001 | - |
| 26.2602 | 351650 | 0.0 | - |
| 26.2639 | 351700 | 0.0001 | - |
| 26.2676 | 351750 | 0.0002 | - |
| 26.2714 | 351800 | 0.0 | - |
| 26.2751 | 351850 | 0.0 | - |
| 26.2788 | 351900 | 0.0002 | - |
| 26.2826 | 351950 | 0.0002 | - |
| 26.2863 | 352000 | 0.0 | - |
| 26.2900 | 352050 | 0.0002 | - |
| 26.2938 | 352100 | 0.0 | - |
| 26.2975 | 352150 | 0.0001 | - |
| 26.3012 | 352200 | 0.0003 | - |
| 26.3050 | 352250 | 0.0 | - |
| 26.3087 | 352300 | 0.0 | - |
| 26.3124 | 352350 | 0.0002 | - |
| 26.3162 | 352400 | 0.0 | - |
| 26.3199 | 352450 | 0.0 | - |
| 26.3237 | 352500 | 0.0 | - |
| 26.3274 | 352550 | 0.0002 | - |
| 26.3311 | 352600 | 0.0002 | - |
| 26.3349 | 352650 | 0.0002 | - |
| 26.3386 | 352700 | 0.0 | - |
| 26.3423 | 352750 | 0.0002 | - |
| 26.3461 | 352800 | 0.0 | - |
| 26.3498 | 352850 | 0.0 | - |
| 26.3535 | 352900 | 0.0 | - |
| 26.3573 | 352950 | 0.0 | - |
| 26.3610 | 353000 | 0.0002 | - |
| 26.3647 | 353050 | 0.0 | - |
| 26.3685 | 353100 | 0.0 | - |
| 26.3722 | 353150 | 0.0004 | - |
| 26.3759 | 353200 | 0.0 | - |
| 26.3797 | 353250 | 0.0003 | - |
| 26.3834 | 353300 | 0.0002 | - |
| 26.3871 | 353350 | 0.0 | - |
| 26.3909 | 353400 | 0.0001 | - |
| 26.3946 | 353450 | 0.0 | - |
| 26.3983 | 353500 | 0.0 | - |
| 26.4021 | 353550 | 0.0 | - |
| 26.4058 | 353600 | 0.0 | - |
| 26.4095 | 353650 | 0.0002 | - |
| 26.4133 | 353700 | 0.0002 | - |
| 26.4170 | 353750 | 0.0 | - |
| 26.4207 | 353800 | 0.0002 | - |
| 26.4245 | 353850 | 0.0 | - |
| 26.4282 | 353900 | 0.0 | - |
| 26.4319 | 353950 | 0.0 | - |
| 26.4357 | 354000 | 0.0002 | - |
| 26.4394 | 354050 | 0.0002 | - |
| 26.4431 | 354100 | 0.0001 | - |
| 26.4469 | 354150 | 0.0 | - |
| 26.4506 | 354200 | 0.0006 | - |
| 26.4543 | 354250 | 0.0003 | - |
| 26.4581 | 354300 | 0.0002 | - |
| 26.4618 | 354350 | 0.0 | - |
| 26.4655 | 354400 | 0.0 | - |
| 26.4693 | 354450 | 0.0 | - |
| 26.4730 | 354500 | 0.0 | - |
| 26.4767 | 354550 | 0.0003 | - |
| 26.4805 | 354600 | 0.0002 | - |
| 26.4842 | 354650 | 0.0004 | - |
| 26.4879 | 354700 | 0.0 | - |
| 26.4917 | 354750 | 0.0 | - |
| 26.4954 | 354800 | 0.0002 | - |
| 26.4991 | 354850 | 0.0004 | - |
| 26.5029 | 354900 | 0.0 | - |
| 26.5066 | 354950 | 0.0 | - |
| 26.5103 | 355000 | 0.0 | - |
| 26.5141 | 355050 | 0.0 | - |
| 26.5178 | 355100 | 0.0 | - |
| 26.5215 | 355150 | 0.0001 | - |
| 26.5253 | 355200 | 0.0002 | - |
| 26.5290 | 355250 | 0.0001 | - |
| 26.5327 | 355300 | 0.0001 | - |
| 26.5365 | 355350 | 0.0 | - |
| 26.5402 | 355400 | 0.0 | - |
| 26.5439 | 355450 | 0.0 | - |
| 26.5477 | 355500 | 0.0002 | - |
| 26.5514 | 355550 | 0.0 | - |
| 26.5551 | 355600 | 0.0 | - |
| 26.5589 | 355650 | 0.0002 | - |
| 26.5626 | 355700 | 0.0 | - |
| 26.5664 | 355750 | 0.0002 | - |
| 26.5701 | 355800 | 0.0002 | - |
| 26.5738 | 355850 | 0.0002 | - |
| 26.5776 | 355900 | 0.0 | - |
| 26.5813 | 355950 | 0.0 | - |
| 26.5850 | 356000 | 0.0 | - |
| 26.5888 | 356050 | 0.0 | - |
| 26.5925 | 356100 | 0.0 | - |
| 26.5962 | 356150 | 0.0002 | - |
| 26.6000 | 356200 | 0.0001 | - |
| 26.6037 | 356250 | 0.0 | - |
| 26.6074 | 356300 | 0.0 | - |
| 26.6112 | 356350 | 0.0002 | - |
| 26.6149 | 356400 | 0.0 | - |
| 26.6186 | 356450 | 0.0 | - |
| 26.6224 | 356500 | 0.0 | - |
| 26.6261 | 356550 | 0.0002 | - |
| 26.6298 | 356600 | 0.0002 | - |
| 26.6336 | 356650 | 0.0 | - |
| 26.6373 | 356700 | 0.0 | - |
| 26.6410 | 356750 | 0.0 | - |
| 26.6448 | 356800 | 0.0001 | - |
| 26.6485 | 356850 | 0.0 | - |
| 26.6522 | 356900 | 0.0 | - |
| 26.6560 | 356950 | 0.0002 | - |
| 26.6597 | 357000 | 0.0 | - |
| 26.6634 | 357050 | 0.0 | - |
| 26.6672 | 357100 | 0.0 | - |
| 26.6709 | 357150 | 0.0 | - |
| 26.6746 | 357200 | 0.0 | - |
| 26.6784 | 357250 | 0.0 | - |
| 26.6821 | 357300 | 0.0001 | - |
| 26.6858 | 357350 | 0.0 | - |
| 26.6896 | 357400 | 0.0 | - |
| 26.6933 | 357450 | 0.0 | - |
| 26.6970 | 357500 | 0.0 | - |
| 26.7008 | 357550 | 0.0 | - |
| 26.7045 | 357600 | 0.0 | - |
| 26.7082 | 357650 | 0.0002 | - |
| 26.7120 | 357700 | 0.0002 | - |
| 26.7157 | 357750 | 0.0002 | - |
| 26.7194 | 357800 | 0.0003 | - |
| 26.7232 | 357850 | 0.0 | - |
| 26.7269 | 357900 | 0.0 | - |
| 26.7306 | 357950 | 0.0 | - |
| 26.7344 | 358000 | 0.0 | - |
| 26.7381 | 358050 | 0.0 | - |
| 26.7418 | 358100 | 0.0 | - |
| 26.7456 | 358150 | 0.0 | - |
| 26.7493 | 358200 | 0.0 | - |
| 26.7530 | 358250 | 0.0 | - |
| 26.7568 | 358300 | 0.0002 | - |
| 26.7605 | 358350 | 0.0001 | - |
| 26.7642 | 358400 | 0.0001 | - |
| 26.7680 | 358450 | 0.0 | - |
| 26.7717 | 358500 | 0.0 | - |
| 26.7754 | 358550 | 0.0 | - |
| 26.7792 | 358600 | 0.0 | - |
| 26.7829 | 358650 | 0.0002 | - |
| 26.7866 | 358700 | 0.0002 | - |
| 26.7904 | 358750 | 0.0002 | - |
| 26.7941 | 358800 | 0.0 | - |
| 26.7978 | 358850 | 0.0002 | - |
| 26.8016 | 358900 | 0.0 | - |
| 26.8053 | 358950 | 0.0 | - |
| 26.8091 | 359000 | 0.0001 | - |
| 26.8128 | 359050 | 0.0002 | - |
| 26.8165 | 359100 | 0.0002 | - |
| 26.8203 | 359150 | 0.0 | - |
| 26.8240 | 359200 | 0.0 | - |
| 26.8277 | 359250 | 0.0002 | - |
| 26.8315 | 359300 | 0.0 | - |
| 26.8352 | 359350 | 0.0 | - |
| 26.8389 | 359400 | 0.0 | - |
| 26.8427 | 359450 | 0.0002 | - |
| 26.8464 | 359500 | 0.0002 | - |
| 26.8501 | 359550 | 0.0001 | - |
| 26.8539 | 359600 | 0.0 | - |
| 26.8576 | 359650 | 0.0 | - |
| 26.8613 | 359700 | 0.0001 | - |
| 26.8651 | 359750 | 0.0 | - |
| 26.8688 | 359800 | 0.0 | - |
| 26.8725 | 359850 | 0.0002 | - |
| 26.8763 | 359900 | 0.0 | - |
| 26.8800 | 359950 | 0.0 | - |
| 26.8837 | 360000 | 0.0002 | - |
| 26.8875 | 360050 | 0.0 | - |
| 26.8912 | 360100 | 0.0 | - |
| 26.8949 | 360150 | 0.0 | - |
| 26.8987 | 360200 | 0.0002 | - |
| 26.9024 | 360250 | 0.0001 | - |
| 26.9061 | 360300 | 0.0 | - |
| 26.9099 | 360350 | 0.0 | - |
| 26.9136 | 360400 | 0.0 | - |
| 26.9173 | 360450 | 0.0 | - |
| 26.9211 | 360500 | 0.0 | - |
| 26.9248 | 360550 | 0.0 | - |
| 26.9285 | 360600 | 0.0 | - |
| 26.9323 | 360650 | 0.0 | - |
| 26.9360 | 360700 | 0.0 | - |
| 26.9397 | 360750 | 0.0002 | - |
| 26.9435 | 360800 | 0.0 | - |
| 26.9472 | 360850 | 0.0 | - |
| 26.9509 | 360900 | 0.0 | - |
| 26.9547 | 360950 | 0.0 | - |
| 26.9584 | 361000 | 0.0 | - |
| 26.9621 | 361050 | 0.0 | - |
| 26.9659 | 361100 | 0.0002 | - |
| 26.9696 | 361150 | 0.0 | - |
| 26.9733 | 361200 | 0.0 | - |
| 26.9771 | 361250 | 0.0 | - |
| 26.9808 | 361300 | 0.0 | - |
| 26.9845 | 361350 | 0.0002 | - |
| 26.9883 | 361400 | 0.0 | - |
| 26.9920 | 361450 | 0.0002 | - |
| 26.9957 | 361500 | 0.0 | - |
| 26.9995 | 361550 | 0.0 | - |
| 27.0032 | 361600 | 0.0002 | - |
| 27.0069 | 361650 | 0.0 | - |
| 27.0107 | 361700 | 0.0 | - |
| 27.0144 | 361750 | 0.0 | - |
| 27.0181 | 361800 | 0.0002 | - |
| 27.0219 | 361850 | 0.0 | - |
| 27.0256 | 361900 | 0.0 | - |
| 27.0293 | 361950 | 0.0 | - |
| 27.0331 | 362000 | 0.0 | - |
| 27.0368 | 362050 | 0.0 | - |
| 27.0405 | 362100 | 0.0 | - |
| 27.0443 | 362150 | 0.0 | - |
| 27.0480 | 362200 | 0.0003 | - |
| 27.0518 | 362250 | 0.0 | - |
| 27.0555 | 362300 | 0.0 | - |
| 27.0592 | 362350 | 0.0 | - |
| 27.0630 | 362400 | 0.0 | - |
| 27.0667 | 362450 | 0.0002 | - |
| 27.0704 | 362500 | 0.0 | - |
| 27.0742 | 362550 | 0.0 | - |
| 27.0779 | 362600 | 0.0001 | - |
| 27.0816 | 362650 | 0.0001 | - |
| 27.0854 | 362700 | 0.0 | - |
| 27.0891 | 362750 | 0.0 | - |
| 27.0928 | 362800 | 0.0 | - |
| 27.0966 | 362850 | 0.0 | - |
| 27.1003 | 362900 | 0.0 | - |
| 27.1040 | 362950 | 0.0 | - |
| 27.1078 | 363000 | 0.0 | - |
| 27.1115 | 363050 | 0.0001 | - |
| 27.1152 | 363100 | 0.0002 | - |
| 27.1190 | 363150 | 0.0 | - |
| 27.1227 | 363200 | 0.0 | - |
| 27.1264 | 363250 | 0.0 | - |
| 27.1302 | 363300 | 0.0 | - |
| 27.1339 | 363350 | 0.0002 | - |
| 27.1376 | 363400 | 0.0 | - |
| 27.1414 | 363450 | 0.0 | - |
| 27.1451 | 363500 | 0.0 | - |
| 27.1488 | 363550 | 0.0002 | - |
| 27.1526 | 363600 | 0.0 | - |
| 27.1563 | 363650 | 0.0002 | - |
| 27.1600 | 363700 | 0.0 | - |
| 27.1638 | 363750 | 0.0 | - |
| 27.1675 | 363800 | 0.0002 | - |
| 27.1712 | 363850 | 0.0 | - |
| 27.1750 | 363900 | 0.0002 | - |
| 27.1787 | 363950 | 0.0 | - |
| 27.1824 | 364000 | 0.0 | - |
| 27.1862 | 364050 | 0.0003 | - |
| 27.1899 | 364100 | 0.0 | - |
| 27.1936 | 364150 | 0.0 | - |
| 27.1974 | 364200 | 0.0 | - |
| 27.2011 | 364250 | 0.0 | - |
| 27.2048 | 364300 | 0.0 | - |
| 27.2086 | 364350 | 0.0 | - |
| 27.2123 | 364400 | 0.0 | - |
| 27.2160 | 364450 | 0.0 | - |
| 27.2198 | 364500 | 0.0 | - |
| 27.2235 | 364550 | 0.0 | - |
| 27.2272 | 364600 | 0.0 | - |
| 27.2310 | 364650 | 0.0 | - |
| 27.2347 | 364700 | 0.0 | - |
| 27.2384 | 364750 | 0.0002 | - |
| 27.2422 | 364800 | 0.0002 | - |
| 27.2459 | 364850 | 0.0002 | - |
| 27.2496 | 364900 | 0.0 | - |
| 27.2534 | 364950 | 0.0002 | - |
| 27.2571 | 365000 | 0.0 | - |
| 27.2608 | 365050 | 0.0 | - |
| 27.2646 | 365100 | 0.0 | - |
| 27.2683 | 365150 | 0.0 | - |
| 27.2720 | 365200 | 0.0 | - |
| 27.2758 | 365250 | 0.0 | - |
| 27.2795 | 365300 | 0.0 | - |
| 27.2832 | 365350 | 0.0 | - |
| 27.2870 | 365400 | 0.0002 | - |
| 27.2907 | 365450 | 0.0001 | - |
| 27.2945 | 365500 | 0.0 | - |
| 27.2982 | 365550 | 0.0 | - |
| 27.3019 | 365600 | 0.0 | - |
| 27.3057 | 365650 | 0.0 | - |
| 27.3094 | 365700 | 0.0 | - |
| 27.3131 | 365750 | 0.0 | - |
| 27.3169 | 365800 | 0.0002 | - |
| 27.3206 | 365850 | 0.0002 | - |
| 27.3243 | 365900 | 0.0 | - |
| 27.3281 | 365950 | 0.0 | - |
| 27.3318 | 366000 | 0.0 | - |
| 27.3355 | 366050 | 0.0 | - |
| 27.3393 | 366100 | 0.0 | - |
| 27.3430 | 366150 | 0.0001 | - |
| 27.3467 | 366200 | 0.0 | - |
| 27.3505 | 366250 | 0.0002 | - |
| 27.3542 | 366300 | 0.0002 | - |
| 27.3579 | 366350 | 0.0 | - |
| 27.3617 | 366400 | 0.0002 | - |
| 27.3654 | 366450 | 0.0 | - |
| 27.3691 | 366500 | 0.0002 | - |
| 27.3729 | 366550 | 0.0002 | - |
| 27.3766 | 366600 | 0.0 | - |
| 27.3803 | 366650 | 0.0001 | - |
| 27.3841 | 366700 | 0.0 | - |
| 27.3878 | 366750 | 0.0002 | - |
| 27.3915 | 366800 | 0.0002 | - |
| 27.3953 | 366850 | 0.0 | - |
| 27.3990 | 366900 | 0.0002 | - |
| 27.4027 | 366950 | 0.0 | - |
| 27.4065 | 367000 | 0.0 | - |
| 27.4102 | 367050 | 0.0 | - |
| 27.4139 | 367100 | 0.0 | - |
| 27.4177 | 367150 | 0.0 | - |
| 27.4214 | 367200 | 0.0 | - |
| 27.4251 | 367250 | 0.0002 | - |
| 27.4289 | 367300 | 0.0 | - |
| 27.4326 | 367350 | 0.0002 | - |
| 27.4363 | 367400 | 0.0 | - |
| 27.4401 | 367450 | 0.0001 | - |
| 27.4438 | 367500 | 0.0 | - |
| 27.4475 | 367550 | 0.0 | - |
| 27.4513 | 367600 | 0.0 | - |
| 27.4550 | 367650 | 0.0 | - |
| 27.4587 | 367700 | 0.0 | - |
| 27.4625 | 367750 | 0.0 | - |
| 27.4662 | 367800 | 0.0 | - |
| 27.4699 | 367850 | 0.0 | - |
| 27.4737 | 367900 | 0.0002 | - |
| 27.4774 | 367950 | 0.0 | - |
| 27.4811 | 368000 | 0.0 | - |
| 27.4849 | 368050 | 0.0 | - |
| 27.4886 | 368100 | 0.0002 | - |
| 27.4923 | 368150 | 0.0002 | - |
| 27.4961 | 368200 | 0.0 | - |
| 27.4998 | 368250 | 0.0003 | - |
| 27.5035 | 368300 | 0.0 | - |
| 27.5073 | 368350 | 0.0002 | - |
| 27.5110 | 368400 | 0.0003 | - |
| 27.5147 | 368450 | 0.0 | - |
| 27.5185 | 368500 | 0.0 | - |
| 27.5222 | 368550 | 0.0 | - |
| 27.5260 | 368600 | 0.0 | - |
| 27.5297 | 368650 | 0.0 | - |
| 27.5334 | 368700 | 0.0 | - |
| 27.5372 | 368750 | 0.0003 | - |
| 27.5409 | 368800 | 0.0 | - |
| 27.5446 | 368850 | 0.0002 | - |
| 27.5484 | 368900 | 0.0 | - |
| 27.5521 | 368950 | 0.0 | - |
| 27.5558 | 369000 | 0.0 | - |
| 27.5596 | 369050 | 0.0 | - |
| 27.5633 | 369100 | 0.0002 | - |
| 27.5670 | 369150 | 0.0 | - |
| 27.5708 | 369200 | 0.0 | - |
| 27.5745 | 369250 | 0.0 | - |
| 27.5782 | 369300 | 0.0 | - |
| 27.5820 | 369350 | 0.0 | - |
| 27.5857 | 369400 | 0.0 | - |
| 27.5894 | 369450 | 0.0 | - |
| 27.5932 | 369500 | 0.0 | - |
| 27.5969 | 369550 | 0.0001 | - |
| 27.6006 | 369600 | 0.0005 | - |
| 27.6044 | 369650 | 0.0 | - |
| 27.6081 | 369700 | 0.0 | - |
| 27.6118 | 369750 | 0.0 | - |
| 27.6156 | 369800 | 0.0 | - |
| 27.6193 | 369850 | 0.0 | - |
| 27.6230 | 369900 | 0.0 | - |
| 27.6268 | 369950 | 0.0 | - |
| 27.6305 | 370000 | 0.0 | - |
| 27.6342 | 370050 | 0.0 | - |
| 27.6380 | 370100 | 0.0 | - |
| 27.6417 | 370150 | 0.0 | - |
| 27.6454 | 370200 | 0.0 | - |
| 27.6492 | 370250 | 0.0001 | - |
| 27.6529 | 370300 | 0.0 | - |
| 27.6566 | 370350 | 0.0 | - |
| 27.6604 | 370400 | 0.0002 | - |
| 27.6641 | 370450 | 0.0 | - |
| 27.6678 | 370500 | 0.0002 | - |
| 27.6716 | 370550 | 0.0001 | - |
| 27.6753 | 370600 | 0.0 | - |
| 27.6790 | 370650 | 0.0 | - |
| 27.6828 | 370700 | 0.0 | - |
| 27.6865 | 370750 | 0.0 | - |
| 27.6902 | 370800 | 0.0 | - |
| 27.6940 | 370850 | 0.0 | - |
| 27.6977 | 370900 | 0.0002 | - |
| 27.7014 | 370950 | 0.0 | - |
| 27.7052 | 371000 | 0.0002 | - |
| 27.7089 | 371050 | 0.0 | - |
| 27.7126 | 371100 | 0.0002 | - |
| 27.7164 | 371150 | 0.0 | - |
| 27.7201 | 371200 | 0.0 | - |
| 27.7238 | 371250 | 0.0 | - |
| 27.7276 | 371300 | 0.0002 | - |
| 27.7313 | 371350 | 0.0002 | - |
| 27.7350 | 371400 | 0.0001 | - |
| 27.7388 | 371450 | 0.0 | - |
| 27.7425 | 371500 | 0.0 | - |
| 27.7462 | 371550 | 0.0 | - |
| 27.7500 | 371600 | 0.0 | - |
| 27.7537 | 371650 | 0.0 | - |
| 27.7574 | 371700 | 0.0 | - |
| 27.7612 | 371750 | 0.0 | - |
| 27.7649 | 371800 | 0.0 | - |
| 27.7687 | 371850 | 0.0 | - |
| 27.7724 | 371900 | 0.0 | - |
| 27.7761 | 371950 | 0.0 | - |
| 27.7799 | 372000 | 0.0 | - |
| 27.7836 | 372050 | 0.0002 | - |
| 27.7873 | 372100 | 0.0002 | - |
| 27.7911 | 372150 | 0.0 | - |
| 27.7948 | 372200 | 0.0 | - |
| 27.7985 | 372250 | 0.0002 | - |
| 27.8023 | 372300 | 0.0 | - |
| 27.8060 | 372350 | 0.0 | - |
| 27.8097 | 372400 | 0.0 | - |
| 27.8135 | 372450 | 0.0 | - |
| 27.8172 | 372500 | 0.0002 | - |
| 27.8209 | 372550 | 0.0 | - |
| 27.8247 | 372600 | 0.0 | - |
| 27.8284 | 372650 | 0.0 | - |
| 27.8321 | 372700 | 0.0 | - |
| 27.8359 | 372750 | 0.0 | - |
| 27.8396 | 372800 | 0.0 | - |
| 27.8433 | 372850 | 0.0002 | - |
| 27.8471 | 372900 | 0.0 | - |
| 27.8508 | 372950 | 0.0 | - |
| 27.8545 | 373000 | 0.0 | - |
| 27.8583 | 373050 | 0.0002 | - |
| 27.8620 | 373100 | 0.0 | - |
| 27.8657 | 373150 | 0.0001 | - |
| 27.8695 | 373200 | 0.0001 | - |
| 27.8732 | 373250 | 0.0 | - |
| 27.8769 | 373300 | 0.0002 | - |
| 27.8807 | 373350 | 0.0 | - |
| 27.8844 | 373400 | 0.0 | - |
| 27.8881 | 373450 | 0.0 | - |
| 27.8919 | 373500 | 0.0002 | - |
| 27.8956 | 373550 | 0.0 | - |
| 27.8993 | 373600 | 0.0 | - |
| 27.9031 | 373650 | 0.0002 | - |
| 27.9068 | 373700 | 0.0 | - |
| 27.9105 | 373750 | 0.0 | - |
| 27.9143 | 373800 | 0.0 | - |
| 27.9180 | 373850 | 0.0 | - |
| 27.9217 | 373900 | 0.0002 | - |
| 27.9255 | 373950 | 0.0 | - |
| 27.9292 | 374000 | 0.0 | - |
| 27.9329 | 374050 | 0.0 | - |
| 27.9367 | 374100 | 0.0 | - |
| 27.9404 | 374150 | 0.0003 | - |
| 27.9441 | 374200 | 0.0 | - |
| 27.9479 | 374250 | 0.0 | - |
| 27.9516 | 374300 | 0.0 | - |
| 27.9553 | 374350 | 0.0002 | - |
| 27.9591 | 374400 | 0.0002 | - |
| 27.9628 | 374450 | 0.0 | - |
| 27.9665 | 374500 | 0.0 | - |
| 27.9703 | 374550 | 0.0 | - |
| 27.9740 | 374600 | 0.0 | - |
| 27.9777 | 374650 | 0.0001 | - |
| 27.9815 | 374700 | 0.0 | - |
| 27.9852 | 374750 | 0.0 | - |
| 27.9889 | 374800 | 0.0 | - |
| 27.9927 | 374850 | 0.0001 | - |
| 27.9964 | 374900 | 0.0 | - |
| 28.0001 | 374950 | 0.0 | - |
| 28.0039 | 375000 | 0.0 | - |
| 28.0076 | 375050 | 0.0002 | - |
| 28.0114 | 375100 | 0.0002 | - |
| 28.0151 | 375150 | 0.0001 | - |
| 28.0188 | 375200 | 0.0 | - |
| 28.0226 | 375250 | 0.0002 | - |
| 28.0263 | 375300 | 0.0002 | - |
| 28.0300 | 375350 | 0.0 | - |
| 28.0338 | 375400 | 0.0 | - |
| 28.0375 | 375450 | 0.0 | - |
| 28.0412 | 375500 | 0.0 | - |
| 28.0450 | 375550 | 0.0 | - |
| 28.0487 | 375600 | 0.0 | - |
| 28.0524 | 375650 | 0.0001 | - |
| 28.0562 | 375700 | 0.0 | - |
| 28.0599 | 375750 | 0.0 | - |
| 28.0636 | 375800 | 0.0002 | - |
| 28.0674 | 375850 | 0.0 | - |
| 28.0711 | 375900 | 0.0 | - |
| 28.0748 | 375950 | 0.0 | - |
| 28.0786 | 376000 | 0.0 | - |
| 28.0823 | 376050 | 0.0 | - |
| 28.0860 | 376100 | 0.0 | - |
| 28.0898 | 376150 | 0.0 | - |
| 28.0935 | 376200 | 0.0 | - |
| 28.0972 | 376250 | 0.0 | - |
| 28.1010 | 376300 | 0.0002 | - |
| 28.1047 | 376350 | 0.0002 | - |
| 28.1084 | 376400 | 0.0 | - |
| 28.1122 | 376450 | 0.0 | - |
| 28.1159 | 376500 | 0.0 | - |
| 28.1196 | 376550 | 0.0 | - |
| 28.1234 | 376600 | 0.0 | - |
| 28.1271 | 376650 | 0.0 | - |
| 28.1308 | 376700 | 0.0 | - |
| 28.1346 | 376750 | 0.0 | - |
| 28.1383 | 376800 | 0.0 | - |
| 28.1420 | 376850 | 0.0002 | - |
| 28.1458 | 376900 | 0.0 | - |
| 28.1495 | 376950 | 0.0 | - |
| 28.1532 | 377000 | 0.0 | - |
| 28.1570 | 377050 | 0.0 | - |
| 28.1607 | 377100 | 0.0 | - |
| 28.1644 | 377150 | 0.0002 | - |
| 28.1682 | 377200 | 0.0 | - |
| 28.1719 | 377250 | 0.0 | - |
| 28.1756 | 377300 | 0.0 | - |
| 28.1794 | 377350 | 0.0 | - |
| 28.1831 | 377400 | 0.0 | - |
| 28.1868 | 377450 | 0.0 | - |
| 28.1906 | 377500 | 0.0 | - |
| 28.1943 | 377550 | 0.0 | - |
| 28.1980 | 377600 | 0.0 | - |
| 28.2018 | 377650 | 0.0 | - |
| 28.2055 | 377700 | 0.0002 | - |
| 28.2092 | 377750 | 0.0 | - |
| 28.2130 | 377800 | 0.0 | - |
| 28.2167 | 377850 | 0.0 | - |
| 28.2204 | 377900 | 0.0 | - |
| 28.2242 | 377950 | 0.0002 | - |
| 28.2279 | 378000 | 0.0 | - |
| 28.2316 | 378050 | 0.0 | - |
| 28.2354 | 378100 | 0.0002 | - |
| 28.2391 | 378150 | 0.0 | - |
| 28.2428 | 378200 | 0.0 | - |
| 28.2466 | 378250 | 0.0 | - |
| 28.2503 | 378300 | 0.0002 | - |
| 28.2541 | 378350 | 0.0 | - |
| 28.2578 | 378400 | 0.0 | - |
| 28.2615 | 378450 | 0.0003 | - |
| 28.2653 | 378500 | 0.0 | - |
| 28.2690 | 378550 | 0.0002 | - |
| 28.2727 | 378600 | 0.0 | - |
| 28.2765 | 378650 | 0.0 | - |
| 28.2802 | 378700 | 0.0 | - |
| 28.2839 | 378750 | 0.0 | - |
| 28.2877 | 378800 | 0.0003 | - |
| 28.2914 | 378850 | 0.0 | - |
| 28.2951 | 378900 | 0.0002 | - |
| 28.2989 | 378950 | 0.0 | - |
| 28.3026 | 379000 | 0.0001 | - |
| 28.3063 | 379050 | 0.0 | - |
| 28.3101 | 379100 | 0.0 | - |
| 28.3138 | 379150 | 0.0 | - |
| 28.3175 | 379200 | 0.0 | - |
| 28.3213 | 379250 | 0.0 | - |
| 28.3250 | 379300 | 0.0 | - |
| 28.3287 | 379350 | 0.0002 | - |
| 28.3325 | 379400 | 0.0 | - |
| 28.3362 | 379450 | 0.0 | - |
| 28.3399 | 379500 | 0.0 | - |
| 28.3437 | 379550 | 0.0 | - |
| 28.3474 | 379600 | 0.0001 | - |
| 28.3511 | 379650 | 0.0002 | - |
| 28.3549 | 379700 | 0.0 | - |
| 28.3586 | 379750 | 0.0 | - |
| 28.3623 | 379800 | 0.0 | - |
| 28.3661 | 379850 | 0.0 | - |
| 28.3698 | 379900 | 0.0 | - |
| 28.3735 | 379950 | 0.0 | - |
| 28.3773 | 380000 | 0.0 | - |
| 28.3810 | 380050 | 0.0 | - |
| 28.3847 | 380100 | 0.0 | - |
| 28.3885 | 380150 | 0.0 | - |
| 28.3922 | 380200 | 0.0002 | - |
| 28.3959 | 380250 | 0.0 | - |
| 28.3997 | 380300 | 0.0 | - |
| 28.4034 | 380350 | 0.0 | - |
| 28.4071 | 380400 | 0.0 | - |
| 28.4109 | 380450 | 0.0 | - |
| 28.4146 | 380500 | 0.0 | - |
| 28.4183 | 380550 | 0.0 | - |
| 28.4221 | 380600 | 0.0002 | - |
| 28.4258 | 380650 | 0.0 | - |
| 28.4295 | 380700 | 0.0 | - |
| 28.4333 | 380750 | 0.0 | - |
| 28.4370 | 380800 | 0.0 | - |
| 28.4407 | 380850 | 0.0 | - |
| 28.4445 | 380900 | 0.0 | - |
| 28.4482 | 380950 | 0.0 | - |
| 28.4519 | 381000 | 0.0 | - |
| 28.4557 | 381050 | 0.0 | - |
| 28.4594 | 381100 | 0.0 | - |
| 28.4631 | 381150 | 0.0 | - |
| 28.4669 | 381200 | 0.0 | - |
| 28.4706 | 381250 | 0.0002 | - |
| 28.4743 | 381300 | 0.0 | - |
| 28.4781 | 381350 | 0.0 | - |
| 28.4818 | 381400 | 0.0 | - |
| 28.4855 | 381450 | 0.0002 | - |
| 28.4893 | 381500 | 0.0002 | - |
| 28.4930 | 381550 | 0.0 | - |
| 28.4968 | 381600 | 0.0 | - |
| 28.5005 | 381650 | 0.0 | - |
| 28.5042 | 381700 | 0.0 | - |
| 28.5080 | 381750 | 0.0 | - |
| 28.5117 | 381800 | 0.0002 | - |
| 28.5154 | 381850 | 0.0 | - |
| 28.5192 | 381900 | 0.0 | - |
| 28.5229 | 381950 | 0.0002 | - |
| 28.5266 | 382000 | 0.0 | - |
| 28.5304 | 382050 | 0.0 | - |
| 28.5341 | 382100 | 0.0 | - |
| 28.5378 | 382150 | 0.0 | - |
| 28.5416 | 382200 | 0.0 | - |
| 28.5453 | 382250 | 0.0 | - |
| 28.5490 | 382300 | 0.0 | - |
| 28.5528 | 382350 | 0.0002 | - |
| 28.5565 | 382400 | 0.0 | - |
| 28.5602 | 382450 | 0.0 | - |
| 28.5640 | 382500 | 0.0 | - |
| 28.5677 | 382550 | 0.0 | - |
| 28.5714 | 382600 | 0.0 | - |
| 28.5752 | 382650 | 0.0 | - |
| 28.5789 | 382700 | 0.0 | - |
| 28.5826 | 382750 | 0.0 | - |
| 28.5864 | 382800 | 0.0002 | - |
| 28.5901 | 382850 | 0.0002 | - |
| 28.5938 | 382900 | 0.0 | - |
| 28.5976 | 382950 | 0.0001 | - |
| 28.6013 | 383000 | 0.0 | - |
| 28.6050 | 383050 | 0.0 | - |
| 28.6088 | 383100 | 0.0 | - |
| 28.6125 | 383150 | 0.0 | - |
| 28.6162 | 383200 | 0.0 | - |
| 28.6200 | 383250 | 0.0 | - |
| 28.6237 | 383300 | 0.0002 | - |
| 28.6274 | 383350 | 0.0 | - |
| 28.6312 | 383400 | 0.0 | - |
| 28.6349 | 383450 | 0.0 | - |
| 28.6386 | 383500 | 0.0 | - |
| 28.6424 | 383550 | 0.0 | - |
| 28.6461 | 383600 | 0.0 | - |
| 28.6498 | 383650 | 0.0002 | - |
| 28.6536 | 383700 | 0.0 | - |
| 28.6573 | 383750 | 0.0001 | - |
| 28.6610 | 383800 | 0.0002 | - |
| 28.6648 | 383850 | 0.0 | - |
| 28.6685 | 383900 | 0.0002 | - |
| 28.6722 | 383950 | 0.0 | - |
| 28.6760 | 384000 | 0.0 | - |
| 28.6797 | 384050 | 0.0 | - |
| 28.6834 | 384100 | 0.0 | - |
| 28.6872 | 384150 | 0.0 | - |
| 28.6909 | 384200 | 0.0 | - |
| 28.6946 | 384250 | 0.0 | - |
| 28.6984 | 384300 | 0.0 | - |
| 28.7021 | 384350 | 0.0 | - |
| 28.7058 | 384400 | 0.0 | - |
| 28.7096 | 384450 | 0.0001 | - |
| 28.7133 | 384500 | 0.0 | - |
| 28.7170 | 384550 | 0.0 | - |
| 28.7208 | 384600 | 0.0002 | - |
| 28.7245 | 384650 | 0.0 | - |
| 28.7283 | 384700 | 0.0 | - |
| 28.7320 | 384750 | 0.0 | - |
| 28.7357 | 384800 | 0.0 | - |
| 28.7395 | 384850 | 0.0 | - |
| 28.7432 | 384900 | 0.0 | - |
| 28.7469 | 384950 | 0.0002 | - |
| 28.7507 | 385000 | 0.0 | - |
| 28.7544 | 385050 | 0.0001 | - |
| 28.7581 | 385100 | 0.0 | - |
| 28.7619 | 385150 | 0.0 | - |
| 28.7656 | 385200 | 0.0 | - |
| 28.7693 | 385250 | 0.0 | - |
| 28.7731 | 385300 | 0.0 | - |
| 28.7768 | 385350 | 0.0 | - |
| 28.7805 | 385400 | 0.0 | - |
| 28.7843 | 385450 | 0.0001 | - |
| 28.7880 | 385500 | 0.0 | - |
| 28.7917 | 385550 | 0.0005 | - |
| 28.7955 | 385600 | 0.0 | - |
| 28.7992 | 385650 | 0.0 | - |
| 28.8029 | 385700 | 0.0002 | - |
| 28.8067 | 385750 | 0.0 | - |
| 28.8104 | 385800 | 0.0 | - |
| 28.8141 | 385850 | 0.0 | - |
| 28.8179 | 385900 | 0.0 | - |
| 28.8216 | 385950 | 0.0 | - |
| 28.8253 | 386000 | 0.0002 | - |
| 28.8291 | 386050 | 0.0 | - |
| 28.8328 | 386100 | 0.0 | - |
| 28.8365 | 386150 | 0.0 | - |
| 28.8403 | 386200 | 0.0 | - |
| 28.8440 | 386250 | 0.0 | - |
| 28.8477 | 386300 | 0.0 | - |
| 28.8515 | 386350 | 0.0 | - |
| 28.8552 | 386400 | 0.0 | - |
| 28.8589 | 386450 | 0.0 | - |
| 28.8627 | 386500 | 0.0 | - |
| 28.8664 | 386550 | 0.0 | - |
| 28.8701 | 386600 | 0.0 | - |
| 28.8739 | 386650 | 0.0002 | - |
| 28.8776 | 386700 | 0.0 | - |
| 28.8813 | 386750 | 0.0 | - |
| 28.8851 | 386800 | 0.0 | - |
| 28.8888 | 386850 | 0.0 | - |
| 28.8925 | 386900 | 0.0 | - |
| 28.8963 | 386950 | 0.0002 | - |
| 28.9000 | 387000 | 0.0 | - |
| 28.9037 | 387050 | 0.0 | - |
| 28.9075 | 387100 | 0.0 | - |
| 28.9112 | 387150 | 0.0 | - |
| 28.9149 | 387200 | 0.0002 | - |
| 28.9187 | 387250 | 0.0 | - |
| 28.9224 | 387300 | 0.0 | - |
| 28.9261 | 387350 | 0.0 | - |
| 28.9299 | 387400 | 0.0002 | - |
| 28.9336 | 387450 | 0.0 | - |
| 28.9373 | 387500 | 0.0 | - |
| 28.9411 | 387550 | 0.0 | - |
| 28.9448 | 387600 | 0.0 | - |
| 28.9485 | 387650 | 0.0 | - |
| 28.9523 | 387700 | 0.0 | - |
| 28.9560 | 387750 | 0.0 | - |
| 28.9597 | 387800 | 0.0 | - |
| 28.9635 | 387850 | 0.0 | - |
| 28.9672 | 387900 | 0.0 | - |
| 28.9710 | 387950 | 0.0 | - |
| 28.9747 | 388000 | 0.0 | - |
| 28.9784 | 388050 | 0.0 | - |
| 28.9822 | 388100 | 0.0002 | - |
| 28.9859 | 388150 | 0.0 | - |
| 28.9896 | 388200 | 0.0 | - |
| 28.9934 | 388250 | 0.0002 | - |
| 28.9971 | 388300 | 0.0 | - |
| 29.0008 | 388350 | 0.0001 | - |
| 29.0046 | 388400 | 0.0 | - |
| 29.0083 | 388450 | 0.0 | - |
| 29.0120 | 388500 | 0.0 | - |
| 29.0158 | 388550 | 0.0 | - |
| 29.0195 | 388600 | 0.0 | - |
| 29.0232 | 388650 | 0.0 | - |
| 29.0270 | 388700 | 0.0002 | - |
| 29.0307 | 388750 | 0.0 | - |
| 29.0344 | 388800 | 0.0 | - |
| 29.0382 | 388850 | 0.0 | - |
| 29.0419 | 388900 | 0.0 | - |
| 29.0456 | 388950 | 0.0002 | - |
| 29.0494 | 389000 | 0.0003 | - |
| 29.0531 | 389050 | 0.0002 | - |
| 29.0568 | 389100 | 0.0 | - |
| 29.0606 | 389150 | 0.0002 | - |
| 29.0643 | 389200 | 0.0 | - |
| 29.0680 | 389250 | 0.0001 | - |
| 29.0718 | 389300 | 0.0002 | - |
| 29.0755 | 389350 | 0.0 | - |
| 29.0792 | 389400 | 0.0 | - |
| 29.0830 | 389450 | 0.0 | - |
| 29.0867 | 389500 | 0.0 | - |
| 29.0904 | 389550 | 0.0 | - |
| 29.0942 | 389600 | 0.0 | - |
| 29.0979 | 389650 | 0.0 | - |
| 29.1016 | 389700 | 0.0 | - |
| 29.1054 | 389750 | 0.0002 | - |
| 29.1091 | 389800 | 0.0 | - |
| 29.1128 | 389850 | 0.0 | - |
| 29.1166 | 389900 | 0.0 | - |
| 29.1203 | 389950 | 0.0 | - |
| 29.1240 | 390000 | 0.0 | - |
| 29.1278 | 390050 | 0.0002 | - |
| 29.1315 | 390100 | 0.0 | - |
| 29.1352 | 390150 | 0.0 | - |
| 29.1390 | 390200 | 0.0002 | - |
| 29.1427 | 390250 | 0.0 | - |
| 29.1464 | 390300 | 0.0002 | - |
| 29.1502 | 390350 | 0.0002 | - |
| 29.1539 | 390400 | 0.0 | - |
| 29.1576 | 390450 | 0.0 | - |
| 29.1614 | 390500 | 0.0 | - |
| 29.1651 | 390550 | 0.0 | - |
| 29.1688 | 390600 | 0.0 | - |
| 29.1726 | 390650 | 0.0 | - |
| 29.1763 | 390700 | 0.0 | - |
| 29.1800 | 390750 | 0.0 | - |
| 29.1838 | 390800 | 0.0 | - |
| 29.1875 | 390850 | 0.0 | - |
| 29.1912 | 390900 | 0.0 | - |
| 29.1950 | 390950 | 0.0 | - |
| 29.1987 | 391000 | 0.0 | - |
| 29.2024 | 391050 | 0.0 | - |
| 29.2062 | 391100 | 0.0 | - |
| 29.2099 | 391150 | 0.0 | - |
| 29.2137 | 391200 | 0.0 | - |
| 29.2174 | 391250 | 0.0 | - |
| 29.2211 | 391300 | 0.0 | - |
| 29.2249 | 391350 | 0.0002 | - |
| 29.2286 | 391400 | 0.0 | - |
| 29.2323 | 391450 | 0.0001 | - |
| 29.2361 | 391500 | 0.0 | - |
| 29.2398 | 391550 | 0.0 | - |
| 29.2435 | 391600 | 0.0002 | - |
| 29.2473 | 391650 | 0.0 | - |
| 29.2510 | 391700 | 0.0 | - |
| 29.2547 | 391750 | 0.0 | - |
| 29.2585 | 391800 | 0.0 | - |
| 29.2622 | 391850 | 0.0 | - |
| 29.2659 | 391900 | 0.0 | - |
| 29.2697 | 391950 | 0.0 | - |
| 29.2734 | 392000 | 0.0002 | - |
| 29.2771 | 392050 | 0.0 | - |
| 29.2809 | 392100 | 0.0 | - |
| 29.2846 | 392150 | 0.0 | - |
| 29.2883 | 392200 | 0.0 | - |
| 29.2921 | 392250 | 0.0 | - |
| 29.2958 | 392300 | 0.0001 | - |
| 29.2995 | 392350 | 0.0 | - |
| 29.3033 | 392400 | 0.0 | - |
| 29.3070 | 392450 | 0.0 | - |
| 29.3107 | 392500 | 0.0 | - |
| 29.3145 | 392550 | 0.0002 | - |
| 29.3182 | 392600 | 0.0 | - |
| 29.3219 | 392650 | 0.0 | - |
| 29.3257 | 392700 | 0.0 | - |
| 29.3294 | 392750 | 0.0 | - |
| 29.3331 | 392800 | 0.0 | - |
| 29.3369 | 392850 | 0.0 | - |
| 29.3406 | 392900 | 0.0 | - |
| 29.3443 | 392950 | 0.0 | - |
| 29.3481 | 393000 | 0.0 | - |
| 29.3518 | 393050 | 0.0 | - |
| 29.3555 | 393100 | 0.0 | - |
| 29.3593 | 393150 | 0.0002 | - |
| 29.3630 | 393200 | 0.0 | - |
| 29.3667 | 393250 | 0.0 | - |
| 29.3705 | 393300 | 0.0 | - |
| 29.3742 | 393350 | 0.0 | - |
| 29.3779 | 393400 | 0.0002 | - |
| 29.3817 | 393450 | 0.0 | - |
| 29.3854 | 393500 | 0.0 | - |
| 29.3891 | 393550 | 0.0 | - |
| 29.3929 | 393600 | 0.0002 | - |
| 29.3966 | 393650 | 0.0 | - |
| 29.4003 | 393700 | 0.0 | - |
| 29.4041 | 393750 | 0.0002 | - |
| 29.4078 | 393800 | 0.0 | - |
| 29.4115 | 393850 | 0.0 | - |
| 29.4153 | 393900 | 0.0002 | - |
| 29.4190 | 393950 | 0.0 | - |
| 29.4227 | 394000 | 0.0 | - |
| 29.4265 | 394050 | 0.0 | - |
| 29.4302 | 394100 | 0.0002 | - |
| 29.4339 | 394150 | 0.0001 | - |
| 29.4377 | 394200 | 0.0 | - |
| 29.4414 | 394250 | 0.0 | - |
| 29.4451 | 394300 | 0.0 | - |
| 29.4489 | 394350 | 0.0 | - |
| 29.4526 | 394400 | 0.0001 | - |
| 29.4564 | 394450 | 0.0002 | - |
| 29.4601 | 394500 | 0.0 | - |
| 29.4638 | 394550 | 0.0 | - |
| 29.4676 | 394600 | 0.0 | - |
| 29.4713 | 394650 | 0.0 | - |
| 29.4750 | 394700 | 0.0 | - |
| 29.4788 | 394750 | 0.0 | - |
| 29.4825 | 394800 | 0.0002 | - |
| 29.4862 | 394850 | 0.0 | - |
| 29.4900 | 394900 | 0.0 | - |
| 29.4937 | 394950 | 0.0 | - |
| 29.4974 | 395000 | 0.0 | - |
| 29.5012 | 395050 | 0.0 | - |
| 29.5049 | 395100 | 0.0 | - |
| 29.5086 | 395150 | 0.0 | - |
| 29.5124 | 395200 | 0.0 | - |
| 29.5161 | 395250 | 0.0001 | - |
| 29.5198 | 395300 | 0.0 | - |
| 29.5236 | 395350 | 0.0 | - |
| 29.5273 | 395400 | 0.0 | - |
| 29.5310 | 395450 | 0.0 | - |
| 29.5348 | 395500 | 0.0 | - |
| 29.5385 | 395550 | 0.0002 | - |
| 29.5422 | 395600 | 0.0 | - |
| 29.5460 | 395650 | 0.0 | - |
| 29.5497 | 395700 | 0.0003 | - |
| 29.5534 | 395750 | 0.0002 | - |
| 29.5572 | 395800 | 0.0 | - |
| 29.5609 | 395850 | 0.0 | - |
| 29.5646 | 395900 | 0.0 | - |
| 29.5684 | 395950 | 0.0 | - |
| 29.5721 | 396000 | 0.0 | - |
| 29.5758 | 396050 | 0.0002 | - |
| 29.5796 | 396100 | 0.0 | - |
| 29.5833 | 396150 | 0.0 | - |
| 29.5870 | 396200 | 0.0 | - |
| 29.5908 | 396250 | 0.0002 | - |
| 29.5945 | 396300 | 0.0002 | - |
| 29.5982 | 396350 | 0.0 | - |
| 29.6020 | 396400 | 0.0 | - |
| 29.6057 | 396450 | 0.0 | - |
| 29.6094 | 396500 | 0.0002 | - |
| 29.6132 | 396550 | 0.0 | - |
| 29.6169 | 396600 | 0.0 | - |
| 29.6206 | 396650 | 0.0 | - |
| 29.6244 | 396700 | 0.0 | - |
| 29.6281 | 396750 | 0.0 | - |
| 29.6318 | 396800 | 0.0 | - |
| 29.6356 | 396850 | 0.0 | - |
| 29.6393 | 396900 | 0.0 | - |
| 29.6430 | 396950 | 0.0 | - |
| 29.6468 | 397000 | 0.0 | - |
| 29.6505 | 397050 | 0.0 | - |
| 29.6542 | 397100 | 0.0 | - |
| 29.6580 | 397150 | 0.0 | - |
| 29.6617 | 397200 | 0.0 | - |
| 29.6654 | 397250 | 0.0 | - |
| 29.6692 | 397300 | 0.0 | - |
| 29.6729 | 397350 | 0.0 | - |
| 29.6766 | 397400 | 0.0001 | - |
| 29.6804 | 397450 | 0.0 | - |
| 29.6841 | 397500 | 0.0 | - |
| 29.6879 | 397550 | 0.0 | - |
| 29.6916 | 397600 | 0.0 | - |
| 29.6953 | 397650 | 0.0 | - |
| 29.6991 | 397700 | 0.0002 | - |
| 29.7028 | 397750 | 0.0 | - |
| 29.7065 | 397800 | 0.0 | - |
| 29.7103 | 397850 | 0.0 | - |
| 29.7140 | 397900 | 0.0 | - |
| 29.7177 | 397950 | 0.0 | - |
| 29.7215 | 398000 | 0.0 | - |
| 29.7252 | 398050 | 0.0 | - |
| 29.7289 | 398100 | 0.0 | - |
| 29.7327 | 398150 | 0.0001 | - |
| 29.7364 | 398200 | 0.0002 | - |
| 29.7401 | 398250 | 0.0003 | - |
| 29.7439 | 398300 | 0.0 | - |
| 29.7476 | 398350 | 0.0 | - |
| 29.7513 | 398400 | 0.0 | - |
| 29.7551 | 398450 | 0.0001 | - |
| 29.7588 | 398500 | 0.0 | - |
| 29.7625 | 398550 | 0.0 | - |
| 29.7663 | 398600 | 0.0001 | - |
| 29.7700 | 398650 | 0.0002 | - |
| 29.7737 | 398700 | 0.0 | - |
| 29.7775 | 398750 | 0.0 | - |
| 29.7812 | 398800 | 0.0 | - |
| 29.7849 | 398850 | 0.0002 | - |
| 29.7887 | 398900 | 0.0 | - |
| 29.7924 | 398950 | 0.0 | - |
| 29.7961 | 399000 | 0.0002 | - |
| 29.7999 | 399050 | 0.0 | - |
| 29.8036 | 399100 | 0.0002 | - |
| 29.8073 | 399150 | 0.0 | - |
| 29.8111 | 399200 | 0.0 | - |
| 29.8148 | 399250 | 0.0002 | - |
| 29.8185 | 399300 | 0.0 | - |
| 29.8223 | 399350 | 0.0 | - |
| 29.8260 | 399400 | 0.0 | - |
| 29.8297 | 399450 | 0.0 | - |
| 29.8335 | 399500 | 0.0 | - |
| 29.8372 | 399550 | 0.0002 | - |
| 29.8409 | 399600 | 0.0 | - |
| 29.8447 | 399650 | 0.0 | - |
| 29.8484 | 399700 | 0.0 | - |
| 29.8521 | 399750 | 0.0002 | - |
| 29.8559 | 399800 | 0.0 | - |
| 29.8596 | 399850 | 0.0 | - |
| 29.8633 | 399900 | 0.0 | - |
| 29.8671 | 399950 | 0.0 | - |
| 29.8708 | 400000 | 0.0 | - |
| 29.8745 | 400050 | 0.0 | - |
| 29.8783 | 400100 | 0.0 | - |
| 29.8820 | 400150 | 0.0 | - |
| 29.8857 | 400200 | 0.0 | - |
| 29.8895 | 400250 | 0.0 | - |
| 29.8932 | 400300 | 0.0001 | - |
| 29.8969 | 400350 | 0.0001 | - |
| 29.9007 | 400400 | 0.0 | - |
| 29.9044 | 400450 | 0.0 | - |
| 29.9081 | 400500 | 0.0 | - |
| 29.9119 | 400550 | 0.0002 | - |
| 29.9156 | 400600 | 0.0 | - |
| 29.9193 | 400650 | 0.0 | - |
| 29.9231 | 400700 | 0.0 | - |
| 29.9268 | 400750 | 0.0 | - |
| 29.9306 | 400800 | 0.0 | - |
| 29.9343 | 400850 | 0.0 | - |
| 29.9380 | 400900 | 0.0 | - |
| 29.9418 | 400950 | 0.0 | - |
| 29.9455 | 401000 | 0.0 | - |
| 29.9492 | 401050 | 0.0 | - |
| 29.9530 | 401100 | 0.0 | - |
| 29.9567 | 401150 | 0.0 | - |
| 29.9604 | 401200 | 0.0 | - |
| 29.9642 | 401250 | 0.0001 | - |
| 29.9679 | 401300 | 0.0 | - |
| 29.9716 | 401350 | 0.0 | - |
| 29.9754 | 401400 | 0.0 | - |
| 29.9791 | 401450 | 0.0 | - |
| 29.9828 | 401500 | 0.0 | - |
| 29.9866 | 401550 | 0.0002 | - |
| 29.9903 | 401600 | 0.0 | - |
| 29.9940 | 401650 | 0.0 | - |
| 29.9978 | 401700 | 0.0002 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0
- Sentence Transformers: 3.3.1
- Transformers: 4.44.2
- PyTorch: 2.2.0a0+81ea7a4
- Datasets: 3.2.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"BEAR"
] |
pszemraj/pegasus-x-large-book-summary | pszemraj | summarization | [
"transformers",
"pytorch",
"safetensors",
"pegasus_x",
"text2text-generation",
"summarization",
"summary",
"booksum",
"long-document",
"long-form",
"dataset:kmfoda/booksum",
"base_model:google/pegasus-x-large",
"base_model:finetune:google/pegasus-x-large",
"license:apache-2.0",
"license:bsd-3-clause",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-09-16T10:55:11 | 2023-09-23T20:46:57 | 1,273 | 35 | ---
base_model: google/pegasus-x-large
datasets:
- kmfoda/booksum
license:
- apache-2.0
- bsd-3-clause
metrics:
- rouge
tags:
- summarization
- summary
- booksum
- long-document
- long-form
language: en
widget:
- text: large earthquakes along a given fault segment do not occur at random intervals
because it takes time to accumulate the strain energy for the rupture. The rates
at which tectonic plates move and accumulate strain at their boundaries are approximately
uniform. Therefore, in first approximation, one may expect that large ruptures
of the same fault segment will occur at approximately constant time intervals.
If subsequent main shocks have different amounts of slip across the fault, then
the recurrence time may vary, and the basic idea of periodic mainshocks must be
modified. For great plate boundary ruptures the length and slip often vary by
a factor of 2. Along the southern segment of the San Andreas fault the recurrence
interval is 145 years with variations of several decades. The smaller the standard
deviation of the average recurrence interval, the more specific could be the long
term prediction of a future mainshock.
example_title: earthquakes
- text: ' A typical feed-forward neural field algorithm. Spatiotemporal coordinates
are fed into a neural network that predicts values in the reconstructed domain.
Then, this domain is mapped to the sensor domain where sensor measurements are
available as supervision. Class and Section Problems Addressed Generalization
(Section 2) Inverse problems, ill-posed problems, editability; symmetries. Hybrid
Representations (Section 3) Computation & memory efficiency, representation capacity,
editability: Forward Maps (Section 4) Inverse problems Network Architecture (Section
5) Spectral bias, integration & derivatives. Manipulating Neural Fields (Section
6) Edit ability, constraints, regularization. Table 2: The five classes of techniques
in the neural field toolbox each addresses problems that arise in learning, inference,
and control. (Section 3). We can supervise reconstruction via differentiable forward
maps that transform Or project our domain (e.g, 3D reconstruction via 2D images;
Section 4) With appropriate network architecture choices, we can overcome neural
network spectral biases (blurriness) and efficiently compute derivatives and integrals
(Section 5). Finally, we can manipulate neural fields to add constraints and regularizations,
and to achieve editable representations (Section 6). Collectively, these classes
constitute a ''toolbox'' of techniques to help solve problems with neural fields
There are three components in a conditional neural field: (1) An encoder or inference
function € that outputs the conditioning latent variable 2 given an observation
0 E(0) =2. 2 is typically a low-dimensional vector, and is often referred to aS
a latent code Or feature code_ (2) A mapping function 4 between Z and neural field
parameters O: Y(z) = O; (3) The neural field itself $. The encoder € finds the
most probable z given the observations O: argmaxz P(2/0). The decoder maximizes
the inverse conditional probability to find the most probable 0 given Z: arg-
max P(Olz). We discuss different encoding schemes with different optimality guarantees
(Section 2.1.1), both global and local conditioning (Section 2.1.2), and different
mapping functions Y (Section 2.1.3) 2. Generalization Suppose we wish to estimate
a plausible 3D surface shape given a partial or noisy point cloud. We need a suitable
prior over the sur- face in its reconstruction domain to generalize to the partial
observations. A neural network expresses a prior via the function space of its
architecture and parameters 0, and generalization is influenced by the inductive
bias of this function space (Section 5).'
example_title: scientific paper
- text: 'Is a else or outside the cob and tree written being of early client rope
and you have is for good reasons. On to the ocean in Orange for time. By''s the
aggregate we can bed it yet. Why this please pick up on a sort is do and also
M Getoi''s nerocos and do rain become you to let so is his brother is made in
use and Mjulia''s''s the lay major is aging Masastup coin present sea only of
Oosii rooms set to you We do er do we easy this private oliiishs lonthen might
be okay. Good afternoon everybody. Welcome to this lecture of Computational Statistics.
As you can see, I''m not socially my name is Michael Zelinger. I''m one of the
task for this class and you might have already seen me in the first lecture where
I made a quick appearance. I''m also going to give the tortillas in the last third
of this course. So to give you a little bit about me, I''m a old student here
with better Bulman and my research centres on casual inference applied to biomedical
disasters, so that could be genomics or that could be hospital data. If any of
you is interested in writing a bachelor thesis, a semester paper may be mastathesis
about this topic feel for reach out to me. you have my name on models and my email
address you can find in the directory I''d Be very happy to talk about it. you
do not need to be sure about it, we can just have a chat. So with that said, let''s
get on with the lecture. There''s an exciting topic today I''m going to start
by sharing some slides with you and later on during the lecture we''ll move to
the paper. So bear with me for a few seconds. Well, the projector is starting
up. Okay, so let''s get started. Today''s topic is a very important one. It''s
about a technique which really forms one of the fundamentals of data science,
machine learning, and any sort of modern statistics. It''s called cross validation.
I know you really want to understand this topic I Want you to understand this
and frankly, nobody''s gonna leave Professor Mineshousen''s class without understanding
cross validation. So to set the stage for this, I Want to introduce you to the
validation problem in computational statistics. So the problem is the following:
You trained a model on available data. You fitted your model, but you know the
training data you got could always have been different and some data from the
environment. Maybe it''s a random process. You do not really know what it is,
but you know that somebody else who gets a different batch of data from the same
environment they would get slightly different training data and you do not care
that your method performs as well. On this training data. you want to to perform
well on other data that you have not seen other data from the same environment.
So in other words, the validation problem is you want to quantify the performance
of your model on data that you have not seen. So how is this even possible? How
could you possibly measure the performance on data that you do not know The solution
to? This is the following realization is that given that you have a bunch of data,
you were in charge. You get to control how much that your model sees. It works
in the following way: You can hide data firms model. Let''s say you have a training
data set which is a bunch of doubtless so X eyes are the features those are typically
hide and national vector. It''s got more than one dimension for sure. And the
why why eyes. Those are the labels for supervised learning. As you''ve seen before,
it''s the same set up as we have in regression. And so you have this training
data and now you choose that you only use some of those data to fit your model.
You''re not going to use everything, you only use some of it the other part you
hide from your model. And then you can use this hidden data to do validation from
the point of you of your model. This hidden data is complete by unseen. In other
words, we solve our problem of validation.'
example_title: transcribed audio - lecture
- text: 'Transformer-based models have shown to be very useful for many NLP tasks.
However, a major limitation of transformers-based models is its O(n^2)O(n 2) time
& memory complexity (where nn is sequence length). Hence, it''s computationally
very expensive to apply transformer-based models on long sequences n > 512n>512.
Several recent papers, e.g. Longformer, Performer, Reformer, Clustered attention
try to remedy this problem by approximating the full attention matrix. You can
checkout 🤗''s recent blog post in case you are unfamiliar with these models.
BigBird (introduced in paper) is one of such recent models to address this issue.
BigBird relies on block sparse attention instead of normal attention (i.e. BERT''s
attention) and can handle sequences up to a length of 4096 at a much lower computational
cost compared to BERT. It has achieved SOTA on various tasks involving very long
sequences such as long documents summarization, question-answering with long contexts.
BigBird RoBERTa-like model is now available in 🤗Transformers. The goal of this
post is to give the reader an in-depth understanding of big bird implementation
& ease one''s life in using BigBird with 🤗Transformers. But, before going into
more depth, it is important to remember that the BigBird''s attention is an approximation
of BERT''s full attention and therefore does not strive to be better than BERT''s
full attention, but rather to be more efficient. It simply allows to apply transformer-based
models to much longer sequences since BERT''s quadratic memory requirement quickly
becomes unbearable. Simply put, if we would have ∞ compute & ∞ time, BERT''s attention
would be preferred over block sparse attention (which we are going to discuss
in this post).
If you wonder why we need more compute when working with longer sequences, this
blog post is just right for you!
Some of the main questions one might have when working with standard BERT-like
attention include:
Do all tokens really have to attend to all other tokens? Why not compute attention
only over important tokens? How to decide what tokens are important? How to attend
to just a few tokens in a very efficient way? In this blog post, we will try to
answer those questions.
What tokens should be attended to? We will give a practical example of how attention
works by considering the sentence ''BigBird is now available in HuggingFace for
extractive question answering''. In BERT-like attention, every word would simply
attend to all other tokens.
Let''s think about a sensible choice of key tokens that a queried token actually
only should attend to by writing some pseudo-code. Will will assume that the token
available is queried and build a sensible list of key tokens to attend to.
>>> # let''s consider following sentence as an example >>> example = [''BigBird'',
''is'', ''now'', ''available'', ''in'', ''HuggingFace'', ''for'', ''extractive'',
''question'', ''answering'']
>>> # further let''s assume, we''re trying to understand the representation of
''available'' i.e. >>> query_token = ''available'' >>> # We will initialize an
empty `set` and fill up the tokens of our interest as we proceed in this section.
>>> key_tokens = [] # => currently ''available'' token doesn''t have anything
to attend Nearby tokens should be important because, in a sentence (sequence of
words), the current word is highly dependent on neighboring past & future tokens.
This intuition is the idea behind the concept of sliding attention.'
example_title: bigbird blog intro
- text: 'To be fair, you have to have a very high IQ to understand Rick and Morty.
The humour is extremely subtle, and without a solid grasp of theoretical physics
most of the jokes will go over a typical viewer''s head. There''s also Rick''s
nihilistic outlook, which is deftly woven into his characterisation- his personal
philosophy draws heavily from Narodnaya Volya literature, for instance. The fans
understand this stuff; they have the intellectual capacity to truly appreciate
the depths of these jokes, to realise that they''re not just funny- they say something
deep about LIFE. As a consequence people who dislike Rick & Morty truly ARE idiots-
of course they wouldn''t appreciate, for instance, the humour in Rick''s existential
catchphrase ''Wubba Lubba Dub Dub,'' which itself is a cryptic reference to Turgenev''s
Russian epic Fathers and Sons. I''m smirking right now just imagining one of those
addlepated simpletons scratching their heads in confusion as Dan Harmon''s genius
wit unfolds itself on their television screens. What fools.. how I pity them.
😂
And yes, by the way, i DO have a Rick & Morty tattoo. And no, you cannot see it.
It''s for the ladies'' eyes only- and even then they have to demonstrate that
they''re within 5 IQ points of my own (preferably lower) beforehand. Nothin personnel
kid 😎'
example_title: Richard & Mortimer
parameters:
max_length: 48
min_length: 2
no_repeat_ngram_size: 3
encoder_no_repeat_ngram_size: 3
early_stopping: true
length_penalty: 0.1
num_beams: 2
model-index:
- name: pszemraj/pegasus-x-large-book-summary
results:
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: test
metrics:
- type: rouge
value: 33.1401
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYjQ1NjY1OGVjYWEwMzBjMzk3ZmMyZDA0ZTcxOTdmZTUxNTc0OGYxYmY3MzJkMzFmYTVjNzU2ZTk4MzE0NWMzMSIsInZlcnNpb24iOjF9.PSHB6DMF6tkwSw5nsFE57a2ApRAy_tkS6ziKA6PSTWddEdaqfca4pfig6_olmRmcS4KxN6HHcsmioHzv4LJQBw
- type: rouge
value: 9.3095
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzk3MTA3NmY1OGE3MzFjZTJhYWYzNGU4NTUzMTgwM2Y1NWZjMmEyNDNmNmEzYmQzZThjOGExMjc2ZjAyZjMzZCIsInZlcnNpb24iOjF9.tfgp8p-WlkVrfducTSg4zs-byeZMCmdZw1aizPQHXm_qRAwGtKcuVkZcmza5Y3o3VqsAEmGzg5HQD1vnZvWIDA
- type: rouge
value: 24.8552
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTVmMTIwNDQwNTI4MmI2MmY1ODc1Mjk0NGQ5ZWE4ZTYzOGNkMjY2ZmJhMjg2MTZlNTdhYTA2ZDAxNTFjMjA2MSIsInZlcnNpb24iOjF9.9HLgy9842oIDm6ABb3L94R1P4zAqTI0QN8aP62xzIyDxUXTbWw68PEDufYLiBJbTgZ8ElopZ9I7aou2zCgXeAA
- type: rouge
value: 29.0391
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmNhYWJjYjdjMzMxMmE4ZTE4NGEzMDdmZDZjODI5ZWRjZWJmYTEyZGIzYWQ2NjM3YzQ4MjI4ZTM4MmU5MzRjZSIsInZlcnNpb24iOjF9.d2yoVdmxjVJnsgIYFiLuaBO5Krgw4Axl5yeOSTKrvHygrAxoqT1nl4anzQiyoR3PwYBXwBkwmgpJUfZ7RNXtDQ
- type: loss
value: 2.288182497024536
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzM5NGIwODMxOTA3MTY3ODc2ZDczYTNmMTMwM2QyZmNlZjFmZDJjMGY3NWNkMDEyYzA4OTA2ZDRiODY3Zjg4OCIsInZlcnNpb24iOjF9.8k9mC050OS7mQSR9oA8liDRDQvEx1VxmTXGLmDYJVYYtTh2HYJFGP8Vy_krocFRIYDxh-IHPEOOSr5NrLMWHBA
- type: gen_len
value: 45.2173
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNWZhNzQ5OTQ5Yjg5YjhlOTZiZmJhZjZiODNmY2E2OTg4YTg4NWVhYzRkNzM2Mzk4NzdlMDgxM2M4NjY2YzhhYSIsInZlcnNpb24iOjF9.tDEEsPUclZDygAdGhNrBGrF24vR8ao08Nw7hmtUt5lmSZZZK_u-8rpz97QgVS6MCJdjFVnbYC4bkFnlQWI_FAA
- task:
type: summarization
name: Summarization
dataset:
name: launch/gov_report
type: launch/gov_report
config: plain_text
split: test
metrics:
- type: rouge
value: 39.7279
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTAxODk3OTUwMTIzODU3NzU2YzAzZjE2NTM3MzBjNDA0ZWRmZGU3NWUzNTg1YThhNDQ1NjQ5ZmM3OWI2YzBhNSIsInZlcnNpb24iOjF9.vnNKucBNt2-nIyODj9P2HeaWPX5AQR8L-DL8QzrO7kj58-vZnjT6hsAGmepRNzdZ1TLF-3j2J2plcNJ8lUO8Dg
- type: rouge
value: 10.8944
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjYzMmIxOTJmZjkxOGI5N2U0NTRmMmQwOGJhMzMxYWIzMWMzYzUwMDEyMDdiZDQ2YTUzOWU0OTViMTI2YTAwYiIsInZlcnNpb24iOjF9.De0PaAikWqfWpoIXTCYP-mSFu3PUATLX08Qq74OHXM8784heFVDX1E1sXlh_QbbKJbuMuZtTKM4qr7oLUizOAw
- type: rouge
value: 19.7018
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzI3MjQzOGQ3MGE3NDNkZTEyMWRkYjUyYTYzNDEwOWVjMGFmNTBiZjE4ZTBhMGYzMmI1Yzk0YjBmYmIzMWMxZSIsInZlcnNpb24iOjF9.FVikJ5Ma0gUgM-tpbomWXnC4jtmvhxqikPqCk84t4IbIdU0CIYGTQEONiz-VqI0fJeNrnTS6lxpBv7XxKoq3BQ
- type: rouge
value: 36.5634
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTI2OTVmNDZiZWE5ZjNkODIwZjJiNTU2ZjJjYjczODUwM2JiNDEzYmE3N2U5YWM5NzJjOWEzMmYzZjdlYWJmYyIsInZlcnNpb24iOjF9.poR4zcqRvdaierfWFdTa53Cv6ZbNbnRwyRTi9HukHF5AWAQgc6zpBLkwOYFYoWjuSH83ohWeMM3MoIdw3zypBw
- type: loss
value: 2.473011016845703
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDFmMjg3NWQ2YTMxMTc1OGZiYWYzNjg5NDY3MWE4MjY5ZDQxZDZhZGI1OTc5MzZkZGEzYmVlNWFiMzZjNDdhNCIsInZlcnNpb24iOjF9.05nKB3SmEfFKSduJqlleF4Fd2_IhwJS8eTOrnzZYCQQfLCfpJAZLhp3eLQCuBY4htd-FNrZftrThL66zVxyrCQ
- type: gen_len
value: 212.8243
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOGNjMTg4ZDZlZjAxZGNhN2M0NWI0ZTA0OWEzNDkzNDAzOTJhODA2MmVkODI4YjYzN2FiOTU1ZDMwM2VlNWMyYyIsInZlcnNpb24iOjF9.WYx6XJFKokY2heoN-jpAMp1Z1gsyJus3zpktQgNd0FOYJxOUqW40A0kkHtd15y4dUhsbccLpuJGY1fNJgHOiDw
- task:
type: summarization
name: Summarization
dataset:
name: billsum
type: billsum
config: default
split: test
metrics:
- type: rouge
value: 42.1065
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDJhNDM2MWEwMjJlYjRmZTVkYzljODcwMzlmMGUxMDA4ZmRjNjM0NmY3ZWJlMmZjNGI3NDQ3NTQyOTQ3MjBkNSIsInZlcnNpb24iOjF9.l1MiZbXyFyXAcsfFChMrTvSaBhzBR6AuDnBuII8zY3Csz3ShWK0vo09MkQdZ1epe8PKWV9wwUBuJyKk3wL7MDw
- type: rouge
value: 15.4079
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTY3NDBkYTVkNjdhY2I0ZmY0NTA4YzVkMGE5YWE5ODdjOGE1MDhkOTJhOWY3NmI2ZWI1MGU2MGI1NDRlYjI3MSIsInZlcnNpb24iOjF9.VN-5eK2SzFDCJnFTHHu7XCU_lynaxW_JEDc3llmcNo_ffDgRmISHHGaqV7fPFymBBMXpPly7XblO_sukyqj1Cg
- type: rouge
value: 24.8814
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDYyNGZmNDY3MTY4YzI4ZjZhODE0NGIyN2ZkOGEyYzM3MWZjM2QzZTg5ZjNmZmYzZDE5NzhiZDQ4OGM1YjNiMyIsInZlcnNpb24iOjF9.L73M1M5XdMQkf8zSdfLN0MUrxtO0r6UiLjoOkHfrIGbWNsNJ8tU5lciYFNIhJrICUL8LchCsFqR9LAClKS4bCg
- type: rouge
value: 36.0375
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTBlMTQ5OTQxNTA3ZmFiMGYyZWQ0MGM0ODY2YWI3MzgyNjkwNzQyM2FmNGRjMzc3MjJmZDZkOWY4M2RhZTg2MSIsInZlcnNpb24iOjF9.IiMSSVahBgH8n34bGCC_DDGpujDXQbIvGhlcpVV2EBVQLLWUqcCy5WwBdbRrxPC-asBRCNERQxj8Uii4FvPsDQ
- type: loss
value: 1.9130958318710327
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTg2NTMxZDE3MDg3MDFkMTYxNjY1OTc5YjQ4ODcyMGUxMTFiZjJiNDgyYWZhN2NjZmE1MDQ1NTRmZGY0NjQzZSIsInZlcnNpb24iOjF9.kADUBMO8i6-oGDDt1cOiGMrGcMkF_Qc1jSpS2NSFyksDRusQa_YuuShefF4DuHVEr3CS0hNjjRH9_JBeX9ZQDg
- type: gen_len
value: 179.2184
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjM4NGNiMTY3YzZjMzg4MTRiMDdiZDFiMzA1ZDIyMDM2MDk1OWRhYWQzN2UxZDNlODIxOWVhY2JlYjk4Mjk5YyIsInZlcnNpb24iOjF9.nU8ImMNWgjg9BKjUBJQLFaJOBq3kyIne8ldlpL0OV0e4888wOntIAcJP0dCCYfRSLVmZuXQ1M8cpDuTf50hNCw
- task:
type: summarization
name: Summarization
dataset:
name: kmfoda/booksum
type: kmfoda/booksum
config: kmfoda--booksum
split: test
metrics:
- type: rouge
value: 35.2154
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWQ5MGMzNDc4MDBiNmRiNDY5ZDM4N2QzYTJlYTNiYTcwNDBlMzdlM2I4N2VmM2ZjMmQ3NGU3OTRlMTMzMTg3NyIsInZlcnNpb24iOjF9.E55gu7HvMwc4HejF3YOD6yqQJj7_6GCoCMWm78sY5_w2glR-oM98tu9IsG27VaPva7UklxsspzT2DIVaVKY0CQ
- type: rouge
value: 6.8702
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjFhN2JlYzlmMGZmYzkwYjBlNjY4YzhlYzNmMTdmZWYyYmU3NWI0ZTRkMTgxNmRiM2EyZWMyMWFjY2JkNzg1MCIsInZlcnNpb24iOjF9.I9BoHbGt8LLNtLAssIXm9tQ4lHqFCMt0zJS_zTezzxGRMS5On71c3jnlzrDtwEm6wjmZEwYIJK8qqJh-Qa5YAA
- type: rouge
value: 17.6693
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOGZlZjcwOTZjMmNjZWFkM2M5Zjg1OTgzMzcxOTM2Y2RkMzY4NGU2NDE2MTVjMjcyMWIwNWI4ODc0YTY3YTA2MSIsInZlcnNpb24iOjF9.Ou1C6U6PrOtXPxlk9PMucdJ_vlnVnSk94QrLJL4b_g2pcY3D80Xrw09iz4BTOPzZ2UTNBLyn8YdLY3m2vHpiAQ
- type: rouge
value: 32.8365
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmIzMGQ5MzQ1MjI4MTU0ZGZkZTRhODllNWQyOTQ4ZjA5YWE4ZTJjMzQ2ZWQzOGFiMWUzZDMxOTU5NzkxYjliZiIsInZlcnNpb24iOjF9.2mYURQZYo7e3AY0tfkpqFMNhoHvrysvBXza-XYYrX_xLpruMU9Gzrwc3jvpi2wtp4eeyhzIiZJvH0O6la6zxCg
- type: loss
value: 2.9878039360046387
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGU0ODBmN2I3OGFkNTFiM2I3YWQyNmUzNzUwYzEwNzczZWEwZjIxYTAwZDE2ZTIwMGE3ZGNmMDQzNTFmNjEwYyIsInZlcnNpb24iOjF9.0IKWIImKTXqysQUb2IMPk2eeHlOcBjndiPcU42nfFBMhRTqeXdBqOCP6cidlho7pVN4hsC-77ArJ9pZlbTFuBg
- type: gen_len
value: 200.6785
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDUzYTE3MmIxZGM3MWI1MjNhMTU3MTdkMjJjNjY5Y2UzYTdjYWRiY2I4MmUxMDY4NTA5NWZjYWU0NzliODdkYiIsInZlcnNpb24iOjF9.BqmCaWzbCMNUied6zNO744Dl-0LC47FCIv-l8kDjkhSkwQcb_hi93VYts5PTsrFY_MmM8j7AsY1PiFr6nNFMBQ
- task:
type: summarization
name: Summarization
dataset:
name: big_patent
type: big_patent
config: y
split: test
metrics:
- type: rouge
value: 37.376
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWI4ZjMxODcxMThiMzE3NjQ3Zjg0NzhmZjlhY2ZmYjQwMGY5ZjlkZGY1MzZmY2M5YTU4NmY1Y2NhZDA3YWFkOCIsInZlcnNpb24iOjF9.sYh4IynXgOpVetYYSWUp0v5QZWvXC1x7_uJR0LZUxaeYKEc4yfICNmDOPzNzoroaV4ELeOaPjHQpYVm-lpAHBA
- type: rouge
value: 11.4432
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTZkOGIyYzU3YTQ5ZTFmMDU3MjQ5ZWM2NGQ1MzgwMDYyZDkxN2Q2YjgyZTkzMTEyYjczMGJiYmNkZmU5MTQ3NSIsInZlcnNpb24iOjF9.Qk38acpjPjU64Z1nXEuqMXjKZrGvdC9oY586EjuCPeEAJCSzKimp8FsB-1QrjMH73q6rN2CdumJUxih6HF-KAA
- type: rouge
value: 22.2754
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNzlmOTUxYmEzYzYyYmVjNGZlNzNiZWIwZmQ5OWVlY2U3NTBiZDExYWUwODQ0Y2ZjMmQyMTNmMTlmNjdmZWUwNCIsInZlcnNpb24iOjF9.bUVhxaepySyaityby71j6h4YO_l4x8OSeZoblagwUMYGXRc0Ej286QzEtZFeRGygMJ5sjUN_loWCtOmAnHY2BA
- type: rouge
value: 32.5087
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDEyNjM5NjAzYTNjN2MwZTY4MWY2Y2U5YWUyM2Y1YjAyNjBhZTM0YTAyZjM5N2M1ZDkxOWUxNzE2OWZkYTBmMSIsInZlcnNpb24iOjF9.QfMHkcoAR3xqzsgL1xjHk3Lui1xhE12pJKvYujQ_h5o6PBXT79dsENsrqDGGBjiKdTKNwWqADgaviy1VrWMDCQ
- type: loss
value: 2.9867310523986816
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTUzM2Q5MmE5MzU4YmFlMjFiMmUzZGU2NDAzMTQ1Y2NjZDVlYWI3NGE5MjM0NmMxMjdiOWI3MTU0NDk3NmNkZiIsInZlcnNpb24iOjF9.VoQqu6ZU3AR_cji82UkpvbLnTmZ17fZmR2E4DeonjCyTZpyyfvUsQ2nbKDovQf34DBkYXENk42EUsUF1mBZNBg
- type: gen_len
value: 172.7776
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTEzNTMyMDY1N2Q5ZTMxNjNlMTI0Nzk5ZDc1ZWQ5Y2IwZWM0NWNhNWY2MTk3YTRkYzUwMTI4NjZiOWVhOGQwYSIsInZlcnNpb24iOjF9.-Rek2VFmGqIEgqeFoxU_0aCWdFbGYi9BV5c7x-izm9_4vtZdYQ4ITXm4T8C3UlpOax60veJQt2Uax5vyiFc9Ag
---
# pszemraj/pegasus-x-large-book-summary
<a href="https://colab.research.google.com/gist/pszemraj/6c326c0649233ab017d63adc36958d1a/pegasus-x-large-booksum-demo.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
Get SparkNotes-esque summaries of arbitrary text! Due to the model size, it's recommended to try it out in Colab (linked above), as the hosted inference API widget may time out.
This model is a fine-tuned version of [google/pegasus-x-large](https://huggingface.co/google/pegasus-x-large) on the `kmfoda/booksum` dataset for approximately eight epochs.
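As a quick start, the model can be loaded with the `transformers` summarization `pipeline`; the snippet below is a minimal sketch, and the generation settings shown are illustrative rather than the tuned defaults.

```python
# Minimal usage sketch; generation settings here are illustrative, not the tuned defaults.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="pszemraj/pegasus-x-large-book-summary",
)

long_text = "..."  # put the document you want summarized here
result = summarizer(
    long_text,
    max_length=256,            # cap on summary length (tokens)
    no_repeat_ngram_size=3,    # discourage repeated phrases
    truncation=True,           # truncate inputs longer than the model's max input length
)
print(result[0]["summary_text"])
```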
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
#### Epochs 1-4
TODO
#### Epochs 5 & 6
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: _ADAN_ using lucidrains' `adan-pytorch` with default betas (a minimal setup sketch follows this list)
- lr_scheduler_type: constant_with_warmup
- data type: TF32
- num_epochs: 2
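The _ADAN_ optimizer above comes from lucidrains' `adan-pytorch` package. Below is a minimal sketch of constructing it, assuming that package's `Adan` class; betas are left at the package defaults, matching the "default betas" noted in the list.

```python
# Sketch of the optimizer setup described above, assuming lucidrains' adan-pytorch API.
# Betas are left at the package defaults ("default betas" in the hyperparameter list).
from adan_pytorch import Adan
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("google/pegasus-x-large")
optimizer = Adan(model.parameters(), lr=6e-5)  # learning_rate from the list above
```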
#### Epochs 7 & 8
- epochs 5 & 6 were trained with a 12288-token input length
- epochs 7 & 8 address this, training for 2 further epochs at the full 16384-token input length
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: _ADAN_ using lucidrains' `adan-pytorch` with default betas
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03 (see the scheduler sketch after this list)
- num_epochs: 2
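The warmup for the cosine schedule is given as a ratio of total training steps; the sketch below converts that ratio into warmup steps with `transformers`. The step count and the stand-in optimizer are assumptions for illustration only (the actual run used Adan, and the true step count depends on dataset size, batch size, and epochs).

```python
# Sketch: cosine LR schedule with warmup derived from lr_scheduler_warmup_ratio = 0.03.
# The optimizer and total step count below are illustrative stand-ins.
import torch
from transformers import get_cosine_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]      # stand-in for model.parameters()
optimizer = torch.optim.AdamW(params, lr=4e-4)     # stand-in; the run used Adan
num_training_steps = 10_000                        # assumed; depends on data/batch/epochs
num_warmup_steps = int(0.03 * num_training_steps)  # warmup_ratio -> warmup steps
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=num_warmup_steps,
    num_training_steps=num_training_steps,
)
```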
### Framework versions
- Transformers 4.22.0
- Pytorch 1.11.0a0+17540c5
- Datasets 2.4.0
- Tokenizers 0.12.1
| [
"QUESTION_ANSWERING",
"SUMMARIZATION"
] | [
"BEAR"
] |
prdev/mini-gte | prdev | sentence-similarity | [
"sentence-transformers",
"safetensors",
"distilbert",
"sentence-similarity",
"feature-extraction",
"mteb",
"en",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2025-01-29T04:53:28 | 2025-02-07T19:48:39 | 1,247 | 1 | ---
base_model: distilbert/distilbert-base-uncased
language:
- en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- mteb
model-index:
- name: prdev/mini-gte
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 74.8955
- type: f1
value: 68.84209999999999
- type: f1_weighted
value: 77.1819
- type: ap
value: 37.731500000000004
- type: ap_weighted
value: 37.731500000000004
- type: main_score
value: 74.8955
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification (default)
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 92.9424
- type: f1
value: 92.9268
- type: f1_weighted
value: 92.9268
- type: ap
value: 89.2255
- type: ap_weighted
value: 89.2255
- type: main_score
value: 92.9424
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 53.09199999999999
- type: f1
value: 52.735299999999995
- type: f1_weighted
value: 52.735299999999995
- type: main_score
value: 53.09199999999999
- task:
type: Retrieval
dataset:
name: MTEB ArguAna (default)
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: ndcg_at_1
value: 31.791999999999998
- type: ndcg_at_3
value: 47.205999999999996
- type: ndcg_at_5
value: 51.842999999999996
- type: ndcg_at_10
value: 56.614
- type: ndcg_at_20
value: 59.211999999999996
- type: ndcg_at_100
value: 60.148999999999994
- type: ndcg_at_1000
value: 60.231
- type: map_at_1
value: 31.791999999999998
- type: map_at_3
value: 43.35
- type: map_at_5
value: 45.928000000000004
- type: map_at_10
value: 47.929
- type: map_at_20
value: 48.674
- type: map_at_100
value: 48.825
- type: map_at_1000
value: 48.827999999999996
- type: recall_at_1
value: 31.791999999999998
- type: recall_at_3
value: 58.392999999999994
- type: recall_at_5
value: 69.63000000000001
- type: recall_at_10
value: 84.211
- type: recall_at_20
value: 94.23899999999999
- type: recall_at_100
value: 99.004
- type: recall_at_1000
value: 99.644
- type: precision_at_1
value: 31.791999999999998
- type: precision_at_3
value: 19.464000000000002
- type: precision_at_5
value: 13.926
- type: precision_at_10
value: 8.421
- type: precision_at_20
value: 4.712000000000001
- type: precision_at_100
value: 0.9900000000000001
- type: precision_at_1000
value: 0.1
- type: mrr_at_1
value: 32.4324
- type: mrr_at_3
value: 43.6463
- type: mrr_at_5
value: 46.1569
- type: mrr_at_10
value: 48.1582
- type: mrr_at_20
value: 48.9033
- type: mrr_at_100
value: 49.0537
- type: mrr_at_1000
value: 49.0569
- type: nauc_ndcg_at_1_max
value: -4.8705
- type: nauc_ndcg_at_1_std
value: -9.1757
- type: nauc_ndcg_at_1_diff1
value: 17.743000000000002
- type: nauc_ndcg_at_3_max
value: -3.916
- type: nauc_ndcg_at_3_std
value: -10.424
- type: nauc_ndcg_at_3_diff1
value: 12.3928
- type: nauc_ndcg_at_5_max
value: -2.5090000000000003
- type: nauc_ndcg_at_5_std
value: -10.1328
- type: nauc_ndcg_at_5_diff1
value: 13.3086
- type: nauc_ndcg_at_10_max
value: -1.4653
- type: nauc_ndcg_at_10_std
value: -9.3154
- type: nauc_ndcg_at_10_diff1
value: 13.7827
- type: nauc_ndcg_at_20_max
value: -2.4534000000000002
- type: nauc_ndcg_at_20_std
value: -9.0213
- type: nauc_ndcg_at_20_diff1
value: 13.764399999999998
- type: nauc_ndcg_at_100_max
value: -2.8207
- type: nauc_ndcg_at_100_std
value: -9.0492
- type: nauc_ndcg_at_100_diff1
value: 14.3422
- type: nauc_ndcg_at_1000_max
value: -3.0108
- type: nauc_ndcg_at_1000_std
value: -9.2507
- type: nauc_ndcg_at_1000_diff1
value: 14.2345
- type: nauc_map_at_1_max
value: -4.8705
- type: nauc_map_at_1_std
value: -9.1757
- type: nauc_map_at_1_diff1
value: 17.743000000000002
- type: nauc_map_at_3_max
value: -4.2874
- type: nauc_map_at_3_std
value: -10.1539
- type: nauc_map_at_3_diff1
value: 13.6101
- type: nauc_map_at_5_max
value: -3.5856
- type: nauc_map_at_5_std
value: -9.9657
- type: nauc_map_at_5_diff1
value: 14.1354
- type: nauc_map_at_10_max
value: -3.2553
- type: nauc_map_at_10_std
value: -9.6771
- type: nauc_map_at_10_diff1
value: 14.402899999999999
- type: nauc_map_at_20_max
value: -3.5541000000000005
- type: nauc_map_at_20_std
value: -9.6286
- type: nauc_map_at_20_diff1
value: 14.3927
- type: nauc_map_at_100_max
value: -3.5811999999999995
- type: nauc_map_at_100_std
value: -9.6278
- type: nauc_map_at_100_diff1
value: 14.4922
- type: nauc_map_at_1000_max
value: -3.5881000000000003
- type: nauc_map_at_1000_std
value: -9.6335
- type: nauc_map_at_1000_diff1
value: 14.488400000000002
- type: nauc_recall_at_1_max
value: -4.8705
- type: nauc_recall_at_1_std
value: -9.1757
- type: nauc_recall_at_1_diff1
value: 17.743000000000002
- type: nauc_recall_at_3_max
value: -2.7195
- type: nauc_recall_at_3_std
value: -11.2342
- type: nauc_recall_at_3_diff1
value: 8.7116
- type: nauc_recall_at_5_max
value: 1.7492
- type: nauc_recall_at_5_std
value: -10.6963
- type: nauc_recall_at_5_diff1
value: 10.569
- type: nauc_recall_at_10_max
value: 10.7433
- type: nauc_recall_at_10_std
value: -6.339599999999999
- type: nauc_recall_at_10_diff1
value: 10.6275
- type: nauc_recall_at_20_max
value: 14.802499999999998
- type: nauc_recall_at_20_std
value: 3.9196
- type: nauc_recall_at_20_diff1
value: 6.0286
- type: nauc_recall_at_100_max
value: 40.8859
- type: nauc_recall_at_100_std
value: 57.965500000000006
- type: nauc_recall_at_100_diff1
value: 30.7703
- type: nauc_recall_at_1000_max
value: 24.2175
- type: nauc_recall_at_1000_std
value: 70.9234
- type: nauc_recall_at_1000_diff1
value: 5.9272
- type: nauc_precision_at_1_max
value: -4.8705
- type: nauc_precision_at_1_std
value: -9.1757
- type: nauc_precision_at_1_diff1
value: 17.743000000000002
- type: nauc_precision_at_3_max
value: -2.7195
- type: nauc_precision_at_3_std
value: -11.2342
- type: nauc_precision_at_3_diff1
value: 8.7116
- type: nauc_precision_at_5_max
value: 1.7492
- type: nauc_precision_at_5_std
value: -10.6963
- type: nauc_precision_at_5_diff1
value: 10.569
- type: nauc_precision_at_10_max
value: 10.7433
- type: nauc_precision_at_10_std
value: -6.339599999999999
- type: nauc_precision_at_10_diff1
value: 10.6275
- type: nauc_precision_at_20_max
value: 14.802499999999998
- type: nauc_precision_at_20_std
value: 3.9196
- type: nauc_precision_at_20_diff1
value: 6.0286
- type: nauc_precision_at_100_max
value: 40.8859
- type: nauc_precision_at_100_std
value: 57.965500000000006
- type: nauc_precision_at_100_diff1
value: 30.7703
- type: nauc_precision_at_1000_max
value: 24.2175
- type: nauc_precision_at_1000_std
value: 70.9234
- type: nauc_precision_at_1000_diff1
value: 5.9272
- type: nauc_mrr_at_1_max
value: -5.1491
- type: nauc_mrr_at_1_std
value: -8.8127
- type: nauc_mrr_at_1_diff1
value: 15.857099999999999
- type: nauc_mrr_at_3_max
value: -5.083200000000001
- type: nauc_mrr_at_3_std
value: -9.8967
- type: nauc_mrr_at_3_diff1
value: 11.9042
- type: nauc_mrr_at_5_max
value: -4.530399999999999
- type: nauc_mrr_at_5_std
value: -9.900599999999999
- type: nauc_mrr_at_5_diff1
value: 12.2957
- type: nauc_mrr_at_10_max
value: -4.2387
- type: nauc_mrr_at_10_std
value: -9.6123
- type: nauc_mrr_at_10_diff1
value: 12.4769
- type: nauc_mrr_at_20_max
value: -4.5254
- type: nauc_mrr_at_20_std
value: -9.5502
- type: nauc_mrr_at_20_diff1
value: 12.4674
- type: nauc_mrr_at_100_max
value: -4.5576
- type: nauc_mrr_at_100_std
value: -9.549100000000001
- type: nauc_mrr_at_100_diff1
value: 12.556899999999999
- type: nauc_mrr_at_1000_max
value: -4.5645999999999995
- type: nauc_mrr_at_1000_std
value: -9.5548
- type: nauc_mrr_at_1000_diff1
value: 12.552900000000001
- type: main_score
value: 56.614
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P (default)
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 47.2524
- type: v_measure_std
value: 13.7772
- type: main_score
value: 47.2524
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S (default)
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 40.7262
- type: v_measure_std
value: 14.125499999999999
- type: main_score
value: 40.7262
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions (default)
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 61.57319999999999
- type: mrr
value: 74.6714
- type: nAUC_map_max
value: 21.8916
- type: nAUC_map_std
value: 17.9941
- type: nAUC_map_diff1
value: 1.5548
- type: nAUC_mrr_max
value: 34.139399999999995
- type: nAUC_mrr_std
value: 18.133499999999998
- type: nAUC_mrr_diff1
value: 13.3597
- type: main_score
value: 61.57319999999999
- task:
type: STS
dataset:
name: MTEB BIOSSES (default)
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: pearson
value: 86.7849
- type: spearman
value: 84.7302
- type: cosine_pearson
value: 86.7849
- type: cosine_spearman
value: 84.7302
- type: manhattan_pearson
value: 84.48179999999999
- type: manhattan_spearman
value: 84.0507
- type: euclidean_pearson
value: 84.8613
- type: euclidean_spearman
value: 84.6266
- type: main_score
value: 84.7302
- task:
type: Classification
dataset:
name: MTEB Banking77Classification (default)
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.7175
- type: f1
value: 85.6781
- type: f1_weighted
value: 85.6781
- type: main_score
value: 85.7175
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P (default)
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 40.0588
- type: v_measure_std
value: 0.8872
- type: main_score
value: 40.0588
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S (default)
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.382799999999996
- type: v_measure_std
value: 1.167
- type: main_score
value: 36.382799999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval (default)
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: ndcg_at_1
value: 37.196
- type: ndcg_at_3
value: 42.778
- type: ndcg_at_5
value: 45.013999999999996
- type: ndcg_at_10
value: 47.973
- type: ndcg_at_20
value: 50.141000000000005
- type: ndcg_at_100
value: 53.31399999999999
- type: ndcg_at_1000
value: 55.52
- type: map_at_1
value: 30.598
- type: map_at_3
value: 38.173
- type: map_at_5
value: 40.093
- type: map_at_10
value: 41.686
- type: map_at_20
value: 42.522
- type: map_at_100
value: 43.191
- type: map_at_1000
value: 43.328
- type: recall_at_1
value: 30.598
- type: recall_at_3
value: 45.019999999999996
- type: recall_at_5
value: 51.357
- type: recall_at_10
value: 60.260000000000005
- type: recall_at_20
value: 67.93299999999999
- type: recall_at_100
value: 82.07
- type: recall_at_1000
value: 96.345
- type: precision_at_1
value: 37.196
- type: precision_at_3
value: 20.552999999999997
- type: precision_at_5
value: 14.707
- type: precision_at_10
value: 9.213000000000001
- type: precision_at_20
value: 5.522
- type: precision_at_100
value: 1.4949999999999999
- type: precision_at_1000
value: 0.198
- type: mrr_at_1
value: 37.196
- type: mrr_at_3
value: 44.4683
- type: mrr_at_5
value: 45.9776
- type: mrr_at_10
value: 47.1884
- type: mrr_at_20
value: 47.6763
- type: mrr_at_100
value: 47.957
- type: mrr_at_1000
value: 48.0103
- type: nauc_ndcg_at_1_max
value: 38.1056
- type: nauc_ndcg_at_1_std
value: -1.5731
- type: nauc_ndcg_at_1_diff1
value: 52.3965
- type: nauc_ndcg_at_3_max
value: 35.8655
- type: nauc_ndcg_at_3_std
value: 0.2057
- type: nauc_ndcg_at_3_diff1
value: 46.299600000000005
- type: nauc_ndcg_at_5_max
value: 36.3806
- type: nauc_ndcg_at_5_std
value: 1.542
- type: nauc_ndcg_at_5_diff1
value: 45.3674
- type: nauc_ndcg_at_10_max
value: 36.6053
- type: nauc_ndcg_at_10_std
value: 2.7934
- type: nauc_ndcg_at_10_diff1
value: 45.3474
- type: nauc_ndcg_at_20_max
value: 37.2333
- type: nauc_ndcg_at_20_std
value: 3.3346
- type: nauc_ndcg_at_20_diff1
value: 45.6105
- type: nauc_ndcg_at_100_max
value: 38.168400000000005
- type: nauc_ndcg_at_100_std
value: 4.618
- type: nauc_ndcg_at_100_diff1
value: 45.7041
- type: nauc_ndcg_at_1000_max
value: 37.911
- type: nauc_ndcg_at_1000_std
value: 4.2068
- type: nauc_ndcg_at_1000_diff1
value: 46.0349
- type: nauc_map_at_1_max
value: 33.6794
- type: nauc_map_at_1_std
value: -0.7946
- type: nauc_map_at_1_diff1
value: 55.799699999999994
- type: nauc_map_at_3_max
value: 35.216300000000004
- type: nauc_map_at_3_std
value: -0.3286
- type: nauc_map_at_3_diff1
value: 49.5727
- type: nauc_map_at_5_max
value: 35.583999999999996
- type: nauc_map_at_5_std
value: 0.4626
- type: nauc_map_at_5_diff1
value: 48.621900000000004
- type: nauc_map_at_10_max
value: 35.837
- type: nauc_map_at_10_std
value: 1.1462999999999999
- type: nauc_map_at_10_diff1
value: 48.302499999999995
- type: nauc_map_at_20_max
value: 36.1877
- type: nauc_map_at_20_std
value: 1.5263
- type: nauc_map_at_20_diff1
value: 48.2105
- type: nauc_map_at_100_max
value: 36.452
- type: nauc_map_at_100_std
value: 1.958
- type: nauc_map_at_100_diff1
value: 48.1781
- type: nauc_map_at_1000_max
value: 36.4422
- type: nauc_map_at_1000_std
value: 1.9560000000000002
- type: nauc_map_at_1000_diff1
value: 48.166399999999996
- type: nauc_recall_at_1_max
value: 33.6794
- type: nauc_recall_at_1_std
value: -0.7946
- type: nauc_recall_at_1_diff1
value: 55.799699999999994
- type: nauc_recall_at_3_max
value: 33.591
- type: nauc_recall_at_3_std
value: 0.7802
- type: nauc_recall_at_3_diff1
value: 42.728100000000005
- type: nauc_recall_at_5_max
value: 34.1456
- type: nauc_recall_at_5_std
value: 3.803
- type: nauc_recall_at_5_diff1
value: 39.3889
- type: nauc_recall_at_10_max
value: 34.2228
- type: nauc_recall_at_10_std
value: 7.394399999999999
- type: nauc_recall_at_10_diff1
value: 37.660900000000005
- type: nauc_recall_at_20_max
value: 35.9338
- type: nauc_recall_at_20_std
value: 9.6754
- type: nauc_recall_at_20_diff1
value: 36.626999999999995
- type: nauc_recall_at_100_max
value: 43.0721
- type: nauc_recall_at_100_std
value: 21.493499999999997
- type: nauc_recall_at_100_diff1
value: 34.809
- type: nauc_recall_at_1000_max
value: 61.345499999999994
- type: nauc_recall_at_1000_std
value: 66.2789
- type: nauc_recall_at_1000_diff1
value: 43.5024
- type: nauc_precision_at_1_max
value: 38.1056
- type: nauc_precision_at_1_std
value: -1.5731
- type: nauc_precision_at_1_diff1
value: 52.3965
- type: nauc_precision_at_3_max
value: 31.2978
- type: nauc_precision_at_3_std
value: 0.0904
- type: nauc_precision_at_3_diff1
value: 25.9668
- type: nauc_precision_at_5_max
value: 28.2209
- type: nauc_precision_at_5_std
value: 3.6561000000000003
- type: nauc_precision_at_5_diff1
value: 16.3544
- type: nauc_precision_at_10_max
value: 21.8709
- type: nauc_precision_at_10_std
value: 7.3919
- type: nauc_precision_at_10_diff1
value: 4.4909
- type: nauc_precision_at_20_max
value: 16.3885
- type: nauc_precision_at_20_std
value: 9.8527
- type: nauc_precision_at_20_diff1
value: -3.9433000000000002
- type: nauc_precision_at_100_max
value: 4.612
- type: nauc_precision_at_100_std
value: 6.9627
- type: nauc_precision_at_100_diff1
value: -14.0135
- type: nauc_precision_at_1000_max
value: -10.599699999999999
- type: nauc_precision_at_1000_std
value: -4.5693
- type: nauc_precision_at_1000_diff1
value: -21.0926
- type: nauc_mrr_at_1_max
value: 38.1056
- type: nauc_mrr_at_1_std
value: -1.5731
- type: nauc_mrr_at_1_diff1
value: 52.3965
- type: nauc_mrr_at_3_max
value: 37.4199
- type: nauc_mrr_at_3_std
value: -0.5046
- type: nauc_mrr_at_3_diff1
value: 46.5936
- type: nauc_mrr_at_5_max
value: 38.1046
- type: nauc_mrr_at_5_std
value: 0.8115000000000001
- type: nauc_mrr_at_5_diff1
value: 46.051500000000004
- type: nauc_mrr_at_10_max
value: 37.9372
- type: nauc_mrr_at_10_std
value: 1.0405
- type: nauc_mrr_at_10_diff1
value: 46.085
- type: nauc_mrr_at_20_max
value: 38.0462
- type: nauc_mrr_at_20_std
value: 0.9399
- type: nauc_mrr_at_20_diff1
value: 46.247
- type: nauc_mrr_at_100_max
value: 38.0712
- type: nauc_mrr_at_100_std
value: 1.0857
- type: nauc_mrr_at_100_diff1
value: 46.257999999999996
- type: nauc_mrr_at_1000_max
value: 38.0822
- type: nauc_mrr_at_1000_std
value: 1.0925
- type: nauc_mrr_at_1000_diff1
value: 46.2851
- type: main_score
value: 47.973
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval (default)
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: ndcg_at_1
value: 34.394999999999996
- type: ndcg_at_3
value: 37.994
- type: ndcg_at_5
value: 40.056999999999995
- type: ndcg_at_10
value: 42.174
- type: ndcg_at_20
value: 44.04
- type: ndcg_at_100
value: 46.833999999999996
- type: ndcg_at_1000
value: 49.025999999999996
- type: map_at_1
value: 27.6
- type: map_at_3
value: 34.004
- type: map_at_5
value: 35.592
- type: map_at_10
value: 36.803999999999995
- type: map_at_20
value: 37.508
- type: map_at_100
value: 38.068999999999996
- type: map_at_1000
value: 38.202999999999996
- type: recall_at_1
value: 27.6
- type: recall_at_3
value: 39.684999999999995
- type: recall_at_5
value: 45.397
- type: recall_at_10
value: 51.737
- type: recall_at_20
value: 58.47
- type: recall_at_100
value: 71.42500000000001
- type: recall_at_1000
value: 85.372
- type: precision_at_1
value: 34.394999999999996
- type: precision_at_3
value: 18.279999999999998
- type: precision_at_5
value: 13.096
- type: precision_at_10
value: 8.019
- type: precision_at_20
value: 4.812
- type: precision_at_100
value: 1.344
- type: precision_at_1000
value: 0.182
- type: mrr_at_1
value: 34.3949
- type: mrr_at_3
value: 39.9894
- type: mrr_at_5
value: 41.438399999999994
- type: mrr_at_10
value: 42.3136
- type: mrr_at_20
value: 42.769800000000004
- type: mrr_at_100
value: 43.0583
- type: mrr_at_1000
value: 43.1108
- type: nauc_ndcg_at_1_max
value: 37.1051
- type: nauc_ndcg_at_1_std
value: -1.4586
- type: nauc_ndcg_at_1_diff1
value: 52.3038
- type: nauc_ndcg_at_3_max
value: 35.7717
- type: nauc_ndcg_at_3_std
value: -2.191
- type: nauc_ndcg_at_3_diff1
value: 48.688500000000005
- type: nauc_ndcg_at_5_max
value: 35.6552
- type: nauc_ndcg_at_5_std
value: -2.0198
- type: nauc_ndcg_at_5_diff1
value: 48.308
- type: nauc_ndcg_at_10_max
value: 35.0904
- type: nauc_ndcg_at_10_std
value: -1.3836
- type: nauc_ndcg_at_10_diff1
value: 47.6937
- type: nauc_ndcg_at_20_max
value: 35.6035
- type: nauc_ndcg_at_20_std
value: 0.2853
- type: nauc_ndcg_at_20_diff1
value: 46.705000000000005
- type: nauc_ndcg_at_100_max
value: 36.583
- type: nauc_ndcg_at_100_std
value: 2.7466
- type: nauc_ndcg_at_100_diff1
value: 46.4799
- type: nauc_ndcg_at_1000_max
value: 36.3746
- type: nauc_ndcg_at_1000_std
value: 2.9227
- type: nauc_ndcg_at_1000_diff1
value: 46.6333
- type: nauc_map_at_1_max
value: 29.4449
- type: nauc_map_at_1_std
value: -8.899899999999999
- type: nauc_map_at_1_diff1
value: 55.446799999999996
- type: nauc_map_at_3_max
value: 32.592
- type: nauc_map_at_3_std
value: -6.7539
- type: nauc_map_at_3_diff1
value: 50.857
- type: nauc_map_at_5_max
value: 33.234399999999994
- type: nauc_map_at_5_std
value: -5.8864
- type: nauc_map_at_5_diff1
value: 50.301899999999996
- type: nauc_map_at_10_max
value: 33.6075
- type: nauc_map_at_10_std
value: -4.9146
- type: nauc_map_at_10_diff1
value: 49.8723
- type: nauc_map_at_20_max
value: 34.0783
- type: nauc_map_at_20_std
value: -3.8943
- type: nauc_map_at_20_diff1
value: 49.4751
- type: nauc_map_at_100_max
value: 34.5953
- type: nauc_map_at_100_std
value: -3.0787
- type: nauc_map_at_100_diff1
value: 49.452
- type: nauc_map_at_1000_max
value: 34.6458
- type: nauc_map_at_1000_std
value: -2.9694000000000003
- type: nauc_map_at_1000_diff1
value: 49.467299999999994
- type: nauc_recall_at_1_max
value: 29.4449
- type: nauc_recall_at_1_std
value: -8.899899999999999
- type: nauc_recall_at_1_diff1
value: 55.446799999999996
- type: nauc_recall_at_3_max
value: 31.618800000000004
- type: nauc_recall_at_3_std
value: -6.1698
- type: nauc_recall_at_3_diff1
value: 45.7301
- type: nauc_recall_at_5_max
value: 32.211600000000004
- type: nauc_recall_at_5_std
value: -3.594
- type: nauc_recall_at_5_diff1
value: 43.8823
- type: nauc_recall_at_10_max
value: 31.2112
- type: nauc_recall_at_10_std
value: -0.30860000000000004
- type: nauc_recall_at_10_diff1
value: 41.3329
- type: nauc_recall_at_20_max
value: 32.9024
- type: nauc_recall_at_20_std
value: 5.76
- type: nauc_recall_at_20_diff1
value: 36.8023
- type: nauc_recall_at_100_max
value: 38.7919
- type: nauc_recall_at_100_std
value: 22.4841
- type: nauc_recall_at_100_diff1
value: 33.6918
- type: nauc_recall_at_1000_max
value: 37.6415
- type: nauc_recall_at_1000_std
value: 34.7539
- type: nauc_recall_at_1000_diff1
value: 29.8994
- type: nauc_precision_at_1_max
value: 37.1051
- type: nauc_precision_at_1_std
value: -1.4586
- type: nauc_precision_at_1_diff1
value: 52.3038
- type: nauc_precision_at_3_max
value: 38.8085
- type: nauc_precision_at_3_std
value: 9.067400000000001
- type: nauc_precision_at_3_diff1
value: 32.0198
- type: nauc_precision_at_5_max
value: 38.5842
- type: nauc_precision_at_5_std
value: 14.129
- type: nauc_precision_at_5_diff1
value: 25.2904
- type: nauc_precision_at_10_max
value: 36.321999999999996
- type: nauc_precision_at_10_std
value: 20.381
- type: nauc_precision_at_10_diff1
value: 17.1106
- type: nauc_precision_at_20_max
value: 36.0274
- type: nauc_precision_at_20_std
value: 30.1906
- type: nauc_precision_at_20_diff1
value: 8.752699999999999
- type: nauc_precision_at_100_max
value: 31.626900000000003
- type: nauc_precision_at_100_std
value: 38.6494
- type: nauc_precision_at_100_diff1
value: 2.5243
- type: nauc_precision_at_1000_max
value: 18.869600000000002
- type: nauc_precision_at_1000_std
value: 32.9116
- type: nauc_precision_at_1000_diff1
value: -1.9265999999999999
- type: nauc_mrr_at_1_max
value: 37.1051
- type: nauc_mrr_at_1_std
value: -1.4586
- type: nauc_mrr_at_1_diff1
value: 52.3038
- type: nauc_mrr_at_3_max
value: 37.1104
- type: nauc_mrr_at_3_std
value: 0.3024
- type: nauc_mrr_at_3_diff1
value: 48.6141
- type: nauc_mrr_at_5_max
value: 37.155
- type: nauc_mrr_at_5_std
value: 0.8841
- type: nauc_mrr_at_5_diff1
value: 48.4238
- type: nauc_mrr_at_10_max
value: 36.8581
- type: nauc_mrr_at_10_std
value: 0.9572
- type: nauc_mrr_at_10_diff1
value: 47.9585
- type: nauc_mrr_at_20_max
value: 37.0095
- type: nauc_mrr_at_20_std
value: 1.2396
- type: nauc_mrr_at_20_diff1
value: 47.897099999999995
- type: nauc_mrr_at_100_max
value: 37.0474
- type: nauc_mrr_at_100_std
value: 1.397
- type: nauc_mrr_at_100_diff1
value: 47.8843
- type: nauc_mrr_at_1000_max
value: 37.0388
- type: nauc_mrr_at_1000_std
value: 1.3889
- type: nauc_mrr_at_1000_diff1
value: 47.8923
- type: main_score
value: 42.174
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval (default)
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: ndcg_at_1
value: 44.263000000000005
- type: ndcg_at_3
value: 51.32
- type: ndcg_at_5
value: 54.354
- type: ndcg_at_10
value: 56.855
- type: ndcg_at_20
value: 59.019
- type: ndcg_at_100
value: 61.507999999999996
- type: ndcg_at_1000
value: 62.522
- type: map_at_1
value: 38.821
- type: map_at_3
value: 47.79
- type: map_at_5
value: 49.826
- type: map_at_10
value: 51.129999999999995
- type: map_at_20
value: 51.882
- type: map_at_100
value: 52.321
- type: map_at_1000
value: 52.373000000000005
- type: recall_at_1
value: 38.821
- type: recall_at_3
value: 55.961000000000006
- type: recall_at_5
value: 63.286
- type: recall_at_10
value: 70.408
- type: recall_at_20
value: 78.47
- type: recall_at_100
value: 90.509
- type: recall_at_1000
value: 97.543
- type: precision_at_1
value: 44.263000000000005
- type: precision_at_3
value: 22.926
- type: precision_at_5
value: 16.012999999999998
- type: precision_at_10
value: 9.223
- type: precision_at_20
value: 5.238
- type: precision_at_100
value: 1.246
- type: precision_at_1000
value: 0.13799999999999998
- type: mrr_at_1
value: 44.2633
- type: mrr_at_3
value: 51.7032
- type: mrr_at_5
value: 53.380399999999995
- type: mrr_at_10
value: 54.3026
- type: mrr_at_20
value: 54.797700000000006
- type: mrr_at_100
value: 55.07379999999999
- type: mrr_at_1000
value: 55.0997
- type: nauc_ndcg_at_1_max
value: 36.5201
- type: nauc_ndcg_at_1_std
value: -4.0972
- type: nauc_ndcg_at_1_diff1
value: 49.5567
- type: nauc_ndcg_at_3_max
value: 36.4186
- type: nauc_ndcg_at_3_std
value: -3.2881
- type: nauc_ndcg_at_3_diff1
value: 44.5043
- type: nauc_ndcg_at_5_max
value: 36.8275
- type: nauc_ndcg_at_5_std
value: -2.8840999999999997
- type: nauc_ndcg_at_5_diff1
value: 44.1124
- type: nauc_ndcg_at_10_max
value: 37.8819
- type: nauc_ndcg_at_10_std
value: -1.5313999999999999
- type: nauc_ndcg_at_10_diff1
value: 43.538700000000006
- type: nauc_ndcg_at_20_max
value: 37.9693
- type: nauc_ndcg_at_20_std
value: -0.5973
- type: nauc_ndcg_at_20_diff1
value: 42.9989
- type: nauc_ndcg_at_100_max
value: 38.3465
- type: nauc_ndcg_at_100_std
value: -0.0186
- type: nauc_ndcg_at_100_diff1
value: 43.4551
- type: nauc_ndcg_at_1000_max
value: 38.2222
- type: nauc_ndcg_at_1000_std
value: -0.3677
- type: nauc_ndcg_at_1000_diff1
value: 43.8485
- type: nauc_map_at_1_max
value: 30.3838
- type: nauc_map_at_1_std
value: -6.0729
- type: nauc_map_at_1_diff1
value: 49.9023
- type: nauc_map_at_3_max
value: 34.394000000000005
- type: nauc_map_at_3_std
value: -5.0606
- type: nauc_map_at_3_diff1
value: 46.3459
- type: nauc_map_at_5_max
value: 34.846199999999996
- type: nauc_map_at_5_std
value: -4.6529
- type: nauc_map_at_5_diff1
value: 45.9401
- type: nauc_map_at_10_max
value: 35.6705
- type: nauc_map_at_10_std
value: -3.6452999999999998
- type: nauc_map_at_10_diff1
value: 45.476299999999995
- type: nauc_map_at_20_max
value: 35.951899999999995
- type: nauc_map_at_20_std
value: -3.0703
- type: nauc_map_at_20_diff1
value: 45.2239
- type: nauc_map_at_100_max
value: 36.1499
- type: nauc_map_at_100_std
value: -2.8472
- type: nauc_map_at_100_diff1
value: 45.2281
- type: nauc_map_at_1000_max
value: 36.1684
- type: nauc_map_at_1000_std
value: -2.8369
- type: nauc_map_at_1000_diff1
value: 45.2513
- type: nauc_recall_at_1_max
value: 30.3838
- type: nauc_recall_at_1_std
value: -6.0729
- type: nauc_recall_at_1_diff1
value: 49.9023
- type: nauc_recall_at_3_max
value: 35.4902
- type: nauc_recall_at_3_std
value: -4.166
- type: nauc_recall_at_3_diff1
value: 41.3795
- type: nauc_recall_at_5_max
value: 35.551100000000005
- type: nauc_recall_at_5_std
value: -2.6090999999999998
- type: nauc_recall_at_5_diff1
value: 38.567499999999995
- type: nauc_recall_at_10_max
value: 39.1336
- type: nauc_recall_at_10_std
value: 1.7909000000000002
- type: nauc_recall_at_10_diff1
value: 36.0768
- type: nauc_recall_at_20_max
value: 41.0936
- type: nauc_recall_at_20_std
value: 8.4893
- type: nauc_recall_at_20_diff1
value: 31.3577
- type: nauc_recall_at_100_max
value: 47.2494
- type: nauc_recall_at_100_std
value: 23.6531
- type: nauc_recall_at_100_diff1
value: 28.3733
- type: nauc_recall_at_1000_max
value: 60.132799999999996
- type: nauc_recall_at_1000_std
value: 51.15650000000001
- type: nauc_recall_at_1000_diff1
value: 23.1446
- type: nauc_precision_at_1_max
value: 36.5201
- type: nauc_precision_at_1_std
value: -4.0972
- type: nauc_precision_at_1_diff1
value: 49.5567
- type: nauc_precision_at_3_max
value: 35.43
- type: nauc_precision_at_3_std
value: 2.5281000000000002
- type: nauc_precision_at_3_diff1
value: 26.259900000000002
- type: nauc_precision_at_5_max
value: 33.2373
- type: nauc_precision_at_5_std
value: 6.2754
- type: nauc_precision_at_5_diff1
value: 18.587699999999998
- type: nauc_precision_at_10_max
value: 32.9216
- type: nauc_precision_at_10_std
value: 14.078299999999999
- type: nauc_precision_at_10_diff1
value: 8.0609
- type: nauc_precision_at_20_max
value: 30.7836
- type: nauc_precision_at_20_std
value: 21.0397
- type: nauc_precision_at_20_diff1
value: -1.7804
- type: nauc_precision_at_100_max
value: 25.4678
- type: nauc_precision_at_100_std
value: 25.452399999999997
- type: nauc_precision_at_100_diff1
value: -10.8569
- type: nauc_precision_at_1000_max
value: 20.2269
- type: nauc_precision_at_1000_std
value: 22.9962
- type: nauc_precision_at_1000_diff1
value: -13.309000000000001
- type: nauc_mrr_at_1_max
value: 36.5201
- type: nauc_mrr_at_1_std
value: -4.0972
- type: nauc_mrr_at_1_diff1
value: 49.5567
- type: nauc_mrr_at_3_max
value: 38.4583
- type: nauc_mrr_at_3_std
value: -2.3642
- type: nauc_mrr_at_3_diff1
value: 45.692899999999995
- type: nauc_mrr_at_5_max
value: 38.2616
- type: nauc_mrr_at_5_std
value: -2.1449
- type: nauc_mrr_at_5_diff1
value: 45.217
- type: nauc_mrr_at_10_max
value: 38.5321
- type: nauc_mrr_at_10_std
value: -1.8026
- type: nauc_mrr_at_10_diff1
value: 45.1717
- type: nauc_mrr_at_20_max
value: 38.5499
- type: nauc_mrr_at_20_std
value: -1.6838
- type: nauc_mrr_at_20_diff1
value: 45.1274
- type: nauc_mrr_at_100_max
value: 38.5241
- type: nauc_mrr_at_100_std
value: -1.7292999999999998
- type: nauc_mrr_at_100_diff1
value: 45.183299999999996
- type: nauc_mrr_at_1000_max
value: 38.520900000000005
- type: nauc_mrr_at_1000_std
value: -1.7335
- type: nauc_mrr_at_1000_diff1
value: 45.1948
- type: main_score
value: 56.855
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval (default)
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: ndcg_at_1
value: 28.362
- type: ndcg_at_3
value: 33.555
- type: ndcg_at_5
value: 35.857
- type: ndcg_at_10
value: 38.182
- type: ndcg_at_20
value: 40.181
- type: ndcg_at_100
value: 43.475
- type: ndcg_at_1000
value: 45.512
- type: map_at_1
value: 26.529000000000003
- type: map_at_3
value: 31.413000000000004
- type: map_at_5
value: 32.844
- type: map_at_10
value: 33.884
- type: map_at_20
value: 34.446
- type: map_at_100
value: 34.942
- type: map_at_1000
value: 35.018
- type: recall_at_1
value: 26.529000000000003
- type: recall_at_3
value: 37.313
- type: recall_at_5
value: 42.792
- type: recall_at_10
value: 49.748
- type: recall_at_20
value: 57.199999999999996
- type: recall_at_100
value: 74.118
- type: recall_at_1000
value: 89.593
- type: precision_at_1
value: 28.362
- type: precision_at_3
value: 13.936000000000002
- type: precision_at_5
value: 9.74
- type: precision_at_10
value: 5.7059999999999995
- type: precision_at_20
value: 3.3329999999999997
- type: precision_at_100
value: 0.886
- type: precision_at_1000
value: 0.109
- type: mrr_at_1
value: 28.3616
- type: mrr_at_3
value: 33.5028
- type: mrr_at_5
value: 34.7175
- type: mrr_at_10
value: 35.6453
- type: mrr_at_20
value: 36.2289
- type: mrr_at_100
value: 36.6171
- type: mrr_at_1000
value: 36.681000000000004
- type: nauc_ndcg_at_1_max
value: 31.811099999999996
- type: nauc_ndcg_at_1_std
value: -4.5333
- type: nauc_ndcg_at_1_diff1
value: 48.3941
- type: nauc_ndcg_at_3_max
value: 31.034499999999998
- type: nauc_ndcg_at_3_std
value: -2.444
- type: nauc_ndcg_at_3_diff1
value: 43.8938
- type: nauc_ndcg_at_5_max
value: 31.373800000000003
- type: nauc_ndcg_at_5_std
value: -1.3659
- type: nauc_ndcg_at_5_diff1
value: 42.4021
- type: nauc_ndcg_at_10_max
value: 30.4083
- type: nauc_ndcg_at_10_std
value: -0.9893000000000001
- type: nauc_ndcg_at_10_diff1
value: 41.2387
- type: nauc_ndcg_at_20_max
value: 30.5471
- type: nauc_ndcg_at_20_std
value: 0.05689999999999999
- type: nauc_ndcg_at_20_diff1
value: 40.8052
- type: nauc_ndcg_at_100_max
value: 30.791800000000002
- type: nauc_ndcg_at_100_std
value: 0.7147
- type: nauc_ndcg_at_100_diff1
value: 40.708
- type: nauc_ndcg_at_1000_max
value: 31.7174
- type: nauc_ndcg_at_1000_std
value: 0.8226000000000001
- type: nauc_ndcg_at_1000_diff1
value: 41.6999
- type: nauc_map_at_1_max
value: 29.6273
- type: nauc_map_at_1_std
value: -6.8855
- type: nauc_map_at_1_diff1
value: 49.7534
- type: nauc_map_at_3_max
value: 30.6498
- type: nauc_map_at_3_std
value: -3.7261
- type: nauc_map_at_3_diff1
value: 45.5401
- type: nauc_map_at_5_max
value: 30.8948
- type: nauc_map_at_5_std
value: -3.0341
- type: nauc_map_at_5_diff1
value: 44.7017
- type: nauc_map_at_10_max
value: 30.538999999999998
- type: nauc_map_at_10_std
value: -2.8572
- type: nauc_map_at_10_diff1
value: 44.2979
- type: nauc_map_at_20_max
value: 30.5475
- type: nauc_map_at_20_std
value: -2.535
- type: nauc_map_at_20_diff1
value: 44.1459
- type: nauc_map_at_100_max
value: 30.6945
- type: nauc_map_at_100_std
value: -2.4573
- type: nauc_map_at_100_diff1
value: 44.1182
- type: nauc_map_at_1000_max
value: 30.7339
- type: nauc_map_at_1000_std
value: -2.4239
- type: nauc_map_at_1000_diff1
value: 44.147999999999996
- type: nauc_recall_at_1_max
value: 29.6273
- type: nauc_recall_at_1_std
value: -6.8855
- type: nauc_recall_at_1_diff1
value: 49.7534
- type: nauc_recall_at_3_max
value: 30.6914
- type: nauc_recall_at_3_std
value: -0.2006
- type: nauc_recall_at_3_diff1
value: 40.1871
- type: nauc_recall_at_5_max
value: 31.055300000000003
- type: nauc_recall_at_5_std
value: 2.3528000000000002
- type: nauc_recall_at_5_diff1
value: 36.0852
- type: nauc_recall_at_10_max
value: 27.7266
- type: nauc_recall_at_10_std
value: 3.3422
- type: nauc_recall_at_10_diff1
value: 32.073800000000006
- type: nauc_recall_at_20_max
value: 27.4648
- type: nauc_recall_at_20_std
value: 7.5625
- type: nauc_recall_at_20_diff1
value: 29.567100000000003
- type: nauc_recall_at_100_max
value: 26.152199999999997
- type: nauc_recall_at_100_std
value: 15.0121
- type: nauc_recall_at_100_diff1
value: 24.9364
- type: nauc_recall_at_1000_max
value: 41.4023
- type: nauc_recall_at_1000_std
value: 30.557299999999998
- type: nauc_recall_at_1000_diff1
value: 32.1092
- type: nauc_precision_at_1_max
value: 31.811099999999996
- type: nauc_precision_at_1_std
value: -4.5333
- type: nauc_precision_at_1_diff1
value: 48.3941
- type: nauc_precision_at_3_max
value: 33.0304
- type: nauc_precision_at_3_std
value: 2.4003
- type: nauc_precision_at_3_diff1
value: 36.2318
- type: nauc_precision_at_5_max
value: 32.257000000000005
- type: nauc_precision_at_5_std
value: 5.0698
- type: nauc_precision_at_5_diff1
value: 31.707800000000002
- type: nauc_precision_at_10_max
value: 27.735599999999998
- type: nauc_precision_at_10_std
value: 6.1906
- type: nauc_precision_at_10_diff1
value: 26.072
- type: nauc_precision_at_20_max
value: 27.5381
- type: nauc_precision_at_20_std
value: 10.1923
- type: nauc_precision_at_20_diff1
value: 21.3019
- type: nauc_precision_at_100_max
value: 21.9208
- type: nauc_precision_at_100_std
value: 14.4338
- type: nauc_precision_at_100_diff1
value: 9.198
- type: nauc_precision_at_1000_max
value: 19.8643
- type: nauc_precision_at_1000_std
value: 15.779499999999999
- type: nauc_precision_at_1000_diff1
value: -1.2106999999999999
- type: nauc_mrr_at_1_max
value: 31.811099999999996
- type: nauc_mrr_at_1_std
value: -4.5333
- type: nauc_mrr_at_1_diff1
value: 48.3941
- type: nauc_mrr_at_3_max
value: 31.6626
- type: nauc_mrr_at_3_std
value: -2.1915
- type: nauc_mrr_at_3_diff1
value: 44.190400000000004
- type: nauc_mrr_at_5_max
value: 31.9004
- type: nauc_mrr_at_5_std
value: -1.7576
- type: nauc_mrr_at_5_diff1
value: 43.3956
- type: nauc_mrr_at_10_max
value: 31.572899999999997
- type: nauc_mrr_at_10_std
value: -1.6476000000000002
- type: nauc_mrr_at_10_diff1
value: 42.9418
- type: nauc_mrr_at_20_max
value: 31.764599999999998
- type: nauc_mrr_at_20_std
value: -1.3288
- type: nauc_mrr_at_20_diff1
value: 42.9203
- type: nauc_mrr_at_100_max
value: 31.7058
- type: nauc_mrr_at_100_std
value: -1.3098999999999998
- type: nauc_mrr_at_100_diff1
value: 42.9097
- type: nauc_mrr_at_1000_max
value: 31.7363
- type: nauc_mrr_at_1000_std
value: -1.2968
- type: nauc_mrr_at_1000_diff1
value: 42.951899999999995
- type: main_score
value: 38.182
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval (default)
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: ndcg_at_1
value: 19.776
- type: ndcg_at_3
value: 23.959
- type: ndcg_at_5
value: 26.064
- type: ndcg_at_10
value: 28.797
- type: ndcg_at_20
value: 30.419
- type: ndcg_at_100
value: 34.009
- type: ndcg_at_1000
value: 37.098
- type: map_at_1
value: 15.931999999999999
- type: map_at_3
value: 21.044999999999998
- type: map_at_5
value: 22.381
- type: map_at_10
value: 23.595
- type: map_at_20
value: 24.065
- type: map_at_100
value: 24.606
- type: map_at_1000
value: 24.728
- type: recall_at_1
value: 15.931999999999999
- type: recall_at_3
value: 27.051
- type: recall_at_5
value: 32.293
- type: recall_at_10
value: 40.399
- type: recall_at_20
value: 46.335
- type: recall_at_100
value: 63.855
- type: recall_at_1000
value: 86.06099999999999
- type: precision_at_1
value: 19.776
- type: precision_at_3
value: 11.526
- type: precision_at_5
value: 8.483
- type: precision_at_10
value: 5.398
- type: precision_at_20
value: 3.147
- type: precision_at_100
value: 0.9199999999999999
- type: precision_at_1000
value: 0.133
- type: mrr_at_1
value: 19.7761
- type: mrr_at_3
value: 25.580399999999997
- type: mrr_at_5
value: 26.9113
- type: mrr_at_10
value: 28.121499999999997
- type: mrr_at_20
value: 28.5441
- type: mrr_at_100
value: 28.9649
- type: mrr_at_1000
value: 29.0362
- type: nauc_ndcg_at_1_max
value: 16.1721
- type: nauc_ndcg_at_1_std
value: -5.8922
- type: nauc_ndcg_at_1_diff1
value: 32.987899999999996
- type: nauc_ndcg_at_3_max
value: 16.3184
- type: nauc_ndcg_at_3_std
value: -2.3258
- type: nauc_ndcg_at_3_diff1
value: 30.2222
- type: nauc_ndcg_at_5_max
value: 14.013900000000001
- type: nauc_ndcg_at_5_std
value: -2.0383
- type: nauc_ndcg_at_5_diff1
value: 29.444799999999997
- type: nauc_ndcg_at_10_max
value: 13.4159
- type: nauc_ndcg_at_10_std
value: -2.1247
- type: nauc_ndcg_at_10_diff1
value: 29.035300000000003
- type: nauc_ndcg_at_20_max
value: 13.4454
- type: nauc_ndcg_at_20_std
value: -1.7042000000000002
- type: nauc_ndcg_at_20_diff1
value: 29.136699999999998
- type: nauc_ndcg_at_100_max
value: 14.585600000000001
- type: nauc_ndcg_at_100_std
value: 0.9915999999999999
- type: nauc_ndcg_at_100_diff1
value: 28.419
- type: nauc_ndcg_at_1000_max
value: 14.2089
- type: nauc_ndcg_at_1000_std
value: 0.198
- type: nauc_ndcg_at_1000_diff1
value: 28.349000000000004
- type: nauc_map_at_1_max
value: 13.081499999999998
- type: nauc_map_at_1_std
value: -5.5374
- type: nauc_map_at_1_diff1
value: 33.6615
- type: nauc_map_at_3_max
value: 14.213600000000001
- type: nauc_map_at_3_std
value: -2.8775
- type: nauc_map_at_3_diff1
value: 30.8491
- type: nauc_map_at_5_max
value: 13.004
- type: nauc_map_at_5_std
value: -3.0094
- type: nauc_map_at_5_diff1
value: 30.298799999999996
- type: nauc_map_at_10_max
value: 12.9029
- type: nauc_map_at_10_std
value: -3.0807
- type: nauc_map_at_10_diff1
value: 30.126599999999996
- type: nauc_map_at_20_max
value: 12.9461
- type: nauc_map_at_20_std
value: -2.9581
- type: nauc_map_at_20_diff1
value: 30.134499999999996
- type: nauc_map_at_100_max
value: 13.1359
- type: nauc_map_at_100_std
value: -2.5017
- type: nauc_map_at_100_diff1
value: 30.018299999999996
- type: nauc_map_at_1000_max
value: 13.1193
- type: nauc_map_at_1000_std
value: -2.5128999999999997
- type: nauc_map_at_1000_diff1
value: 30.0067
- type: nauc_recall_at_1_max
value: 13.081499999999998
- type: nauc_recall_at_1_std
value: -5.5374
- type: nauc_recall_at_1_diff1
value: 33.6615
- type: nauc_recall_at_3_max
value: 16.5062
- type: nauc_recall_at_3_std
value: 0.5196000000000001
- type: nauc_recall_at_3_diff1
value: 27.553299999999997
- type: nauc_recall_at_5_max
value: 12.1851
- type: nauc_recall_at_5_std
value: 0.3195
- type: nauc_recall_at_5_diff1
value: 26.190799999999996
- type: nauc_recall_at_10_max
value: 10.595699999999999
- type: nauc_recall_at_10_std
value: -0.16169999999999998
- type: nauc_recall_at_10_diff1
value: 24.6259
- type: nauc_recall_at_20_max
value: 10.2497
- type: nauc_recall_at_20_std
value: 1.2119
- type: nauc_recall_at_20_diff1
value: 24.3161
- type: nauc_recall_at_100_max
value: 14.849499999999999
- type: nauc_recall_at_100_std
value: 15.209200000000001
- type: nauc_recall_at_100_diff1
value: 20.0322
- type: nauc_recall_at_1000_max
value: 10.678
- type: nauc_recall_at_1000_std
value: 19.6415
- type: nauc_recall_at_1000_diff1
value: 12.146899999999999
- type: nauc_precision_at_1_max
value: 16.1721
- type: nauc_precision_at_1_std
value: -5.8922
- type: nauc_precision_at_1_diff1
value: 32.987899999999996
- type: nauc_precision_at_3_max
value: 19.988
- type: nauc_precision_at_3_std
value: -2.574
- type: nauc_precision_at_3_diff1
value: 26.9007
- type: nauc_precision_at_5_max
value: 14.5492
- type: nauc_precision_at_5_std
value: -1.1918
- type: nauc_precision_at_5_diff1
value: 23.2059
- type: nauc_precision_at_10_max
value: 13.595099999999999
- type: nauc_precision_at_10_std
value: -0.9585
- type: nauc_precision_at_10_diff1
value: 21.063200000000002
- type: nauc_precision_at_20_max
value: 13.4271
- type: nauc_precision_at_20_std
value: 0.5092
- type: nauc_precision_at_20_diff1
value: 20.332
- type: nauc_precision_at_100_max
value: 14.5833
- type: nauc_precision_at_100_std
value: 9.581199999999999
- type: nauc_precision_at_100_diff1
value: 9.8307
- type: nauc_precision_at_1000_max
value: 4.9234
- type: nauc_precision_at_1000_std
value: 1.3542
- type: nauc_precision_at_1000_diff1
value: -1.6771999999999998
- type: nauc_mrr_at_1_max
value: 16.1721
- type: nauc_mrr_at_1_std
value: -5.8922
- type: nauc_mrr_at_1_diff1
value: 32.987899999999996
- type: nauc_mrr_at_3_max
value: 17.651
- type: nauc_mrr_at_3_std
value: -3.3937000000000004
- type: nauc_mrr_at_3_diff1
value: 30.067300000000003
- type: nauc_mrr_at_5_max
value: 16.7811
- type: nauc_mrr_at_5_std
value: -2.9766999999999997
- type: nauc_mrr_at_5_diff1
value: 30.125600000000002
- type: nauc_mrr_at_10_max
value: 16.5277
- type: nauc_mrr_at_10_std
value: -3.0048
- type: nauc_mrr_at_10_diff1
value: 30.010399999999997
- type: nauc_mrr_at_20_max
value: 16.470299999999998
- type: nauc_mrr_at_20_std
value: -2.9478
- type: nauc_mrr_at_20_diff1
value: 29.988
- type: nauc_mrr_at_100_max
value: 16.5707
- type: nauc_mrr_at_100_std
value: -2.7508
- type: nauc_mrr_at_100_diff1
value: 29.945100000000004
- type: nauc_mrr_at_1000_max
value: 16.5535
- type: nauc_mrr_at_1000_std
value: -2.7803
- type: nauc_mrr_at_1000_diff1
value: 29.948399999999996
- type: main_score
value: 28.797
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval (default)
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: ndcg_at_1
value: 36.574
- type: ndcg_at_3
value: 41.352
- type: ndcg_at_5
value: 44.012
- type: ndcg_at_10
value: 46.841
- type: ndcg_at_20
value: 48.933
- type: ndcg_at_100
value: 52.336000000000006
- type: ndcg_at_1000
value: 54.337
- type: map_at_1
value: 29.968
- type: map_at_3
value: 37.165
- type: map_at_5
value: 39.113
- type: map_at_10
value: 40.58
- type: map_at_20
value: 41.321999999999996
- type: map_at_100
value: 41.914
- type: map_at_1000
value: 42.028999999999996
- type: recall_at_1
value: 29.968
- type: recall_at_3
value: 44.605
- type: recall_at_5
value: 51.426
- type: recall_at_10
value: 59.614999999999995
- type: recall_at_20
value: 66.964
- type: recall_at_100
value: 82.943
- type: recall_at_1000
value: 95.76599999999999
- type: precision_at_1
value: 36.574
- type: precision_at_3
value: 19.442
- type: precision_at_5
value: 13.936000000000002
- type: precision_at_10
value: 8.566
- type: precision_at_20
value: 4.981
- type: precision_at_100
value: 1.3299999999999998
- type: precision_at_1000
value: 0.168
- type: mrr_at_1
value: 36.5736
- type: mrr_at_3
value: 43.7279
- type: mrr_at_5
value: 45.2679
- type: mrr_at_10
value: 46.380900000000004
- type: mrr_at_20
value: 46.8005
- type: mrr_at_100
value: 47.1448
- type: mrr_at_1000
value: 47.1883
- type: nauc_ndcg_at_1_max
value: 35.397400000000005
- type: nauc_ndcg_at_1_std
value: 4.6015
- type: nauc_ndcg_at_1_diff1
value: 49.0112
- type: nauc_ndcg_at_3_max
value: 34.543400000000005
- type: nauc_ndcg_at_3_std
value: 3.5360000000000005
- type: nauc_ndcg_at_3_diff1
value: 47.3852
- type: nauc_ndcg_at_5_max
value: 33.3912
- type: nauc_ndcg_at_5_std
value: 3.2248
- type: nauc_ndcg_at_5_diff1
value: 46.7688
- type: nauc_ndcg_at_10_max
value: 33.1062
- type: nauc_ndcg_at_10_std
value: 3.5458000000000003
- type: nauc_ndcg_at_10_diff1
value: 47.2397
- type: nauc_ndcg_at_20_max
value: 33.7566
- type: nauc_ndcg_at_20_std
value: 4.9054
- type: nauc_ndcg_at_20_diff1
value: 46.866
- type: nauc_ndcg_at_100_max
value: 34.9426
- type: nauc_ndcg_at_100_std
value: 6.7859
- type: nauc_ndcg_at_100_diff1
value: 47.2036
- type: nauc_ndcg_at_1000_max
value: 35.1984
- type: nauc_ndcg_at_1000_std
value: 6.3584000000000005
- type: nauc_ndcg_at_1000_diff1
value: 47.3887
- type: nauc_map_at_1_max
value: 34.4419
- type: nauc_map_at_1_std
value: 0.5319
- type: nauc_map_at_1_diff1
value: 52.832100000000004
- type: nauc_map_at_3_max
value: 34.4595
- type: nauc_map_at_3_std
value: 2.6957
- type: nauc_map_at_3_diff1
value: 49.0352
- type: nauc_map_at_5_max
value: 34.0602
- type: nauc_map_at_5_std
value: 2.8001
- type: nauc_map_at_5_diff1
value: 48.3502
- type: nauc_map_at_10_max
value: 34.1422
- type: nauc_map_at_10_std
value: 3.1277
- type: nauc_map_at_10_diff1
value: 48.6296
- type: nauc_map_at_20_max
value: 34.3693
- type: nauc_map_at_20_std
value: 3.5783
- type: nauc_map_at_20_diff1
value: 48.4885
- type: nauc_map_at_100_max
value: 34.5478
- type: nauc_map_at_100_std
value: 3.9373
- type: nauc_map_at_100_diff1
value: 48.5106
- type: nauc_map_at_1000_max
value: 34.578199999999995
- type: nauc_map_at_1000_std
value: 3.9463999999999997
- type: nauc_map_at_1000_diff1
value: 48.5252
- type: nauc_recall_at_1_max
value: 34.4419
- type: nauc_recall_at_1_std
value: 0.5319
- type: nauc_recall_at_1_diff1
value: 52.832100000000004
- type: nauc_recall_at_3_max
value: 31.4866
- type: nauc_recall_at_3_std
value: 2.1579
- type: nauc_recall_at_3_diff1
value: 44.498599999999996
- type: nauc_recall_at_5_max
value: 29.140500000000003
- type: nauc_recall_at_5_std
value: 1.9796
- type: nauc_recall_at_5_diff1
value: 42.5088
- type: nauc_recall_at_10_max
value: 27.3464
- type: nauc_recall_at_10_std
value: 3.1574
- type: nauc_recall_at_10_diff1
value: 42.7357
- type: nauc_recall_at_20_max
value: 29.177599999999998
- type: nauc_recall_at_20_std
value: 8.4122
- type: nauc_recall_at_20_diff1
value: 40.671600000000005
- type: nauc_recall_at_100_max
value: 37.0171
- type: nauc_recall_at_100_std
value: 24.6492
- type: nauc_recall_at_100_diff1
value: 41.125099999999996
- type: nauc_recall_at_1000_max
value: 60.5939
- type: nauc_recall_at_1000_std
value: 47.818
- type: nauc_recall_at_1000_diff1
value: 49.6035
- type: nauc_precision_at_1_max
value: 35.397400000000005
- type: nauc_precision_at_1_std
value: 4.6015
- type: nauc_precision_at_1_diff1
value: 49.0112
- type: nauc_precision_at_3_max
value: 30.735
- type: nauc_precision_at_3_std
value: 8.8247
- type: nauc_precision_at_3_diff1
value: 33.8511
- type: nauc_precision_at_5_max
value: 24.2405
- type: nauc_precision_at_5_std
value: 7.904700000000001
- type: nauc_precision_at_5_diff1
value: 24.8322
- type: nauc_precision_at_10_max
value: 18.9833
- type: nauc_precision_at_10_std
value: 10.700700000000001
- type: nauc_precision_at_10_diff1
value: 16.3075
- type: nauc_precision_at_20_max
value: 16.267200000000003
- type: nauc_precision_at_20_std
value: 14.3353
- type: nauc_precision_at_20_diff1
value: 8.6847
- type: nauc_precision_at_100_max
value: 8.9435
- type: nauc_precision_at_100_std
value: 18.9022
- type: nauc_precision_at_100_diff1
value: -4.2718
- type: nauc_precision_at_1000_max
value: -1.4000000000000001
- type: nauc_precision_at_1000_std
value: 11.3122
- type: nauc_precision_at_1000_diff1
value: -15.9384
- type: nauc_mrr_at_1_max
value: 35.397400000000005
- type: nauc_mrr_at_1_std
value: 4.6015
- type: nauc_mrr_at_1_diff1
value: 49.0112
- type: nauc_mrr_at_3_max
value: 34.3109
- type: nauc_mrr_at_3_std
value: 4.2108
- type: nauc_mrr_at_3_diff1
value: 45.9716
- type: nauc_mrr_at_5_max
value: 33.9505
- type: nauc_mrr_at_5_std
value: 4.3084999999999996
- type: nauc_mrr_at_5_diff1
value: 45.8489
- type: nauc_mrr_at_10_max
value: 33.7849
- type: nauc_mrr_at_10_std
value: 4.3694999999999995
- type: nauc_mrr_at_10_diff1
value: 45.9683
- type: nauc_mrr_at_20_max
value: 33.9195
- type: nauc_mrr_at_20_std
value: 4.5717
- type: nauc_mrr_at_20_diff1
value: 45.9383
- type: nauc_mrr_at_100_max
value: 34.0208
- type: nauc_mrr_at_100_std
value: 4.6641
- type: nauc_mrr_at_100_diff1
value: 45.9972
- type: nauc_mrr_at_1000_max
value: 34.030899999999995
- type: nauc_mrr_at_1000_std
value: 4.6481
- type: nauc_mrr_at_1000_diff1
value: 46.0101
- type: main_score
value: 46.841
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval (default)
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: ndcg_at_1
value: 29.909000000000002
- type: ndcg_at_3
value: 34.832
- type: ndcg_at_5
value: 37.38
- type: ndcg_at_10
value: 40.455000000000005
- type: ndcg_at_20
value: 42.753
- type: ndcg_at_100
value: 46.306000000000004
- type: ndcg_at_1000
value: 48.477
- type: map_at_1
value: 24.757
- type: map_at_3
value: 31.167
- type: map_at_5
value: 32.991
- type: map_at_10
value: 34.516999999999996
- type: map_at_20
value: 35.281
- type: map_at_100
value: 35.892
- type: map_at_1000
value: 36.001
- type: recall_at_1
value: 24.757
- type: recall_at_3
value: 37.57
- type: recall_at_5
value: 44.509
- type: recall_at_10
value: 53.425
- type: recall_at_20
value: 61.53999999999999
- type: recall_at_100
value: 78.608
- type: recall_at_1000
value: 93.252
- type: precision_at_1
value: 29.909000000000002
- type: precision_at_3
value: 16.781
- type: precision_at_5
value: 12.123000000000001
- type: precision_at_10
value: 7.637
- type: precision_at_20
value: 4.572
- type: precision_at_100
value: 1.237
- type: precision_at_1000
value: 0.16
- type: mrr_at_1
value: 29.9087
- type: mrr_at_3
value: 36.2633
- type: mrr_at_5
value: 37.918600000000005
- type: mrr_at_10
value: 39.1135
- type: mrr_at_20
value: 39.6487
- type: mrr_at_100
value: 40.0223
- type: mrr_at_1000
value: 40.070699999999995
- type: nauc_ndcg_at_1_max
value: 36.7468
- type: nauc_ndcg_at_1_std
value: -3.3917
- type: nauc_ndcg_at_1_diff1
value: 46.2004
- type: nauc_ndcg_at_3_max
value: 37.101299999999995
- type: nauc_ndcg_at_3_std
value: -1.1094
- type: nauc_ndcg_at_3_diff1
value: 42.3016
- type: nauc_ndcg_at_5_max
value: 36.6815
- type: nauc_ndcg_at_5_std
value: -0.6321
- type: nauc_ndcg_at_5_diff1
value: 40.8809
- type: nauc_ndcg_at_10_max
value: 36.2424
- type: nauc_ndcg_at_10_std
value: 0.117
- type: nauc_ndcg_at_10_diff1
value: 39.6866
- type: nauc_ndcg_at_20_max
value: 37.0028
- type: nauc_ndcg_at_20_std
value: 1.4393
- type: nauc_ndcg_at_20_diff1
value: 39.170500000000004
- type: nauc_ndcg_at_100_max
value: 37.8882
- type: nauc_ndcg_at_100_std
value: 3.2571000000000003
- type: nauc_ndcg_at_100_diff1
value: 38.8638
- type: nauc_ndcg_at_1000_max
value: 37.688100000000006
- type: nauc_ndcg_at_1000_std
value: 2.979
- type: nauc_ndcg_at_1000_diff1
value: 39.3477
- type: nauc_map_at_1_max
value: 29.072
- type: nauc_map_at_1_std
value: -7.756
- type: nauc_map_at_1_diff1
value: 45.273
- type: nauc_map_at_3_max
value: 34.4972
- type: nauc_map_at_3_std
value: -3.5662
- type: nauc_map_at_3_diff1
value: 43.344
- type: nauc_map_at_5_max
value: 34.9333
- type: nauc_map_at_5_std
value: -2.7205
- type: nauc_map_at_5_diff1
value: 42.2802
- type: nauc_map_at_10_max
value: 35.0349
- type: nauc_map_at_10_std
value: -2.1576
- type: nauc_map_at_10_diff1
value: 41.7284
- type: nauc_map_at_20_max
value: 35.3941
- type: nauc_map_at_20_std
value: -1.7111999999999998
- type: nauc_map_at_20_diff1
value: 41.5433
- type: nauc_map_at_100_max
value: 35.6879
- type: nauc_map_at_100_std
value: -1.2807000000000002
- type: nauc_map_at_100_diff1
value: 41.52
- type: nauc_map_at_1000_max
value: 35.686800000000005
- type: nauc_map_at_1000_std
value: -1.2548
- type: nauc_map_at_1000_diff1
value: 41.5394
- type: nauc_recall_at_1_max
value: 29.072
- type: nauc_recall_at_1_std
value: -7.756
- type: nauc_recall_at_1_diff1
value: 45.273
- type: nauc_recall_at_3_max
value: 35.4112
- type: nauc_recall_at_3_std
value: -1.7929
- type: nauc_recall_at_3_diff1
value: 39.5779
- type: nauc_recall_at_5_max
value: 34.794799999999995
- type: nauc_recall_at_5_std
value: 0.6404
- type: nauc_recall_at_5_diff1
value: 35.280699999999996
- type: nauc_recall_at_10_max
value: 33.48
- type: nauc_recall_at_10_std
value: 3.2202
- type: nauc_recall_at_10_diff1
value: 31.8004
- type: nauc_recall_at_20_max
value: 35.2323
- type: nauc_recall_at_20_std
value: 8.058800000000002
- type: nauc_recall_at_20_diff1
value: 29.3045
- type: nauc_recall_at_100_max
value: 38.379799999999996
- type: nauc_recall_at_100_std
value: 22.2222
- type: nauc_recall_at_100_diff1
value: 22.766000000000002
- type: nauc_recall_at_1000_max
value: 41.457699999999996
- type: nauc_recall_at_1000_std
value: 46.3163
- type: nauc_recall_at_1000_diff1
value: 18.932199999999998
- type: nauc_precision_at_1_max
value: 36.7468
- type: nauc_precision_at_1_std
value: -3.3917
- type: nauc_precision_at_1_diff1
value: 46.2004
- type: nauc_precision_at_3_max
value: 41.9047
- type: nauc_precision_at_3_std
value: 8.6797
- type: nauc_precision_at_3_diff1
value: 32.4061
- type: nauc_precision_at_5_max
value: 40.6237
- type: nauc_precision_at_5_std
value: 12.5406
- type: nauc_precision_at_5_diff1
value: 25.5173
- type: nauc_precision_at_10_max
value: 33.4099
- type: nauc_precision_at_10_std
value: 13.926
- type: nauc_precision_at_10_diff1
value: 16.3236
- type: nauc_precision_at_20_max
value: 31.9979
- type: nauc_precision_at_20_std
value: 17.2255
- type: nauc_precision_at_20_diff1
value: 10.746
- type: nauc_precision_at_100_max
value: 22.994500000000002
- type: nauc_precision_at_100_std
value: 22.8105
- type: nauc_precision_at_100_diff1
value: -0.8222999999999999
- type: nauc_precision_at_1000_max
value: 7.4085
- type: nauc_precision_at_1000_std
value: 13.9769
- type: nauc_precision_at_1000_diff1
value: -7.2029
- type: nauc_mrr_at_1_max
value: 36.7468
- type: nauc_mrr_at_1_std
value: -3.3917
- type: nauc_mrr_at_1_diff1
value: 46.2004
- type: nauc_mrr_at_3_max
value: 39.062599999999996
- type: nauc_mrr_at_3_std
value: 0.013200000000000002
- type: nauc_mrr_at_3_diff1
value: 42.774699999999996
- type: nauc_mrr_at_5_max
value: 39.0588
- type: nauc_mrr_at_5_std
value: 0.8562000000000001
- type: nauc_mrr_at_5_diff1
value: 41.9476
- type: nauc_mrr_at_10_max
value: 38.8292
- type: nauc_mrr_at_10_std
value: 1.0338999999999998
- type: nauc_mrr_at_10_diff1
value: 41.5618
- type: nauc_mrr_at_20_max
value: 38.8348
- type: nauc_mrr_at_20_std
value: 1.2061
- type: nauc_mrr_at_20_diff1
value: 41.548
- type: nauc_mrr_at_100_max
value: 38.8295
- type: nauc_mrr_at_100_std
value: 1.1925
- type: nauc_mrr_at_100_diff1
value: 41.6431
- type: nauc_mrr_at_1000_max
value: 38.8206
- type: nauc_mrr_at_1000_std
value: 1.1844999999999999
- type: nauc_mrr_at_1000_diff1
value: 41.6578
- type: main_score
value: 40.455000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval (default)
type: mteb/cqadupstack-retrieval
config: default
split: test
revision: CQADupstackRetrieval_is_a_combined_dataset
metrics:
- type: main_score
value: 38.678416666666664
- type: ndcg_at_10
value: 38.678416666666664
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval (default)
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: ndcg_at_1
value: 24.847
- type: ndcg_at_3
value: 29.369
- type: ndcg_at_5
value: 31.563999999999997
- type: ndcg_at_10
value: 33.588
- type: ndcg_at_20
value: 35.598
- type: ndcg_at_100
value: 38.543
- type: ndcg_at_1000
value: 41.167
- type: map_at_1
value: 22.042
- type: map_at_3
value: 27.016000000000002
- type: map_at_5
value: 28.369
- type: map_at_10
value: 29.308
- type: map_at_20
value: 29.897000000000002
- type: map_at_100
value: 30.316
- type: map_at_1000
value: 30.416999999999998
- type: recall_at_1
value: 22.042
- type: recall_at_3
value: 32.686
- type: recall_at_5
value: 38.044
- type: recall_at_10
value: 44.028
- type: recall_at_20
value: 51.576
- type: recall_at_100
value: 66.611
- type: recall_at_1000
value: 86.054
- type: precision_at_1
value: 24.847
- type: precision_at_3
value: 12.628
- type: precision_at_5
value: 9.017999999999999
- type: precision_at_10
value: 5.367999999999999
- type: precision_at_20
value: 3.175
- type: precision_at_100
value: 0.84
- type: precision_at_1000
value: 0.116
- type: mrr_at_1
value: 24.8466
- type: mrr_at_3
value: 29.856899999999996
- type: mrr_at_5
value: 31.198900000000002
- type: mrr_at_10
value: 31.9986
- type: mrr_at_20
value: 32.5373
- type: mrr_at_100
value: 32.920500000000004
- type: mrr_at_1000
value: 32.99
- type: nauc_ndcg_at_1_max
value: 35.3991
- type: nauc_ndcg_at_1_std
value: 7.4666
- type: nauc_ndcg_at_1_diff1
value: 62.871500000000005
- type: nauc_ndcg_at_3_max
value: 33.2542
- type: nauc_ndcg_at_3_std
value: 6.0760000000000005
- type: nauc_ndcg_at_3_diff1
value: 54.038
- type: nauc_ndcg_at_5_max
value: 33.4106
- type: nauc_ndcg_at_5_std
value: 8.0913
- type: nauc_ndcg_at_5_diff1
value: 53.3581
- type: nauc_ndcg_at_10_max
value: 34.342800000000004
- type: nauc_ndcg_at_10_std
value: 8.7164
- type: nauc_ndcg_at_10_diff1
value: 52.797700000000006
- type: nauc_ndcg_at_20_max
value: 34.703
- type: nauc_ndcg_at_20_std
value: 10.3363
- type: nauc_ndcg_at_20_diff1
value: 51.7927
- type: nauc_ndcg_at_100_max
value: 34.408
- type: nauc_ndcg_at_100_std
value: 11.4848
- type: nauc_ndcg_at_100_diff1
value: 50.708
- type: nauc_ndcg_at_1000_max
value: 34.8598
- type: nauc_ndcg_at_1000_std
value: 11.9612
- type: nauc_ndcg_at_1000_diff1
value: 51.497899999999994
- type: nauc_map_at_1_max
value: 34.5063
- type: nauc_map_at_1_std
value: 4.4961
- type: nauc_map_at_1_diff1
value: 64.782
- type: nauc_map_at_3_max
value: 33.4219
- type: nauc_map_at_3_std
value: 5.0572
- type: nauc_map_at_3_diff1
value: 56.918800000000005
- type: nauc_map_at_5_max
value: 33.7034
- type: nauc_map_at_5_std
value: 6.462700000000001
- type: nauc_map_at_5_diff1
value: 56.3771
- type: nauc_map_at_10_max
value: 34.279900000000005
- type: nauc_map_at_10_std
value: 7.008699999999999
- type: nauc_map_at_10_diff1
value: 56.1832
- type: nauc_map_at_20_max
value: 34.3794
- type: nauc_map_at_20_std
value: 7.474500000000001
- type: nauc_map_at_20_diff1
value: 55.8517
- type: nauc_map_at_100_max
value: 34.3464
- type: nauc_map_at_100_std
value: 7.639799999999999
- type: nauc_map_at_100_diff1
value: 55.66330000000001
- type: nauc_map_at_1000_max
value: 34.3893
- type: nauc_map_at_1000_std
value: 7.6875
- type: nauc_map_at_1000_diff1
value: 55.696999999999996
- type: nauc_recall_at_1_max
value: 34.5063
- type: nauc_recall_at_1_std
value: 4.4961
- type: nauc_recall_at_1_diff1
value: 64.782
- type: nauc_recall_at_3_max
value: 30.8728
- type: nauc_recall_at_3_std
value: 4.8788
- type: nauc_recall_at_3_diff1
value: 47.795
- type: nauc_recall_at_5_max
value: 31.211299999999998
- type: nauc_recall_at_5_std
value: 9.819700000000001
- type: nauc_recall_at_5_diff1
value: 45.614
- type: nauc_recall_at_10_max
value: 33.2451
- type: nauc_recall_at_10_std
value: 11.3511
- type: nauc_recall_at_10_diff1
value: 43.4298
- type: nauc_recall_at_20_max
value: 33.633
- type: nauc_recall_at_20_std
value: 16.7179
- type: nauc_recall_at_20_diff1
value: 39.0638
- type: nauc_recall_at_100_max
value: 30.8326
- type: nauc_recall_at_100_std
value: 24.501
- type: nauc_recall_at_100_diff1
value: 30.077399999999997
- type: nauc_recall_at_1000_max
value: 31.132900000000003
- type: nauc_recall_at_1000_std
value: 42.1105
- type: nauc_recall_at_1000_diff1
value: 22.4678
- type: nauc_precision_at_1_max
value: 35.3991
- type: nauc_precision_at_1_std
value: 7.4666
- type: nauc_precision_at_1_diff1
value: 62.871500000000005
- type: nauc_precision_at_3_max
value: 32.2855
- type: nauc_precision_at_3_std
value: 9.7582
- type: nauc_precision_at_3_diff1
value: 44.250299999999996
- type: nauc_precision_at_5_max
value: 32.7906
- type: nauc_precision_at_5_std
value: 16.1189
- type: nauc_precision_at_5_diff1
value: 41.9327
- type: nauc_precision_at_10_max
value: 33.9955
- type: nauc_precision_at_10_std
value: 17.7777
- type: nauc_precision_at_10_diff1
value: 36.0824
- type: nauc_precision_at_20_max
value: 33.5331
- type: nauc_precision_at_20_std
value: 22.729
- type: nauc_precision_at_20_diff1
value: 28.9461
- type: nauc_precision_at_100_max
value: 27.121000000000002
- type: nauc_precision_at_100_std
value: 26.1571
- type: nauc_precision_at_100_diff1
value: 15.1555
- type: nauc_precision_at_1000_max
value: 17.0259
- type: nauc_precision_at_1000_std
value: 21.2591
- type: nauc_precision_at_1000_diff1
value: 0.2408
- type: nauc_mrr_at_1_max
value: 35.3991
- type: nauc_mrr_at_1_std
value: 7.4666
- type: nauc_mrr_at_1_diff1
value: 62.871500000000005
- type: nauc_mrr_at_3_max
value: 34.0674
- type: nauc_mrr_at_3_std
value: 7.5811
- type: nauc_mrr_at_3_diff1
value: 55.435500000000005
- type: nauc_mrr_at_5_max
value: 34.0133
- type: nauc_mrr_at_5_std
value: 8.7651
- type: nauc_mrr_at_5_diff1
value: 54.8242
- type: nauc_mrr_at_10_max
value: 34.2033
- type: nauc_mrr_at_10_std
value: 8.6065
- type: nauc_mrr_at_10_diff1
value: 54.4752
- type: nauc_mrr_at_20_max
value: 34.3838
- type: nauc_mrr_at_20_std
value: 9.1144
- type: nauc_mrr_at_20_diff1
value: 54.2493
- type: nauc_mrr_at_100_max
value: 34.2967
- type: nauc_mrr_at_100_std
value: 9.2348
- type: nauc_mrr_at_100_diff1
value: 54.087799999999994
- type: nauc_mrr_at_1000_max
value: 34.3112
- type: nauc_mrr_at_1000_std
value: 9.243
- type: nauc_mrr_at_1000_diff1
value: 54.1208
- type: main_score
value: 33.588
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval (default)
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: ndcg_at_1
value: 19.236
- type: ndcg_at_3
value: 22.599
- type: ndcg_at_5
value: 24.137
- type: ndcg_at_10
value: 26.387
- type: ndcg_at_20
value: 28.353
- type: ndcg_at_100
value: 31.814999999999998
- type: ndcg_at_1000
value: 34.991
- type: map_at_1
value: 15.772
- type: map_at_3
value: 20.081
- type: map_at_5
value: 21.111
- type: map_at_10
value: 22.133
- type: map_at_20
value: 22.718
- type: map_at_100
value: 23.244
- type: map_at_1000
value: 23.375
- type: recall_at_1
value: 15.772
- type: recall_at_3
value: 24.944
- type: recall_at_5
value: 28.959000000000003
- type: recall_at_10
value: 35.768
- type: recall_at_20
value: 42.953
- type: recall_at_100
value: 60.209999999999994
- type: recall_at_1000
value: 83.035
- type: precision_at_1
value: 19.236
- type: precision_at_3
value: 10.622
- type: precision_at_5
value: 7.577
- type: precision_at_10
value: 4.7829999999999995
- type: precision_at_20
value: 2.968
- type: precision_at_100
value: 0.8920000000000001
- type: precision_at_1000
value: 0.134
- type: mrr_at_1
value: 19.2361
- type: mrr_at_3
value: 23.755399999999998
- type: mrr_at_5
value: 24.7448
- type: mrr_at_10
value: 25.7284
- type: mrr_at_20
value: 26.2892
- type: mrr_at_100
value: 26.7023
- type: mrr_at_1000
value: 26.787699999999997
- type: nauc_ndcg_at_1_max
value: 25.8189
- type: nauc_ndcg_at_1_std
value: -0.7723
- type: nauc_ndcg_at_1_diff1
value: 37.4223
- type: nauc_ndcg_at_3_max
value: 25.003999999999998
- type: nauc_ndcg_at_3_std
value: 0.047
- type: nauc_ndcg_at_3_diff1
value: 32.6399
- type: nauc_ndcg_at_5_max
value: 24.934700000000003
- type: nauc_ndcg_at_5_std
value: 0.2853
- type: nauc_ndcg_at_5_diff1
value: 31.622600000000002
- type: nauc_ndcg_at_10_max
value: 25.6266
- type: nauc_ndcg_at_10_std
value: 1.5631
- type: nauc_ndcg_at_10_diff1
value: 30.8794
- type: nauc_ndcg_at_20_max
value: 26.3898
- type: nauc_ndcg_at_20_std
value: 2.4745
- type: nauc_ndcg_at_20_diff1
value: 30.761300000000002
- type: nauc_ndcg_at_100_max
value: 26.292900000000003
- type: nauc_ndcg_at_100_std
value: 3.7591
- type: nauc_ndcg_at_100_diff1
value: 30.122100000000003
- type: nauc_ndcg_at_1000_max
value: 26.4123
- type: nauc_ndcg_at_1000_std
value: 4.2536
- type: nauc_ndcg_at_1000_diff1
value: 30.4018
- type: nauc_map_at_1_max
value: 26.0937
- type: nauc_map_at_1_std
value: -0.9603999999999999
- type: nauc_map_at_1_diff1
value: 40.326699999999995
- type: nauc_map_at_3_max
value: 25.079600000000003
- type: nauc_map_at_3_std
value: -0.1563
- type: nauc_map_at_3_diff1
value: 34.824
- type: nauc_map_at_5_max
value: 25.134800000000002
- type: nauc_map_at_5_std
value: -0.16590000000000002
- type: nauc_map_at_5_diff1
value: 34.082
- type: nauc_map_at_10_max
value: 25.4738
- type: nauc_map_at_10_std
value: 0.3806
- type: nauc_map_at_10_diff1
value: 33.6015
- type: nauc_map_at_20_max
value: 25.744699999999998
- type: nauc_map_at_20_std
value: 0.6495
- type: nauc_map_at_20_diff1
value: 33.5837
- type: nauc_map_at_100_max
value: 25.7512
- type: nauc_map_at_100_std
value: 0.8006
- type: nauc_map_at_100_diff1
value: 33.4639
- type: nauc_map_at_1000_max
value: 25.7618
- type: nauc_map_at_1000_std
value: 0.8451
- type: nauc_map_at_1000_diff1
value: 33.4469
- type: nauc_recall_at_1_max
value: 26.0937
- type: nauc_recall_at_1_std
value: -0.9603999999999999
- type: nauc_recall_at_1_diff1
value: 40.326699999999995
- type: nauc_recall_at_3_max
value: 23.5655
- type: nauc_recall_at_3_std
value: 1.5734000000000001
- type: nauc_recall_at_3_diff1
value: 28.773100000000003
- type: nauc_recall_at_5_max
value: 23.0476
- type: nauc_recall_at_5_std
value: 1.5559999999999998
- type: nauc_recall_at_5_diff1
value: 26.194
- type: nauc_recall_at_10_max
value: 24.497700000000002
- type: nauc_recall_at_10_std
value: 4.7022
- type: nauc_recall_at_10_diff1
value: 24.171
- type: nauc_recall_at_20_max
value: 26.168799999999997
- type: nauc_recall_at_20_std
value: 7.4726
- type: nauc_recall_at_20_diff1
value: 23.0682
- type: nauc_recall_at_100_max
value: 24.8448
- type: nauc_recall_at_100_std
value: 14.4567
- type: nauc_recall_at_100_diff1
value: 18.4698
- type: nauc_recall_at_1000_max
value: 25.9176
- type: nauc_recall_at_1000_std
value: 29.0789
- type: nauc_recall_at_1000_diff1
value: 14.382100000000001
- type: nauc_precision_at_1_max
value: 25.8189
- type: nauc_precision_at_1_std
value: -0.7723
- type: nauc_precision_at_1_diff1
value: 37.4223
- type: nauc_precision_at_3_max
value: 24.1539
- type: nauc_precision_at_3_std
value: 0.8337000000000001
- type: nauc_precision_at_3_diff1
value: 25.9882
- type: nauc_precision_at_5_max
value: 24.269299999999998
- type: nauc_precision_at_5_std
value: 1.4546999999999999
- type: nauc_precision_at_5_diff1
value: 23.069300000000002
- type: nauc_precision_at_10_max
value: 24.4338
- type: nauc_precision_at_10_std
value: 4.0008
- type: nauc_precision_at_10_diff1
value: 19.037000000000003
- type: nauc_precision_at_20_max
value: 24.928900000000002
- type: nauc_precision_at_20_std
value: 6.2217
- type: nauc_precision_at_20_diff1
value: 16.2922
- type: nauc_precision_at_100_max
value: 19.2407
- type: nauc_precision_at_100_std
value: 9.9782
- type: nauc_precision_at_100_diff1
value: 4.7276
- type: nauc_precision_at_1000_max
value: 12.422600000000001
- type: nauc_precision_at_1000_std
value: 9.030000000000001
- type: nauc_precision_at_1000_diff1
value: -5.3838
- type: nauc_mrr_at_1_max
value: 25.8189
- type: nauc_mrr_at_1_std
value: -0.7723
- type: nauc_mrr_at_1_diff1
value: 37.4223
- type: nauc_mrr_at_3_max
value: 24.999399999999998
- type: nauc_mrr_at_3_std
value: -0.3036
- type: nauc_mrr_at_3_diff1
value: 32.7559
- type: nauc_mrr_at_5_max
value: 25.020999999999997
- type: nauc_mrr_at_5_std
value: -0.149
- type: nauc_mrr_at_5_diff1
value: 32.2376
- type: nauc_mrr_at_10_max
value: 25.279600000000002
- type: nauc_mrr_at_10_std
value: 0.271
- type: nauc_mrr_at_10_diff1
value: 31.9357
- type: nauc_mrr_at_20_max
value: 25.517400000000002
- type: nauc_mrr_at_20_std
value: 0.5566
- type: nauc_mrr_at_20_diff1
value: 31.901200000000003
- type: nauc_mrr_at_100_max
value: 25.4772
- type: nauc_mrr_at_100_std
value: 0.6613
- type: nauc_mrr_at_100_diff1
value: 31.826900000000002
- type: nauc_mrr_at_1000_max
value: 25.468000000000004
- type: nauc_mrr_at_1000_std
value: 0.6685
- type: nauc_mrr_at_1000_diff1
value: 31.8495
- type: main_score
value: 26.387
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval (default)
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: ndcg_at_1
value: 26.866
- type: ndcg_at_3
value: 30.59
- type: ndcg_at_5
value: 33.08
- type: ndcg_at_10
value: 35.697
- type: ndcg_at_20
value: 37.697
- type: ndcg_at_100
value: 41.252
- type: ndcg_at_1000
value: 43.968
- type: map_at_1
value: 22.489
- type: map_at_3
value: 27.767999999999997
- type: map_at_5
value: 29.408
- type: map_at_10
value: 30.579
- type: map_at_20
value: 31.175000000000004
- type: map_at_100
value: 31.738
- type: map_at_1000
value: 31.852000000000004
- type: recall_at_1
value: 22.489
- type: recall_at_3
value: 33.635999999999996
- type: recall_at_5
value: 39.816
- type: recall_at_10
value: 47.61
- type: recall_at_20
value: 54.766000000000005
- type: recall_at_100
value: 71.944
- type: recall_at_1000
value: 91.229
- type: precision_at_1
value: 26.866
- type: precision_at_3
value: 13.930000000000001
- type: precision_at_5
value: 10.075000000000001
- type: precision_at_10
value: 6.0729999999999995
- type: precision_at_20
value: 3.61
- type: precision_at_100
value: 1.006
- type: precision_at_1000
value: 0.136
- type: mrr_at_1
value: 26.865699999999997
- type: mrr_at_3
value: 32.0585
- type: mrr_at_5
value: 33.4904
- type: mrr_at_10
value: 34.5912
- type: mrr_at_20
value: 35.094300000000004
- type: mrr_at_100
value: 35.5351
- type: mrr_at_1000
value: 35.6028
- type: nauc_ndcg_at_1_max
value: 41.288799999999995
- type: nauc_ndcg_at_1_std
value: -2.2298999999999998
- type: nauc_ndcg_at_1_diff1
value: 49.8265
- type: nauc_ndcg_at_3_max
value: 39.39
- type: nauc_ndcg_at_3_std
value: -0.0365
- type: nauc_ndcg_at_3_diff1
value: 46.2035
- type: nauc_ndcg_at_5_max
value: 38.6686
- type: nauc_ndcg_at_5_std
value: 0.1894
- type: nauc_ndcg_at_5_diff1
value: 44.4368
- type: nauc_ndcg_at_10_max
value: 38.3128
- type: nauc_ndcg_at_10_std
value: 1.8970999999999998
- type: nauc_ndcg_at_10_diff1
value: 44.303
- type: nauc_ndcg_at_20_max
value: 37.8206
- type: nauc_ndcg_at_20_std
value: 1.8249000000000002
- type: nauc_ndcg_at_20_diff1
value: 43.8219
- type: nauc_ndcg_at_100_max
value: 38.3774
- type: nauc_ndcg_at_100_std
value: 3.3640999999999996
- type: nauc_ndcg_at_100_diff1
value: 43.9134
- type: nauc_ndcg_at_1000_max
value: 39.1018
- type: nauc_ndcg_at_1000_std
value: 3.167
- type: nauc_ndcg_at_1000_diff1
value: 43.9295
- type: nauc_map_at_1_max
value: 40.1469
- type: nauc_map_at_1_std
value: -2.7226
- type: nauc_map_at_1_diff1
value: 52.3181
- type: nauc_map_at_3_max
value: 39.115100000000005
- type: nauc_map_at_3_std
value: -0.45199999999999996
- type: nauc_map_at_3_diff1
value: 48.0484
- type: nauc_map_at_5_max
value: 39.0963
- type: nauc_map_at_5_std
value: -0.17329999999999998
- type: nauc_map_at_5_diff1
value: 46.8174
- type: nauc_map_at_10_max
value: 38.9901
- type: nauc_map_at_10_std
value: 0.5842
- type: nauc_map_at_10_diff1
value: 46.7611
- type: nauc_map_at_20_max
value: 38.9159
- type: nauc_map_at_20_std
value: 0.5559999999999999
- type: nauc_map_at_20_diff1
value: 46.5794
- type: nauc_map_at_100_max
value: 39.0595
- type: nauc_map_at_100_std
value: 0.7881000000000001
- type: nauc_map_at_100_diff1
value: 46.5484
- type: nauc_map_at_1000_max
value: 39.0897
- type: nauc_map_at_1000_std
value: 0.7957000000000001
- type: nauc_map_at_1000_diff1
value: 46.5428
- type: nauc_recall_at_1_max
value: 40.1469
- type: nauc_recall_at_1_std
value: -2.7226
- type: nauc_recall_at_1_diff1
value: 52.3181
- type: nauc_recall_at_3_max
value: 36.7469
- type: nauc_recall_at_3_std
value: 0.9477
- type: nauc_recall_at_3_diff1
value: 43.125
- type: nauc_recall_at_5_max
value: 35.1646
- type: nauc_recall_at_5_std
value: 1.4531
- type: nauc_recall_at_5_diff1
value: 38.1625
- type: nauc_recall_at_10_max
value: 33.2965
- type: nauc_recall_at_10_std
value: 5.968
- type: nauc_recall_at_10_diff1
value: 37.3253
- type: nauc_recall_at_20_max
value: 30.6624
- type: nauc_recall_at_20_std
value: 5.8494
- type: nauc_recall_at_20_diff1
value: 35.4185
- type: nauc_recall_at_100_max
value: 31.283300000000004
- type: nauc_recall_at_100_std
value: 17.6584
- type: nauc_recall_at_100_diff1
value: 34.8031
- type: nauc_recall_at_1000_max
value: 42.3045
- type: nauc_recall_at_1000_std
value: 38.412800000000004
- type: nauc_recall_at_1000_diff1
value: 26.7818
- type: nauc_precision_at_1_max
value: 41.288799999999995
- type: nauc_precision_at_1_std
value: -2.2298999999999998
- type: nauc_precision_at_1_diff1
value: 49.8265
- type: nauc_precision_at_3_max
value: 37.9005
- type: nauc_precision_at_3_std
value: 3.1521
- type: nauc_precision_at_3_diff1
value: 36.1785
- type: nauc_precision_at_5_max
value: 35.1235
- type: nauc_precision_at_5_std
value: 4.1023
- type: nauc_precision_at_5_diff1
value: 29.325699999999998
- type: nauc_precision_at_10_max
value: 32.6961
- type: nauc_precision_at_10_std
value: 8.8151
- type: nauc_precision_at_10_diff1
value: 25.7135
- type: nauc_precision_at_20_max
value: 25.8708
- type: nauc_precision_at_20_std
value: 8.075899999999999
- type: nauc_precision_at_20_diff1
value: 18.407
- type: nauc_precision_at_100_max
value: 17.2159
- type: nauc_precision_at_100_std
value: 11.1057
- type: nauc_precision_at_100_diff1
value: 4.9951
- type: nauc_precision_at_1000_max
value: 3.8856
- type: nauc_precision_at_1000_std
value: 5.3964
- type: nauc_precision_at_1000_diff1
value: -11.1141
- type: nauc_mrr_at_1_max
value: 41.288799999999995
- type: nauc_mrr_at_1_std
value: -2.2298999999999998
- type: nauc_mrr_at_1_diff1
value: 49.8265
- type: nauc_mrr_at_3_max
value: 39.7658
- type: nauc_mrr_at_3_std
value: -1.0785
- type: nauc_mrr_at_3_diff1
value: 45.6847
- type: nauc_mrr_at_5_max
value: 39.5728
- type: nauc_mrr_at_5_std
value: -0.8420000000000001
- type: nauc_mrr_at_5_diff1
value: 44.6613
- type: nauc_mrr_at_10_max
value: 39.5053
- type: nauc_mrr_at_10_std
value: -0.11689999999999999
- type: nauc_mrr_at_10_diff1
value: 44.724000000000004
- type: nauc_mrr_at_20_max
value: 39.352
- type: nauc_mrr_at_20_std
value: -0.1751
- type: nauc_mrr_at_20_diff1
value: 44.5922
- type: nauc_mrr_at_100_max
value: 39.3906
- type: nauc_mrr_at_100_std
value: -0.0412
- type: nauc_mrr_at_100_diff1
value: 44.635999999999996
- type: nauc_mrr_at_1000_max
value: 39.4159
- type: nauc_mrr_at_1000_std
value: -0.0473
- type: nauc_mrr_at_1000_diff1
value: 44.6477
- type: main_score
value: 35.697
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval (default)
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: ndcg_at_1
value: 27.668
- type: ndcg_at_3
value: 32.812000000000005
- type: ndcg_at_5
value: 35.228
- type: ndcg_at_10
value: 37.551
- type: ndcg_at_20
value: 39.379
- type: ndcg_at_100
value: 43.596000000000004
- type: ndcg_at_1000
value: 46.114
- type: map_at_1
value: 22.32
- type: map_at_3
value: 28.563
- type: map_at_5
value: 30.282999999999998
- type: map_at_10
value: 31.544
- type: map_at_20
value: 32.295
- type: map_at_100
value: 33.145
- type: map_at_1000
value: 33.367999999999995
- type: recall_at_1
value: 22.32
- type: recall_at_3
value: 35.28
- type: recall_at_5
value: 41.701
- type: recall_at_10
value: 48.929
- type: recall_at_20
value: 55.809
- type: recall_at_100
value: 76.49000000000001
- type: recall_at_1000
value: 92.647
- type: precision_at_1
value: 27.668
- type: precision_at_3
value: 15.744
- type: precision_at_5
value: 11.779
- type: precision_at_10
value: 7.411
- type: precision_at_20
value: 4.654
- type: precision_at_100
value: 1.5630000000000002
- type: precision_at_1000
value: 0.242
- type: mrr_at_1
value: 27.668
- type: mrr_at_3
value: 33.860299999999995
- type: mrr_at_5
value: 35.4315
- type: mrr_at_10
value: 36.3724
- type: mrr_at_20
value: 36.8404
- type: mrr_at_100
value: 37.3207
- type: mrr_at_1000
value: 37.3797
- type: nauc_ndcg_at_1_max
value: 29.939799999999998
- type: nauc_ndcg_at_1_std
value: 3.3960999999999997
- type: nauc_ndcg_at_1_diff1
value: 50.718300000000006
- type: nauc_ndcg_at_3_max
value: 30.255100000000002
- type: nauc_ndcg_at_3_std
value: 7.4765999999999995
- type: nauc_ndcg_at_3_diff1
value: 44.6222
- type: nauc_ndcg_at_5_max
value: 29.791400000000003
- type: nauc_ndcg_at_5_std
value: 9.9377
- type: nauc_ndcg_at_5_diff1
value: 42.7502
- type: nauc_ndcg_at_10_max
value: 29.493399999999998
- type: nauc_ndcg_at_10_std
value: 9.3112
- type: nauc_ndcg_at_10_diff1
value: 43.3784
- type: nauc_ndcg_at_20_max
value: 30.200300000000002
- type: nauc_ndcg_at_20_std
value: 8.2095
- type: nauc_ndcg_at_20_diff1
value: 43.8137
- type: nauc_ndcg_at_100_max
value: 30.6938
- type: nauc_ndcg_at_100_std
value: 10.9702
- type: nauc_ndcg_at_100_diff1
value: 43.2695
- type: nauc_ndcg_at_1000_max
value: 31.0035
- type: nauc_ndcg_at_1000_std
value: 10.43
- type: nauc_ndcg_at_1000_diff1
value: 44.6603
- type: nauc_map_at_1_max
value: 28.7706
- type: nauc_map_at_1_std
value: -1.4021000000000001
- type: nauc_map_at_1_diff1
value: 53.6976
- type: nauc_map_at_3_max
value: 29.710700000000003
- type: nauc_map_at_3_std
value: 4.3148
- type: nauc_map_at_3_diff1
value: 47.586600000000004
- type: nauc_map_at_5_max
value: 29.4636
- type: nauc_map_at_5_std
value: 5.6241
- type: nauc_map_at_5_diff1
value: 46.0464
- type: nauc_map_at_10_max
value: 29.608400000000003
- type: nauc_map_at_10_std
value: 5.7526
- type: nauc_map_at_10_diff1
value: 45.942699999999995
- type: nauc_map_at_20_max
value: 29.878300000000003
- type: nauc_map_at_20_std
value: 5.900600000000001
- type: nauc_map_at_20_diff1
value: 46.0349
- type: nauc_map_at_100_max
value: 29.9908
- type: nauc_map_at_100_std
value: 6.7274
- type: nauc_map_at_100_diff1
value: 46.0149
- type: nauc_map_at_1000_max
value: 29.8265
- type: nauc_map_at_1000_std
value: 6.8384
- type: nauc_map_at_1000_diff1
value: 46.1011
- type: nauc_recall_at_1_max
value: 28.7706
- type: nauc_recall_at_1_std
value: -1.4021000000000001
- type: nauc_recall_at_1_diff1
value: 53.6976
- type: nauc_recall_at_3_max
value: 28.657700000000002
- type: nauc_recall_at_3_std
value: 9.058399999999999
- type: nauc_recall_at_3_diff1
value: 40.709
- type: nauc_recall_at_5_max
value: 26.9309
- type: nauc_recall_at_5_std
value: 13.569400000000002
- type: nauc_recall_at_5_diff1
value: 34.2241
- type: nauc_recall_at_10_max
value: 26.4271
- type: nauc_recall_at_10_std
value: 12.7339
- type: nauc_recall_at_10_diff1
value: 33.9447
- type: nauc_recall_at_20_max
value: 29.2512
- type: nauc_recall_at_20_std
value: 9.9774
- type: nauc_recall_at_20_diff1
value: 36.85
- type: nauc_recall_at_100_max
value: 30.4911
- type: nauc_recall_at_100_std
value: 29.9644
- type: nauc_recall_at_100_diff1
value: 29.4678
- type: nauc_recall_at_1000_max
value: 44.5434
- type: nauc_recall_at_1000_std
value: 45.6492
- type: nauc_recall_at_1000_diff1
value: 43.278
- type: nauc_precision_at_1_max
value: 29.939799999999998
- type: nauc_precision_at_1_std
value: 3.3960999999999997
- type: nauc_precision_at_1_diff1
value: 50.718300000000006
- type: nauc_precision_at_3_max
value: 27.2703
- type: nauc_precision_at_3_std
value: 12.4915
- type: nauc_precision_at_3_diff1
value: 31.81
- type: nauc_precision_at_5_max
value: 24.1045
- type: nauc_precision_at_5_std
value: 17.6234
- type: nauc_precision_at_5_diff1
value: 23.8408
- type: nauc_precision_at_10_max
value: 19.8596
- type: nauc_precision_at_10_std
value: 18.5965
- type: nauc_precision_at_10_diff1
value: 20.820800000000002
- type: nauc_precision_at_20_max
value: 17.7276
- type: nauc_precision_at_20_std
value: 18.241
- type: nauc_precision_at_20_diff1
value: 14.235500000000002
- type: nauc_precision_at_100_max
value: 3.5949
- type: nauc_precision_at_100_std
value: 22.1485
- type: nauc_precision_at_100_diff1
value: 4.9958
- type: nauc_precision_at_1000_max
value: -8.9717
- type: nauc_precision_at_1000_std
value: 15.4312
- type: nauc_precision_at_1000_diff1
value: 1.3613
- type: nauc_mrr_at_1_max
value: 29.939799999999998
- type: nauc_mrr_at_1_std
value: 3.3960999999999997
- type: nauc_mrr_at_1_diff1
value: 50.718300000000006
- type: nauc_mrr_at_3_max
value: 29.451
- type: nauc_mrr_at_3_std
value: 7.2462
- type: nauc_mrr_at_3_diff1
value: 44.946799999999996
- type: nauc_mrr_at_5_max
value: 29.7994
- type: nauc_mrr_at_5_std
value: 8.919599999999999
- type: nauc_mrr_at_5_diff1
value: 44.0498
- type: nauc_mrr_at_10_max
value: 29.878700000000002
- type: nauc_mrr_at_10_std
value: 8.5343
- type: nauc_mrr_at_10_diff1
value: 44.3541
- type: nauc_mrr_at_20_max
value: 30.006
- type: nauc_mrr_at_20_std
value: 8.1953
- type: nauc_mrr_at_20_diff1
value: 44.544
- type: nauc_mrr_at_100_max
value: 30.0259
- type: nauc_mrr_at_100_std
value: 8.465499999999999
- type: nauc_mrr_at_100_diff1
value: 44.611000000000004
- type: nauc_mrr_at_1000_max
value: 30.024
- type: nauc_mrr_at_1000_std
value: 8.4392
- type: nauc_mrr_at_1000_diff1
value: 44.6335
- type: main_score
value: 37.551
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval (default)
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: ndcg_at_1
value: 19.778000000000002
- type: ndcg_at_3
value: 24.784
- type: ndcg_at_5
value: 27.358
- type: ndcg_at_10
value: 29.641000000000002
- type: ndcg_at_20
value: 31.832
- type: ndcg_at_100
value: 35.112
- type: ndcg_at_1000
value: 37.611
- type: map_at_1
value: 18.315
- type: map_at_3
value: 22.706
- type: map_at_5
value: 24.197
- type: map_at_10
value: 25.188
- type: map_at_20
value: 25.820999999999998
- type: map_at_100
value: 26.272000000000002
- type: map_at_1000
value: 26.374
- type: recall_at_1
value: 18.315
- type: recall_at_3
value: 28.647
- type: recall_at_5
value: 34.852
- type: recall_at_10
value: 41.626999999999995
- type: recall_at_20
value: 50.111000000000004
- type: recall_at_100
value: 67.244
- type: recall_at_1000
value: 85.556
- type: precision_at_1
value: 19.778000000000002
- type: precision_at_3
value: 10.598
- type: precision_at_5
value: 7.911
- type: precision_at_10
value: 4.806
- type: precision_at_20
value: 2.902
- type: precision_at_100
value: 0.815
- type: precision_at_1000
value: 0.11499999999999999
- type: mrr_at_1
value: 19.778200000000002
- type: mrr_at_3
value: 24.5533
- type: mrr_at_5
value: 26.0783
- type: mrr_at_10
value: 27.0088
- type: mrr_at_20
value: 27.573199999999996
- type: mrr_at_100
value: 27.988000000000003
- type: mrr_at_1000
value: 28.0549
- type: nauc_ndcg_at_1_max
value: 24.9483
- type: nauc_ndcg_at_1_std
value: 4.0085999999999995
- type: nauc_ndcg_at_1_diff1
value: 41.1484
- type: nauc_ndcg_at_3_max
value: 22.8401
- type: nauc_ndcg_at_3_std
value: 4.7114
- type: nauc_ndcg_at_3_diff1
value: 35.5933
- type: nauc_ndcg_at_5_max
value: 22.4457
- type: nauc_ndcg_at_5_std
value: 2.776
- type: nauc_ndcg_at_5_diff1
value: 34.972300000000004
- type: nauc_ndcg_at_10_max
value: 20.1579
- type: nauc_ndcg_at_10_std
value: 3.3688000000000002
- type: nauc_ndcg_at_10_diff1
value: 33.628
- type: nauc_ndcg_at_20_max
value: 19.7526
- type: nauc_ndcg_at_20_std
value: 3.8321
- type: nauc_ndcg_at_20_diff1
value: 32.7857
- type: nauc_ndcg_at_100_max
value: 21.1183
- type: nauc_ndcg_at_100_std
value: 6.848799999999999
- type: nauc_ndcg_at_100_diff1
value: 33.7359
- type: nauc_ndcg_at_1000_max
value: 21.503
- type: nauc_ndcg_at_1000_std
value: 8.0401
- type: nauc_ndcg_at_1000_diff1
value: 34.214299999999994
- type: nauc_map_at_1_max
value: 21.0954
- type: nauc_map_at_1_std
value: 3.9734
- type: nauc_map_at_1_diff1
value: 42.8502
- type: nauc_map_at_3_max
value: 22.201
- type: nauc_map_at_3_std
value: 4.3289
- type: nauc_map_at_3_diff1
value: 37.9489
- type: nauc_map_at_5_max
value: 22.1251
- type: nauc_map_at_5_std
value: 3.0327
- type: nauc_map_at_5_diff1
value: 37.2945
- type: nauc_map_at_10_max
value: 21.1451
- type: nauc_map_at_10_std
value: 3.3652
- type: nauc_map_at_10_diff1
value: 36.580400000000004
- type: nauc_map_at_20_max
value: 21.0693
- type: nauc_map_at_20_std
value: 3.4766
- type: nauc_map_at_20_diff1
value: 36.275600000000004
- type: nauc_map_at_100_max
value: 21.2609
- type: nauc_map_at_100_std
value: 3.9440999999999997
- type: nauc_map_at_100_diff1
value: 36.4114
- type: nauc_map_at_1000_max
value: 21.2454
- type: nauc_map_at_1000_std
value: 3.994
- type: nauc_map_at_1000_diff1
value: 36.4005
- type: nauc_recall_at_1_max
value: 21.0954
- type: nauc_recall_at_1_std
value: 3.9734
- type: nauc_recall_at_1_diff1
value: 42.8502
- type: nauc_recall_at_3_max
value: 21.6886
- type: nauc_recall_at_3_std
value: 5.5664
- type: nauc_recall_at_3_diff1
value: 31.4152
- type: nauc_recall_at_5_max
value: 20.491699999999998
- type: nauc_recall_at_5_std
value: 1.5245
- type: nauc_recall_at_5_diff1
value: 29.374499999999998
- type: nauc_recall_at_10_max
value: 13.886899999999999
- type: nauc_recall_at_10_std
value: 2.7815
- type: nauc_recall_at_10_diff1
value: 25.475900000000003
- type: nauc_recall_at_20_max
value: 11.8825
- type: nauc_recall_at_20_std
value: 4.2615
- type: nauc_recall_at_20_diff1
value: 22.382099999999998
- type: nauc_recall_at_100_max
value: 17.011699999999998
- type: nauc_recall_at_100_std
value: 20.9418
- type: nauc_recall_at_100_diff1
value: 24.9262
- type: nauc_recall_at_1000_max
value: 23.3383
- type: nauc_recall_at_1000_std
value: 50.590900000000005
- type: nauc_recall_at_1000_diff1
value: 30.3374
- type: nauc_precision_at_1_max
value: 24.9483
- type: nauc_precision_at_1_std
value: 4.0085999999999995
- type: nauc_precision_at_1_diff1
value: 41.1484
- type: nauc_precision_at_3_max
value: 25.4974
- type: nauc_precision_at_3_std
value: 5.7277000000000005
- type: nauc_precision_at_3_diff1
value: 29.2651
- type: nauc_precision_at_5_max
value: 25.7469
- type: nauc_precision_at_5_std
value: 2.512
- type: nauc_precision_at_5_diff1
value: 26.712000000000003
- type: nauc_precision_at_10_max
value: 19.8399
- type: nauc_precision_at_10_std
value: 5.8683
- type: nauc_precision_at_10_diff1
value: 21.2318
- type: nauc_precision_at_20_max
value: 18.0566
- type: nauc_precision_at_20_std
value: 8.4218
- type: nauc_precision_at_20_diff1
value: 15.591099999999999
- type: nauc_precision_at_100_max
value: 19.794600000000003
- type: nauc_precision_at_100_std
value: 20.591
- type: nauc_precision_at_100_diff1
value: 10.7974
- type: nauc_precision_at_1000_max
value: 1.1079
- type: nauc_precision_at_1000_std
value: 20.1769
- type: nauc_precision_at_1000_diff1
value: -11.980599999999999
- type: nauc_mrr_at_1_max
value: 24.9483
- type: nauc_mrr_at_1_std
value: 4.0085999999999995
- type: nauc_mrr_at_1_diff1
value: 41.1484
- type: nauc_mrr_at_3_max
value: 24.762999999999998
- type: nauc_mrr_at_3_std
value: 4.3391
- type: nauc_mrr_at_3_diff1
value: 36.2907
- type: nauc_mrr_at_5_max
value: 24.843899999999998
- type: nauc_mrr_at_5_std
value: 3.5093
- type: nauc_mrr_at_5_diff1
value: 35.9504
- type: nauc_mrr_at_10_max
value: 23.8084
- type: nauc_mrr_at_10_std
value: 3.5873000000000004
- type: nauc_mrr_at_10_diff1
value: 35.5747
- type: nauc_mrr_at_20_max
value: 23.6496
- type: nauc_mrr_at_20_std
value: 3.6975000000000002
- type: nauc_mrr_at_20_diff1
value: 35.3852
- type: nauc_mrr_at_100_max
value: 23.76
- type: nauc_mrr_at_100_std
value: 4.105099999999999
- type: nauc_mrr_at_100_diff1
value: 35.4929
- type: nauc_mrr_at_1000_max
value: 23.7583
- type: nauc_mrr_at_1000_std
value: 4.1303
- type: nauc_mrr_at_1000_diff1
value: 35.4926
- type: main_score
value: 29.641000000000002
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER (default)
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: ndcg_at_1
value: 40.521
- type: ndcg_at_3
value: 34.048
- type: ndcg_at_5
value: 36.027
- type: ndcg_at_10
value: 39.739000000000004
- type: ndcg_at_20
value: 42.405
- type: ndcg_at_100
value: 46.732
- type: ndcg_at_1000
value: 49.756
- type: map_at_1
value: 17.568
- type: map_at_3
value: 25.258999999999997
- type: map_at_5
value: 27.761000000000003
- type: map_at_10
value: 29.818
- type: map_at_20
value: 30.867
- type: map_at_100
value: 31.772
- type: map_at_1000
value: 31.956
- type: recall_at_1
value: 17.568
- type: recall_at_3
value: 30.174
- type: recall_at_5
value: 36.802
- type: recall_at_10
value: 44.999
- type: recall_at_20
value: 52.371
- type: recall_at_100
value: 68.805
- type: recall_at_1000
value: 85.559
- type: precision_at_1
value: 40.521
- type: precision_at_3
value: 25.755
- type: precision_at_5
value: 19.296
- type: precision_at_10
value: 12.104
- type: precision_at_20
value: 7.2379999999999995
- type: precision_at_100
value: 1.978
- type: precision_at_1000
value: 0.255
- type: mrr_at_1
value: 40.5212
- type: mrr_at_3
value: 49.848
- type: mrr_at_5
value: 51.7503
- type: mrr_at_10
value: 52.777499999999996
- type: mrr_at_20
value: 53.190099999999994
- type: mrr_at_100
value: 53.436499999999995
- type: mrr_at_1000
value: 53.4573
- type: nauc_ndcg_at_1_max
value: 37.6922
- type: nauc_ndcg_at_1_std
value: 14.352400000000001
- type: nauc_ndcg_at_1_diff1
value: 35.2176
- type: nauc_ndcg_at_3_max
value: 39.1261
- type: nauc_ndcg_at_3_std
value: 14.9172
- type: nauc_ndcg_at_3_diff1
value: 27.787499999999998
- type: nauc_ndcg_at_5_max
value: 40.7423
- type: nauc_ndcg_at_5_std
value: 15.754199999999999
- type: nauc_ndcg_at_5_diff1
value: 27.861599999999996
- type: nauc_ndcg_at_10_max
value: 42.2251
- type: nauc_ndcg_at_10_std
value: 18.322
- type: nauc_ndcg_at_10_diff1
value: 27.082
- type: nauc_ndcg_at_20_max
value: 42.888999999999996
- type: nauc_ndcg_at_20_std
value: 19.3603
- type: nauc_ndcg_at_20_diff1
value: 27.0993
- type: nauc_ndcg_at_100_max
value: 43.9071
- type: nauc_ndcg_at_100_std
value: 22.3581
- type: nauc_ndcg_at_100_diff1
value: 27.3167
- type: nauc_ndcg_at_1000_max
value: 44.0561
- type: nauc_ndcg_at_1000_std
value: 22.7021
- type: nauc_ndcg_at_1000_diff1
value: 27.398600000000002
- type: nauc_map_at_1_max
value: 35.4322
- type: nauc_map_at_1_std
value: 4.8918
- type: nauc_map_at_1_diff1
value: 41.0561
- type: nauc_map_at_3_max
value: 38.018299999999996
- type: nauc_map_at_3_std
value: 10.956299999999999
- type: nauc_map_at_3_diff1
value: 30.419200000000004
- type: nauc_map_at_5_max
value: 39.225100000000005
- type: nauc_map_at_5_std
value: 12.8212
- type: nauc_map_at_5_diff1
value: 29.2512
- type: nauc_map_at_10_max
value: 40.3819
- type: nauc_map_at_10_std
value: 14.601700000000001
- type: nauc_map_at_10_diff1
value: 28.612900000000003
- type: nauc_map_at_20_max
value: 40.7221
- type: nauc_map_at_20_std
value: 15.1138
- type: nauc_map_at_20_diff1
value: 28.6089
- type: nauc_map_at_100_max
value: 41.0295
- type: nauc_map_at_100_std
value: 15.8999
- type: nauc_map_at_100_diff1
value: 28.749299999999998
- type: nauc_map_at_1000_max
value: 41.0629
- type: nauc_map_at_1000_std
value: 15.9558
- type: nauc_map_at_1000_diff1
value: 28.7466
- type: nauc_recall_at_1_max
value: 35.4322
- type: nauc_recall_at_1_std
value: 4.8918
- type: nauc_recall_at_1_diff1
value: 41.0561
- type: nauc_recall_at_3_max
value: 37.7731
- type: nauc_recall_at_3_std
value: 12.5568
- type: nauc_recall_at_3_diff1
value: 24.4847
- type: nauc_recall_at_5_max
value: 38.9728
- type: nauc_recall_at_5_std
value: 15.0025
- type: nauc_recall_at_5_diff1
value: 22.132199999999997
- type: nauc_recall_at_10_max
value: 38.9505
- type: nauc_recall_at_10_std
value: 18.668100000000003
- type: nauc_recall_at_10_diff1
value: 18.536
- type: nauc_recall_at_20_max
value: 38.9569
- type: nauc_recall_at_20_std
value: 20.350199999999997
- type: nauc_recall_at_20_diff1
value: 17.4117
- type: nauc_recall_at_100_max
value: 40.1812
- type: nauc_recall_at_100_std
value: 30.7988
- type: nauc_recall_at_100_diff1
value: 14.9611
- type: nauc_recall_at_1000_max
value: 44.235
- type: nauc_recall_at_1000_std
value: 41.7923
- type: nauc_recall_at_1000_diff1
value: 10.7114
- type: nauc_precision_at_1_max
value: 37.6922
- type: nauc_precision_at_1_std
value: 14.352400000000001
- type: nauc_precision_at_1_diff1
value: 35.2176
- type: nauc_precision_at_3_max
value: 35.6221
- type: nauc_precision_at_3_std
value: 22.3033
- type: nauc_precision_at_3_diff1
value: 11.9528
- type: nauc_precision_at_5_max
value: 34.672599999999996
- type: nauc_precision_at_5_std
value: 24.185100000000002
- type: nauc_precision_at_5_diff1
value: 8.6234
- type: nauc_precision_at_10_max
value: 32.7609
- type: nauc_precision_at_10_std
value: 27.332299999999996
- type: nauc_precision_at_10_diff1
value: 5.5712
- type: nauc_precision_at_20_max
value: 29.6198
- type: nauc_precision_at_20_std
value: 27.537200000000002
- type: nauc_precision_at_20_diff1
value: 3.6273
- type: nauc_precision_at_100_max
value: 21.7954
- type: nauc_precision_at_100_std
value: 32.4662
- type: nauc_precision_at_100_diff1
value: -2.0006
- type: nauc_precision_at_1000_max
value: 8.2475
- type: nauc_precision_at_1000_std
value: 26.8237
- type: nauc_precision_at_1000_diff1
value: -10.2669
- type: nauc_mrr_at_1_max
value: 37.6922
- type: nauc_mrr_at_1_std
value: 14.352400000000001
- type: nauc_mrr_at_1_diff1
value: 35.2176
- type: nauc_mrr_at_3_max
value: 40.268
- type: nauc_mrr_at_3_std
value: 18.3079
- type: nauc_mrr_at_3_diff1
value: 30.514999999999997
- type: nauc_mrr_at_5_max
value: 40.7444
- type: nauc_mrr_at_5_std
value: 18.5863
- type: nauc_mrr_at_5_diff1
value: 30.7305
- type: nauc_mrr_at_10_max
value: 40.8067
- type: nauc_mrr_at_10_std
value: 18.997600000000002
- type: nauc_mrr_at_10_diff1
value: 30.614200000000004
- type: nauc_mrr_at_20_max
value: 40.8984
- type: nauc_mrr_at_20_std
value: 19.168499999999998
- type: nauc_mrr_at_20_diff1
value: 30.758499999999998
- type: nauc_mrr_at_100_max
value: 40.8979
- type: nauc_mrr_at_100_std
value: 19.1996
- type: nauc_mrr_at_100_diff1
value: 30.7498
- type: nauc_mrr_at_1000_max
value: 40.881299999999996
- type: nauc_mrr_at_1000_std
value: 19.178
- type: nauc_mrr_at_1000_diff1
value: 30.7577
- type: main_score
value: 39.739000000000004
- task:
type: Retrieval
dataset:
name: MTEB DBPedia (default)
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: ndcg_at_1
value: 53.37499999999999
- type: ndcg_at_3
value: 42.994
- type: ndcg_at_5
value: 40.494
- type: ndcg_at_10
value: 38.035000000000004
- type: ndcg_at_20
value: 37.805
- type: ndcg_at_100
value: 43.144
- type: ndcg_at_1000
value: 50.676
- type: map_at_1
value: 8.605
- type: map_at_3
value: 13.138
- type: map_at_5
value: 15.356
- type: map_at_10
value: 18.099999999999998
- type: map_at_20
value: 20.764
- type: map_at_100
value: 25.163999999999998
- type: map_at_1000
value: 26.799
- type: recall_at_1
value: 8.605
- type: recall_at_3
value: 14.418000000000001
- type: recall_at_5
value: 18.061
- type: recall_at_10
value: 23.543
- type: recall_at_20
value: 30.422
- type: recall_at_100
value: 49.028
- type: recall_at_1000
value: 72.658
- type: precision_at_1
value: 65
- type: precision_at_3
value: 45.833
- type: precision_at_5
value: 38.85
- type: precision_at_10
value: 29.525000000000002
- type: precision_at_20
value: 22.625
- type: precision_at_100
value: 9.805
- type: precision_at_1000
value: 2.077
- type: mrr_at_1
value: 65
- type: mrr_at_3
value: 71.54169999999999
- type: mrr_at_5
value: 72.1792
- type: mrr_at_10
value: 72.7745
- type: mrr_at_20
value: 73.17439999999999
- type: mrr_at_100
value: 73.3228
- type: mrr_at_1000
value: 73.32570000000001
- type: nauc_ndcg_at_1_max
value: 51.8867
- type: nauc_ndcg_at_1_std
value: 25.167499999999997
- type: nauc_ndcg_at_1_diff1
value: 39.820100000000004
- type: nauc_ndcg_at_3_max
value: 48.2333
- type: nauc_ndcg_at_3_std
value: 31.234499999999997
- type: nauc_ndcg_at_3_diff1
value: 27.023999999999997
- type: nauc_ndcg_at_5_max
value: 47.9002
- type: nauc_ndcg_at_5_std
value: 32.7547
- type: nauc_ndcg_at_5_diff1
value: 25.4475
- type: nauc_ndcg_at_10_max
value: 46.1203
- type: nauc_ndcg_at_10_std
value: 30.566
- type: nauc_ndcg_at_10_diff1
value: 25.4179
- type: nauc_ndcg_at_20_max
value: 42.7061
- type: nauc_ndcg_at_20_std
value: 26.6509
- type: nauc_ndcg_at_20_diff1
value: 23.901600000000002
- type: nauc_ndcg_at_100_max
value: 42.028999999999996
- type: nauc_ndcg_at_100_std
value: 30.721500000000002
- type: nauc_ndcg_at_100_diff1
value: 21.9503
- type: nauc_ndcg_at_1000_max
value: 46.9932
- type: nauc_ndcg_at_1000_std
value: 38.718799999999995
- type: nauc_ndcg_at_1000_diff1
value: 19.8737
- type: nauc_map_at_1_max
value: 15.648599999999998
- type: nauc_map_at_1_std
value: -12.8624
- type: nauc_map_at_1_diff1
value: 35.7138
- type: nauc_map_at_3_max
value: 16.9008
- type: nauc_map_at_3_std
value: -6.8941
- type: nauc_map_at_3_diff1
value: 28.064099999999996
- type: nauc_map_at_5_max
value: 18.8605
- type: nauc_map_at_5_std
value: -3.0509
- type: nauc_map_at_5_diff1
value: 25.964599999999997
- type: nauc_map_at_10_max
value: 21.6785
- type: nauc_map_at_10_std
value: 1.7839
- type: nauc_map_at_10_diff1
value: 22.969
- type: nauc_map_at_20_max
value: 25.9578
- type: nauc_map_at_20_std
value: 8.3626
- type: nauc_map_at_20_diff1
value: 21.0503
- type: nauc_map_at_100_max
value: 30.9448
- type: nauc_map_at_100_std
value: 20.410800000000002
- type: nauc_map_at_100_diff1
value: 17.7467
- type: nauc_map_at_1000_max
value: 31.969900000000003
- type: nauc_map_at_1000_std
value: 22.9604
- type: nauc_map_at_1000_diff1
value: 16.5214
- type: nauc_recall_at_1_max
value: 15.648599999999998
- type: nauc_recall_at_1_std
value: -12.8624
- type: nauc_recall_at_1_diff1
value: 35.7138
- type: nauc_recall_at_3_max
value: 13.6045
- type: nauc_recall_at_3_std
value: -7.5614
- type: nauc_recall_at_3_diff1
value: 24.0617
- type: nauc_recall_at_5_max
value: 13.2823
- type: nauc_recall_at_5_std
value: -5.2039
- type: nauc_recall_at_5_diff1
value: 20.2316
- type: nauc_recall_at_10_max
value: 16.034499999999998
- type: nauc_recall_at_10_std
value: -0.6257
- type: nauc_recall_at_10_diff1
value: 17.6053
- type: nauc_recall_at_20_max
value: 20.3006
- type: nauc_recall_at_20_std
value: 5.8022
- type: nauc_recall_at_20_diff1
value: 15.576
- type: nauc_recall_at_100_max
value: 25.8586
- type: nauc_recall_at_100_std
value: 25.831500000000002
- type: nauc_recall_at_100_diff1
value: 11.3408
- type: nauc_recall_at_1000_max
value: 34.0091
- type: nauc_recall_at_1000_std
value: 41.7999
- type: nauc_recall_at_1000_diff1
value: 6.8013
- type: nauc_precision_at_1_max
value: 59.560199999999995
- type: nauc_precision_at_1_std
value: 32.649899999999995
- type: nauc_precision_at_1_diff1
value: 44.1834
- type: nauc_precision_at_3_max
value: 44.3559
- type: nauc_precision_at_3_std
value: 42.951699999999995
- type: nauc_precision_at_3_diff1
value: 10.4531
- type: nauc_precision_at_5_max
value: 42.3183
- type: nauc_precision_at_5_std
value: 48.4798
- type: nauc_precision_at_5_diff1
value: 4.2654
- type: nauc_precision_at_10_max
value: 39.2447
- type: nauc_precision_at_10_std
value: 49.2467
- type: nauc_precision_at_10_diff1
value: -2.4278999999999997
- type: nauc_precision_at_20_max
value: 34.3648
- type: nauc_precision_at_20_std
value: 47.038000000000004
- type: nauc_precision_at_20_diff1
value: -7.8901
- type: nauc_precision_at_100_max
value: 21.528
- type: nauc_precision_at_100_std
value: 42.1485
- type: nauc_precision_at_100_diff1
value: -13.3385
- type: nauc_precision_at_1000_max
value: -4.6619
- type: nauc_precision_at_1000_std
value: 11.582
- type: nauc_precision_at_1000_diff1
value: -20.6555
- type: nauc_mrr_at_1_max
value: 59.560199999999995
- type: nauc_mrr_at_1_std
value: 32.649899999999995
- type: nauc_mrr_at_1_diff1
value: 44.1834
- type: nauc_mrr_at_3_max
value: 63.2521
- type: nauc_mrr_at_3_std
value: 41.7667
- type: nauc_mrr_at_3_diff1
value: 42.016999999999996
- type: nauc_mrr_at_5_max
value: 63.482000000000006
- type: nauc_mrr_at_5_std
value: 42.1506
- type: nauc_mrr_at_5_diff1
value: 41.815999999999995
- type: nauc_mrr_at_10_max
value: 63.130399999999995
- type: nauc_mrr_at_10_std
value: 41.5067
- type: nauc_mrr_at_10_diff1
value: 41.9133
- type: nauc_mrr_at_20_max
value: 63.159600000000005
- type: nauc_mrr_at_20_std
value: 41.2181
- type: nauc_mrr_at_20_diff1
value: 42.2187
- type: nauc_mrr_at_100_max
value: 63.1207
- type: nauc_mrr_at_100_std
value: 41.219699999999996
- type: nauc_mrr_at_100_diff1
value: 42.1492
- type: nauc_mrr_at_1000_max
value: 63.118399999999994
- type: nauc_mrr_at_1000_std
value: 41.213899999999995
- type: nauc_mrr_at_1000_diff1
value: 42.1491
- type: main_score
value: 38.035000000000004
- task:
type: Classification
dataset:
name: MTEB EmotionClassification (default)
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.245
- type: f1
value: 43.1184
- type: f1_weighted
value: 50.537
- type: main_score
value: 48.245
- task:
type: Retrieval
dataset:
name: MTEB FEVER (default)
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: ndcg_at_1
value: 82.913
- type: ndcg_at_3
value: 86.708
- type: ndcg_at_5
value: 87.727
- type: ndcg_at_10
value: 88.26
- type: ndcg_at_20
value: 88.579
- type: ndcg_at_100
value: 88.93799999999999
- type: ndcg_at_1000
value: 89.164
- type: map_at_1
value: 76.79599999999999
- type: map_at_3
value: 83.649
- type: map_at_5
value: 84.393
- type: map_at_10
value: 84.70400000000001
- type: map_at_20
value: 84.827
- type: map_at_100
value: 84.90299999999999
- type: map_at_1000
value: 84.916
- type: recall_at_1
value: 76.79599999999999
- type: recall_at_3
value: 90.261
- type: recall_at_5
value: 92.89500000000001
- type: recall_at_10
value: 94.455
- type: recall_at_20
value: 95.527
- type: recall_at_100
value: 97.062
- type: recall_at_1000
value: 98.433
- type: precision_at_1
value: 82.913
- type: precision_at_3
value: 32.883
- type: precision_at_5
value: 20.429
- type: precision_at_10
value: 10.468
- type: precision_at_20
value: 5.335
- type: precision_at_100
value: 1.103
- type: precision_at_1000
value: 0.11399999999999999
- type: mrr_at_1
value: 82.9133
- type: mrr_at_3
value: 88.9114
- type: mrr_at_5
value: 89.4109
- type: mrr_at_10
value: 89.5488
- type: mrr_at_20
value: 89.5843
- type: mrr_at_100
value: 89.5951
- type: mrr_at_1000
value: 89.596
- type: nauc_ndcg_at_1_max
value: 30.1237
- type: nauc_ndcg_at_1_std
value: -7.976800000000001
- type: nauc_ndcg_at_1_diff1
value: 76.71759999999999
- type: nauc_ndcg_at_3_max
value: 24.2439
- type: nauc_ndcg_at_3_std
value: -1.3402
- type: nauc_ndcg_at_3_diff1
value: 53.475300000000004
- type: nauc_ndcg_at_5_max
value: 23.234099999999998
- type: nauc_ndcg_at_5_std
value: 0.1351
- type: nauc_ndcg_at_5_diff1
value: 51.782700000000006
- type: nauc_ndcg_at_10_max
value: 23.4737
- type: nauc_ndcg_at_10_std
value: 1.1952
- type: nauc_ndcg_at_10_diff1
value: 51.6677
- type: nauc_ndcg_at_20_max
value: 23.742
- type: nauc_ndcg_at_20_std
value: 1.1509
- type: nauc_ndcg_at_20_diff1
value: 52.1851
- type: nauc_ndcg_at_100_max
value: 24.346
- type: nauc_ndcg_at_100_std
value: 1.119
- type: nauc_ndcg_at_100_diff1
value: 52.976
- type: nauc_ndcg_at_1000_max
value: 24.8411
- type: nauc_ndcg_at_1000_std
value: 0.7044
- type: nauc_ndcg_at_1000_diff1
value: 54.2615
- type: nauc_map_at_1_max
value: 19.1903
- type: nauc_map_at_1_std
value: -5.463500000000001
- type: nauc_map_at_1_diff1
value: 57.45
- type: nauc_map_at_3_max
value: 20.7737
- type: nauc_map_at_3_std
value: -2.0726
- type: nauc_map_at_3_diff1
value: 51.200100000000006
- type: nauc_map_at_5_max
value: 20.833099999999998
- type: nauc_map_at_5_std
value: -1.3852
- type: nauc_map_at_5_diff1
value: 51.0736
- type: nauc_map_at_10_max
value: 21.2614
- type: nauc_map_at_10_std
value: -0.9268000000000001
- type: nauc_map_at_10_diff1
value: 51.3035
- type: nauc_map_at_20_max
value: 21.4214
- type: nauc_map_at_20_std
value: -0.8916999999999999
- type: nauc_map_at_20_diff1
value: 51.4853
- type: nauc_map_at_100_max
value: 21.5602
- type: nauc_map_at_100_std
value: -0.8564999999999999
- type: nauc_map_at_100_diff1
value: 51.6119
- type: nauc_map_at_1000_max
value: 21.5891
- type: nauc_map_at_1000_std
value: -0.8619999999999999
- type: nauc_map_at_1000_diff1
value: 51.669399999999996
- type: nauc_recall_at_1_max
value: 19.1903
- type: nauc_recall_at_1_std
value: -5.463500000000001
- type: nauc_recall_at_1_diff1
value: 57.45
- type: nauc_recall_at_3_max
value: 18.8097
- type: nauc_recall_at_3_std
value: 5.7094
- type: nauc_recall_at_3_diff1
value: 31.1662
- type: nauc_recall_at_5_max
value: 15.9461
- type: nauc_recall_at_5_std
value: 13.459
- type: nauc_recall_at_5_diff1
value: 19.5024
- type: nauc_recall_at_10_max
value: 16.006899999999998
- type: nauc_recall_at_10_std
value: 22.7035
- type: nauc_recall_at_10_diff1
value: 11.734
- type: nauc_recall_at_20_max
value: 15.2441
- type: nauc_recall_at_20_std
value: 26.5079
- type: nauc_recall_at_20_diff1
value: 6.9472000000000005
- type: nauc_recall_at_100_max
value: 17.4246
- type: nauc_recall_at_100_std
value: 37.8238
- type: nauc_recall_at_100_diff1
value: -3.3619999999999997
- type: nauc_recall_at_1000_max
value: 25.897
- type: nauc_recall_at_1000_std
value: 50.85849999999999
- type: nauc_recall_at_1000_diff1
value: -3.4954
- type: nauc_precision_at_1_max
value: 30.1237
- type: nauc_precision_at_1_std
value: -7.976800000000001
- type: nauc_precision_at_1_diff1
value: 76.71759999999999
- type: nauc_precision_at_3_max
value: 32.2474
- type: nauc_precision_at_3_std
value: 9.4755
- type: nauc_precision_at_3_diff1
value: 26.744200000000003
- type: nauc_precision_at_5_max
value: 25.596999999999998
- type: nauc_precision_at_5_std
value: 14.3121
- type: nauc_precision_at_5_diff1
value: 8.2439
- type: nauc_precision_at_10_max
value: 22.4779
- type: nauc_precision_at_10_std
value: 17.156
- type: nauc_precision_at_10_diff1
value: -0.7406
- type: nauc_precision_at_20_max
value: 20.408
- type: nauc_precision_at_20_std
value: 15.516399999999999
- type: nauc_precision_at_20_diff1
value: -3.4888000000000003
- type: nauc_precision_at_100_max
value: 17.2807
- type: nauc_precision_at_100_std
value: 11.8869
- type: nauc_precision_at_100_diff1
value: -5.1894
- type: nauc_precision_at_1000_max
value: 15.6691
- type: nauc_precision_at_1000_std
value: 6.3058000000000005
- type: nauc_precision_at_1000_diff1
value: -0.9933000000000001
- type: nauc_mrr_at_1_max
value: 30.1237
- type: nauc_mrr_at_1_std
value: -7.976800000000001
- type: nauc_mrr_at_1_diff1
value: 76.71759999999999
- type: nauc_mrr_at_3_max
value: 33.969
- type: nauc_mrr_at_3_std
value: -6.7858
- type: nauc_mrr_at_3_diff1
value: 75.5417
- type: nauc_mrr_at_5_max
value: 33.853
- type: nauc_mrr_at_5_std
value: -6.4335
- type: nauc_mrr_at_5_diff1
value: 75.6962
- type: nauc_mrr_at_10_max
value: 33.6511
- type: nauc_mrr_at_10_std
value: -6.268
- type: nauc_mrr_at_10_diff1
value: 75.7315
- type: nauc_mrr_at_20_max
value: 33.500600000000006
- type: nauc_mrr_at_20_std
value: -6.4148
- type: nauc_mrr_at_20_diff1
value: 75.7596
- type: nauc_mrr_at_100_max
value: 33.4376
- type: nauc_mrr_at_100_std
value: -6.482799999999999
- type: nauc_mrr_at_100_diff1
value: 75.7571
- type: nauc_mrr_at_1000_max
value: 33.4322
- type: nauc_mrr_at_1000_std
value: -6.4902
- type: nauc_mrr_at_1000_diff1
value: 75.7587
- type: main_score
value: 88.26
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018 (default)
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: ndcg_at_1
value: 37.037
- type: ndcg_at_3
value: 35.313
- type: ndcg_at_5
value: 35.99
- type: ndcg_at_10
value: 38.451
- type: ndcg_at_20
value: 41.097
- type: ndcg_at_100
value: 45.759
- type: ndcg_at_1000
value: 48.952
- type: map_at_1
value: 18.82
- type: map_at_3
value: 27.173000000000002
- type: map_at_5
value: 29.12
- type: map_at_10
value: 30.907
- type: map_at_20
value: 31.918999999999997
- type: map_at_100
value: 32.855000000000004
- type: map_at_1000
value: 33.049
- type: recall_at_1
value: 18.82
- type: recall_at_3
value: 32.505
- type: recall_at_5
value: 37.524
- type: recall_at_10
value: 44.936
- type: recall_at_20
value: 52.961999999999996
- type: recall_at_100
value: 72.229
- type: recall_at_1000
value: 91.266
- type: precision_at_1
value: 37.037
- type: precision_at_3
value: 23.714
- type: precision_at_5
value: 16.975
- type: precision_at_10
value: 10.664
- type: precision_at_20
value: 6.451
- type: precision_at_100
value: 1.799
- type: precision_at_1000
value: 0.23700000000000002
- type: mrr_at_1
value: 37.037
- type: mrr_at_3
value: 44.4444
- type: mrr_at_5
value: 45.5324
- type: mrr_at_10
value: 46.5289
- type: mrr_at_20
value: 47.017199999999995
- type: mrr_at_100
value: 47.4272
- type: mrr_at_1000
value: 47.4667
- type: nauc_ndcg_at_1_max
value: 39.3378
- type: nauc_ndcg_at_1_std
value: 0.4046
- type: nauc_ndcg_at_1_diff1
value: 54.5767
- type: nauc_ndcg_at_3_max
value: 35.509299999999996
- type: nauc_ndcg_at_3_std
value: -0.055099999999999996
- type: nauc_ndcg_at_3_diff1
value: 46.927400000000006
- type: nauc_ndcg_at_5_max
value: 33.5333
- type: nauc_ndcg_at_5_std
value: -0.2111
- type: nauc_ndcg_at_5_diff1
value: 46.493
- type: nauc_ndcg_at_10_max
value: 33.590199999999996
- type: nauc_ndcg_at_10_std
value: 1.1043
- type: nauc_ndcg_at_10_diff1
value: 44.5017
- type: nauc_ndcg_at_20_max
value: 33.3792
- type: nauc_ndcg_at_20_std
value: 2.4081
- type: nauc_ndcg_at_20_diff1
value: 43.838
- type: nauc_ndcg_at_100_max
value: 36.343599999999995
- type: nauc_ndcg_at_100_std
value: 6.0874
- type: nauc_ndcg_at_100_diff1
value: 43.7378
- type: nauc_ndcg_at_1000_max
value: 37.4981
- type: nauc_ndcg_at_1000_std
value: 5.7039
- type: nauc_ndcg_at_1000_diff1
value: 44.6965
- type: nauc_map_at_1_max
value: 26.435399999999998
- type: nauc_map_at_1_std
value: -1.5532000000000001
- type: nauc_map_at_1_diff1
value: 55.325
- type: nauc_map_at_3_max
value: 30.523
- type: nauc_map_at_3_std
value: -1.194
- type: nauc_map_at_3_diff1
value: 48.8296
- type: nauc_map_at_5_max
value: 31.502799999999997
- type: nauc_map_at_5_std
value: -1.093
- type: nauc_map_at_5_diff1
value: 47.849399999999996
- type: nauc_map_at_10_max
value: 32.5109
- type: nauc_map_at_10_std
value: -0.1616
- type: nauc_map_at_10_diff1
value: 46.4203
- type: nauc_map_at_20_max
value: 32.6185
- type: nauc_map_at_20_std
value: 0.41050000000000003
- type: nauc_map_at_20_diff1
value: 46.145599999999995
- type: nauc_map_at_100_max
value: 33.326299999999996
- type: nauc_map_at_100_std
value: 1.1496
- type: nauc_map_at_100_diff1
value: 46.1063
- type: nauc_map_at_1000_max
value: 33.4678
- type: nauc_map_at_1000_std
value: 1.1722
- type: nauc_map_at_1000_diff1
value: 46.1577
- type: nauc_recall_at_1_max
value: 26.435399999999998
- type: nauc_recall_at_1_std
value: -1.5532000000000001
- type: nauc_recall_at_1_diff1
value: 55.325
- type: nauc_recall_at_3_max
value: 25.3216
- type: nauc_recall_at_3_std
value: -1.3092
- type: nauc_recall_at_3_diff1
value: 40.7913
- type: nauc_recall_at_5_max
value: 24.444
- type: nauc_recall_at_5_std
value: -0.9400000000000001
- type: nauc_recall_at_5_diff1
value: 38.0763
- type: nauc_recall_at_10_max
value: 24.8674
- type: nauc_recall_at_10_std
value: 2.3571
- type: nauc_recall_at_10_diff1
value: 32.1728
- type: nauc_recall_at_20_max
value: 22.243299999999998
- type: nauc_recall_at_20_std
value: 5.6803
- type: nauc_recall_at_20_diff1
value: 28.557
- type: nauc_recall_at_100_max
value: 29.0702
- type: nauc_recall_at_100_std
value: 25.7249
- type: nauc_recall_at_100_diff1
value: 21.2079
- type: nauc_recall_at_1000_max
value: 40.241
- type: nauc_recall_at_1000_std
value: 48.301899999999996
- type: nauc_recall_at_1000_diff1
value: 20.038
- type: nauc_precision_at_1_max
value: 39.3378
- type: nauc_precision_at_1_std
value: 0.4046
- type: nauc_precision_at_1_diff1
value: 54.5767
- type: nauc_precision_at_3_max
value: 36.4929
- type: nauc_precision_at_3_std
value: 0.8775
- type: nauc_precision_at_3_diff1
value: 30.308699999999998
- type: nauc_precision_at_5_max
value: 35.120200000000004
- type: nauc_precision_at_5_std
value: 1.3797
- type: nauc_precision_at_5_diff1
value: 25.166800000000002
- type: nauc_precision_at_10_max
value: 34.2855
- type: nauc_precision_at_10_std
value: 5.0542
- type: nauc_precision_at_10_diff1
value: 15.148
- type: nauc_precision_at_20_max
value: 31.4664
- type: nauc_precision_at_20_std
value: 9.0011
- type: nauc_precision_at_20_diff1
value: 10.3899
- type: nauc_precision_at_100_max
value: 32.6942
- type: nauc_precision_at_100_std
value: 14.7489
- type: nauc_precision_at_100_diff1
value: 2.806
- type: nauc_precision_at_1000_max
value: 27.2725
- type: nauc_precision_at_1000_std
value: 11.8238
- type: nauc_precision_at_1000_diff1
value: -3.4041
- type: nauc_mrr_at_1_max
value: 39.3378
- type: nauc_mrr_at_1_std
value: 0.4046
- type: nauc_mrr_at_1_diff1
value: 54.5767
- type: nauc_mrr_at_3_max
value: 39.4613
- type: nauc_mrr_at_3_std
value: 1.7649000000000001
- type: nauc_mrr_at_3_diff1
value: 50.1734
- type: nauc_mrr_at_5_max
value: 38.9739
- type: nauc_mrr_at_5_std
value: 1.4766
- type: nauc_mrr_at_5_diff1
value: 49.900299999999994
- type: nauc_mrr_at_10_max
value: 39.2236
- type: nauc_mrr_at_10_std
value: 1.6832
- type: nauc_mrr_at_10_diff1
value: 49.420500000000004
- type: nauc_mrr_at_20_max
value: 39.114900000000006
- type: nauc_mrr_at_20_std
value: 1.8496
- type: nauc_mrr_at_20_diff1
value: 49.339
- type: nauc_mrr_at_100_max
value: 39.309
- type: nauc_mrr_at_100_std
value: 2.1651
- type: nauc_mrr_at_100_diff1
value: 49.3731
- type: nauc_mrr_at_1000_max
value: 39.3136
- type: nauc_mrr_at_1000_std
value: 2.1359
- type: nauc_mrr_at_1000_diff1
value: 49.399100000000004
- type: main_score
value: 38.451
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA (default)
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: ndcg_at_1
value: 80.513
- type: ndcg_at_3
value: 61.72299999999999
- type: ndcg_at_5
value: 64.1
- type: ndcg_at_10
value: 65.89699999999999
- type: ndcg_at_20
value: 67.071
- type: ndcg_at_100
value: 68.72500000000001
- type: ndcg_at_1000
value: 70.031
- type: map_at_1
value: 40.257
- type: map_at_3
value: 53.547999999999995
- type: map_at_5
value: 55.44
- type: map_at_10
value: 56.505
- type: map_at_20
value: 56.987
- type: map_at_100
value: 57.328
- type: map_at_1000
value: 57.396
- type: recall_at_1
value: 40.257
- type: recall_at_3
value: 57.211
- type: recall_at_5
value: 61.904
- type: recall_at_10
value: 66.408
- type: recall_at_20
value: 70.162
- type: recall_at_100
value: 77.448
- type: recall_at_1000
value: 86.07000000000001
- type: precision_at_1
value: 80.513
- type: precision_at_3
value: 38.141000000000005
- type: precision_at_5
value: 24.762
- type: precision_at_10
value: 13.282
- type: precision_at_20
value: 7.016
- type: precision_at_100
value: 1.549
- type: precision_at_1000
value: 0.172
- type: mrr_at_1
value: 80.5132
- type: mrr_at_3
value: 84.6793
- type: mrr_at_5
value: 85.21
- type: mrr_at_10
value: 85.5189
- type: mrr_at_20
value: 85.6233
- type: mrr_at_100
value: 85.6853
- type: mrr_at_1000
value: 85.6933
- type: nauc_ndcg_at_1_max
value: 57.2352
- type: nauc_ndcg_at_1_std
value: -3.5655
- type: nauc_ndcg_at_1_diff1
value: 73.753
- type: nauc_ndcg_at_3_max
value: 23.2053
- type: nauc_ndcg_at_3_std
value: -1.4256
- type: nauc_ndcg_at_3_diff1
value: 20.7058
- type: nauc_ndcg_at_5_max
value: 20.7402
- type: nauc_ndcg_at_5_std
value: 0.48989999999999995
- type: nauc_ndcg_at_5_diff1
value: 17.1464
- type: nauc_ndcg_at_10_max
value: 19.1611
- type: nauc_ndcg_at_10_std
value: 1.0252000000000001
- type: nauc_ndcg_at_10_diff1
value: 15.4391
- type: nauc_ndcg_at_20_max
value: 18.4063
- type: nauc_ndcg_at_20_std
value: 1.345
- type: nauc_ndcg_at_20_diff1
value: 14.818999999999999
- type: nauc_ndcg_at_100_max
value: 17.639499999999998
- type: nauc_ndcg_at_100_std
value: 2.3089
- type: nauc_ndcg_at_100_diff1
value: 13.9948
- type: nauc_ndcg_at_1000_max
value: 17.9525
- type: nauc_ndcg_at_1000_std
value: 2.0898
- type: nauc_ndcg_at_1000_diff1
value: 14.499300000000002
- type: nauc_map_at_1_max
value: 57.2352
- type: nauc_map_at_1_std
value: -3.5655
- type: nauc_map_at_1_diff1
value: 73.753
- type: nauc_map_at_3_max
value: 17.3676
- type: nauc_map_at_3_std
value: -2.0408
- type: nauc_map_at_3_diff1
value: 13.6378
- type: nauc_map_at_5_max
value: 15.8526
- type: nauc_map_at_5_std
value: -0.7017
- type: nauc_map_at_5_diff1
value: 11.354899999999999
- type: nauc_map_at_10_max
value: 15.187600000000002
- type: nauc_map_at_10_std
value: -0.44
- type: nauc_map_at_10_diff1
value: 10.673399999999999
- type: nauc_map_at_20_max
value: 14.972199999999999
- type: nauc_map_at_20_std
value: -0.3543
- type: nauc_map_at_20_diff1
value: 10.5319
- type: nauc_map_at_100_max
value: 14.8562
- type: nauc_map_at_100_std
value: -0.1799
- type: nauc_map_at_100_diff1
value: 10.4117
- type: nauc_map_at_1000_max
value: 14.8625
- type: nauc_map_at_1000_std
value: -0.18109999999999998
- type: nauc_map_at_1000_diff1
value: 10.424700000000001
- type: nauc_recall_at_1_max
value: 57.2352
- type: nauc_recall_at_1_std
value: -3.5655
- type: nauc_recall_at_1_diff1
value: 73.753
- type: nauc_recall_at_3_max
value: 12.948200000000002
- type: nauc_recall_at_3_std
value: -0.626
- type: nauc_recall_at_3_diff1
value: 5.068099999999999
- type: nauc_recall_at_5_max
value: 7.8968
- type: nauc_recall_at_5_std
value: 2.9478
- type: nauc_recall_at_5_diff1
value: -1.7441000000000002
- type: nauc_recall_at_10_max
value: 3.2369000000000003
- type: nauc_recall_at_10_std
value: 4.2506
- type: nauc_recall_at_10_diff1
value: -6.7679
- type: nauc_recall_at_20_max
value: 0.1675
- type: nauc_recall_at_20_std
value: 5.4809
- type: nauc_recall_at_20_diff1
value: -9.762
- type: nauc_recall_at_100_max
value: -6.5167
- type: nauc_recall_at_100_std
value: 10.6357
- type: nauc_recall_at_100_diff1
value: -17.631800000000002
- type: nauc_recall_at_1000_max
value: -12.8048
- type: nauc_recall_at_1000_std
value: 11.675099999999999
- type: nauc_recall_at_1000_diff1
value: -25.1894
- type: nauc_precision_at_1_max
value: 57.2352
- type: nauc_precision_at_1_std
value: -3.5655
- type: nauc_precision_at_1_diff1
value: 73.753
- type: nauc_precision_at_3_max
value: 12.948200000000002
- type: nauc_precision_at_3_std
value: -0.626
- type: nauc_precision_at_3_diff1
value: 5.068099999999999
- type: nauc_precision_at_5_max
value: 7.8968
- type: nauc_precision_at_5_std
value: 2.9478
- type: nauc_precision_at_5_diff1
value: -1.7441000000000002
- type: nauc_precision_at_10_max
value: 3.2369000000000003
- type: nauc_precision_at_10_std
value: 4.2506
- type: nauc_precision_at_10_diff1
value: -6.7679
- type: nauc_precision_at_20_max
value: 0.1675
- type: nauc_precision_at_20_std
value: 5.4809
- type: nauc_precision_at_20_diff1
value: -9.762
- type: nauc_precision_at_100_max
value: -6.5167
- type: nauc_precision_at_100_std
value: 10.6357
- type: nauc_precision_at_100_diff1
value: -17.631800000000002
- type: nauc_precision_at_1000_max
value: -12.8048
- type: nauc_precision_at_1000_std
value: 11.675099999999999
- type: nauc_precision_at_1000_diff1
value: -25.1894
- type: nauc_mrr_at_1_max
value: 57.2352
- type: nauc_mrr_at_1_std
value: -3.5655
- type: nauc_mrr_at_1_diff1
value: 73.753
- type: nauc_mrr_at_3_max
value: 60.146100000000004
- type: nauc_mrr_at_3_std
value: -1.0741
- type: nauc_mrr_at_3_diff1
value: 72.1941
- type: nauc_mrr_at_5_max
value: 60.1464
- type: nauc_mrr_at_5_std
value: -0.506
- type: nauc_mrr_at_5_diff1
value: 72.38
- type: nauc_mrr_at_10_max
value: 60.0685
- type: nauc_mrr_at_10_std
value: -0.39899999999999997
- type: nauc_mrr_at_10_diff1
value: 72.461
- type: nauc_mrr_at_20_max
value: 60.0296
- type: nauc_mrr_at_20_std
value: -0.4039
- type: nauc_mrr_at_20_diff1
value: 72.53309999999999
- type: nauc_mrr_at_100_max
value: 59.964
- type: nauc_mrr_at_100_std
value: -0.4698
- type: nauc_mrr_at_100_diff1
value: 72.5235
- type: nauc_mrr_at_1000_max
value: 59.96
- type: nauc_mrr_at_1000_std
value: -0.4855
- type: nauc_mrr_at_1000_diff1
value: 72.5279
- type: main_score
value: 65.89699999999999
- task:
type: Classification
dataset:
name: MTEB ImdbClassification (default)
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 88.6972
- type: f1
value: 88.6641
- type: f1_weighted
value: 88.6641
- type: ap
value: 83.88029999999999
- type: ap_weighted
value: 83.88029999999999
- type: main_score
value: 88.6972
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO (default)
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: ndcg_at_1
value: 21.948
- type: ndcg_at_3
value: 32.747
- type: ndcg_at_5
value: 36.582
- type: ndcg_at_10
value: 40.284
- type: ndcg_at_20
value: 42.885
- type: ndcg_at_100
value: 46.075
- type: ndcg_at_1000
value: 47.351
- type: map_at_1
value: 21.365000000000002
- type: map_at_3
value: 29.820999999999998
- type: map_at_5
value: 31.97
- type: map_at_10
value: 33.528000000000006
- type: map_at_20
value: 34.264
- type: map_at_100
value: 34.731
- type: map_at_1000
value: 34.782000000000004
- type: recall_at_1
value: 21.365000000000002
- type: recall_at_3
value: 40.605000000000004
- type: recall_at_5
value: 49.81
- type: recall_at_10
value: 61.047
- type: recall_at_20
value: 71.125
- type: recall_at_100
value: 87.813
- type: recall_at_1000
value: 97.556
- type: precision_at_1
value: 21.948
- type: precision_at_3
value: 13.988
- type: precision_at_5
value: 10.344000000000001
- type: precision_at_10
value: 6.361
- type: precision_at_20
value: 3.723
- type: precision_at_100
value: 0.9259999999999999
- type: precision_at_1000
value: 0.104
- type: mrr_at_1
value: 21.9484
- type: mrr_at_3
value: 30.4465
- type: mrr_at_5
value: 32.5683
- type: mrr_at_10
value: 34.098800000000004
- type: mrr_at_20
value: 34.8029
- type: mrr_at_100
value: 35.2406
- type: mrr_at_1000
value: 35.2854
- type: nauc_ndcg_at_1_max
value: 5.9741
- type: nauc_ndcg_at_1_std
value: -18.5744
- type: nauc_ndcg_at_1_diff1
value: 35.4593
- type: nauc_ndcg_at_3_max
value: 7.3545
- type: nauc_ndcg_at_3_std
value: -21.494
- type: nauc_ndcg_at_3_diff1
value: 30.8726
- type: nauc_ndcg_at_5_max
value: 7.604800000000001
- type: nauc_ndcg_at_5_std
value: -22.2137
- type: nauc_ndcg_at_5_diff1
value: 30.457600000000003
- type: nauc_ndcg_at_10_max
value: 8.6668
- type: nauc_ndcg_at_10_std
value: -22.1356
- type: nauc_ndcg_at_10_diff1
value: 31.0905
- type: nauc_ndcg_at_20_max
value: 9.5083
- type: nauc_ndcg_at_20_std
value: -20.593
- type: nauc_ndcg_at_20_diff1
value: 30.9527
- type: nauc_ndcg_at_100_max
value: 9.5322
- type: nauc_ndcg_at_100_std
value: -18.5136
- type: nauc_ndcg_at_100_diff1
value: 30.9364
- type: nauc_ndcg_at_1000_max
value: 9.1474
- type: nauc_ndcg_at_1000_std
value: -19.1979
- type: nauc_ndcg_at_1000_diff1
value: 31.0386
- type: nauc_map_at_1_max
value: 6.101100000000001
- type: nauc_map_at_1_std
value: -18.698500000000003
- type: nauc_map_at_1_diff1
value: 35.4792
- type: nauc_map_at_3_max
value: 7.0567
- type: nauc_map_at_3_std
value: -20.9636
- type: nauc_map_at_3_diff1
value: 31.816699999999997
- type: nauc_map_at_5_max
value: 7.1744
- type: nauc_map_at_5_std
value: -21.407
- type: nauc_map_at_5_diff1
value: 31.5737
- type: nauc_map_at_10_max
value: 7.598199999999999
- type: nauc_map_at_10_std
value: -21.3725
- type: nauc_map_at_10_diff1
value: 31.8448
- type: nauc_map_at_20_max
value: 7.820399999999999
- type: nauc_map_at_20_std
value: -20.9531
- type: nauc_map_at_20_diff1
value: 31.805899999999998
- type: nauc_map_at_100_max
value: 7.8269
- type: nauc_map_at_100_std
value: -20.6647
- type: nauc_map_at_100_diff1
value: 31.8065
- type: nauc_map_at_1000_max
value: 7.818300000000001
- type: nauc_map_at_1000_std
value: -20.6733
- type: nauc_map_at_1000_diff1
value: 31.810100000000002
- type: nauc_recall_at_1_max
value: 6.101100000000001
- type: nauc_recall_at_1_std
value: -18.698500000000003
- type: nauc_recall_at_1_diff1
value: 35.4792
- type: nauc_recall_at_3_max
value: 8.216099999999999
- type: nauc_recall_at_3_std
value: -22.9422
- type: nauc_recall_at_3_diff1
value: 28.219300000000004
- type: nauc_recall_at_5_max
value: 8.7943
- type: nauc_recall_at_5_std
value: -24.5572
- type: nauc_recall_at_5_diff1
value: 27.2003
- type: nauc_recall_at_10_max
value: 12.2627
- type: nauc_recall_at_10_std
value: -24.5221
- type: nauc_recall_at_10_diff1
value: 28.6163
- type: nauc_recall_at_20_max
value: 16.961000000000002
- type: nauc_recall_at_20_std
value: -17.8785
- type: nauc_recall_at_20_diff1
value: 27.6059
- type: nauc_recall_at_100_max
value: 24.529500000000002
- type: nauc_recall_at_100_std
value: 9.4392
- type: nauc_recall_at_100_diff1
value: 24.284
- type: nauc_recall_at_1000_max
value: 46.471000000000004
- type: nauc_recall_at_1000_std
value: 48.265299999999996
- type: nauc_recall_at_1000_diff1
value: 8.8465
- type: nauc_precision_at_1_max
value: 5.9741
- type: nauc_precision_at_1_std
value: -18.5744
- type: nauc_precision_at_1_diff1
value: 35.4593
- type: nauc_precision_at_3_max
value: 7.8017
- type: nauc_precision_at_3_std
value: -22.8873
- type: nauc_precision_at_3_diff1
value: 27.6704
- type: nauc_precision_at_5_max
value: 8.2906
- type: nauc_precision_at_5_std
value: -24.1192
- type: nauc_precision_at_5_diff1
value: 26.1024
- type: nauc_precision_at_10_max
value: 11.4748
- type: nauc_precision_at_10_std
value: -23.3331
- type: nauc_precision_at_10_diff1
value: 26.8968
- type: nauc_precision_at_20_max
value: 15.304599999999999
- type: nauc_precision_at_20_std
value: -15.3527
- type: nauc_precision_at_20_diff1
value: 23.863300000000002
- type: nauc_precision_at_100_max
value: 18.1506
- type: nauc_precision_at_100_std
value: 10.6614
- type: nauc_precision_at_100_diff1
value: 13.7323
- type: nauc_precision_at_1000_max
value: 11.7232
- type: nauc_precision_at_1000_std
value: 17.5344
- type: nauc_precision_at_1000_diff1
value: -3.4896000000000003
- type: nauc_mrr_at_1_max
value: 5.9741
- type: nauc_mrr_at_1_std
value: -18.5744
- type: nauc_mrr_at_1_diff1
value: 35.4593
- type: nauc_mrr_at_3_max
value: 6.929100000000001
- type: nauc_mrr_at_3_std
value: -20.7196
- type: nauc_mrr_at_3_diff1
value: 31.8547
- type: nauc_mrr_at_5_max
value: 7.1258
- type: nauc_mrr_at_5_std
value: -21.0583
- type: nauc_mrr_at_5_diff1
value: 31.6481
- type: nauc_mrr_at_10_max
value: 7.5504
- type: nauc_mrr_at_10_std
value: -20.9941
- type: nauc_mrr_at_10_diff1
value: 31.924400000000002
- type: nauc_mrr_at_20_max
value: 7.7503
- type: nauc_mrr_at_20_std
value: -20.5759
- type: nauc_mrr_at_20_diff1
value: 31.8852
- type: nauc_mrr_at_100_max
value: 7.7376000000000005
- type: nauc_mrr_at_100_std
value: -20.3293
- type: nauc_mrr_at_100_diff1
value: 31.887500000000003
- type: nauc_mrr_at_1000_max
value: 7.725999999999999
- type: nauc_mrr_at_1000_std
value: -20.344
- type: nauc_mrr_at_1000_diff1
value: 31.8917
- type: main_score
value: 40.284
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.1756
- type: f1
value: 94.925
- type: f1_weighted
value: 95.1766
- type: main_score
value: 95.1756
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.7601
- type: f1
value: 57.5613
- type: f1_weighted
value: 79.6763
- type: main_score
value: 76.7601
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 75.3262
- type: f1
value: 72.7127
- type: f1_weighted
value: 75.40259999999999
- type: main_score
value: 75.3262
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 80.2152
- type: f1
value: 80.06490000000001
- type: f1_weighted
value: 80.26830000000001
- type: main_score
value: 80.2152
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P (default)
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 35.098800000000004
- type: v_measure_std
value: 1.6771999999999998
- type: main_score
value: 35.098800000000004
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S (default)
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 32.9033
- type: v_measure_std
value: 1.4976
- type: main_score
value: 32.9033
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking (default)
type: mteb/mind_small
config: default
split: test
revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7
metrics:
- type: map
value: 32.1551
- type: mrr
value: 33.3476
- type: nAUC_map_max
value: -19.1719
- type: nAUC_map_std
value: 0.1456
- type: nAUC_map_diff1
value: 14.8056
- type: nAUC_mrr_max
value: -13.6261
- type: nAUC_mrr_std
value: 1.5634
- type: nAUC_mrr_diff1
value: 13.8537
- type: main_score
value: 32.1551
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus (default)
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: ndcg_at_1
value: 48.607
- type: ndcg_at_3
value: 42.187000000000005
- type: ndcg_at_5
value: 38.989000000000004
- type: ndcg_at_10
value: 36.168
- type: ndcg_at_20
value: 33.550000000000004
- type: ndcg_at_100
value: 33.104
- type: ndcg_at_1000
value: 41.760999999999996
- type: map_at_1
value: 6.802
- type: map_at_3
value: 10.333
- type: map_at_5
value: 11.735
- type: map_at_10
value: 13.79
- type: map_at_20
value: 15.155
- type: map_at_100
value: 17.151
- type: map_at_1000
value: 18.584
- type: recall_at_1
value: 6.802
- type: recall_at_3
value: 11.18
- type: recall_at_5
value: 13.376
- type: recall_at_10
value: 17.803
- type: recall_at_20
value: 21.365000000000002
- type: recall_at_100
value: 33.885
- type: recall_at_1000
value: 65.12400000000001
- type: precision_at_1
value: 50.464
- type: precision_at_3
value: 38.906
- type: precision_at_5
value: 32.879000000000005
- type: precision_at_10
value: 26.346999999999998
- type: precision_at_20
value: 19.241
- type: precision_at_100
value: 8.195
- type: precision_at_1000
value: 2.089
- type: mrr_at_1
value: 50.4644
- type: mrr_at_3
value: 56.192
- type: mrr_at_5
value: 57.291000000000004
- type: mrr_at_10
value: 58.0675
- type: mrr_at_20
value: 58.3226
- type: mrr_at_100
value: 58.5455
- type: mrr_at_1000
value: 58.578300000000006
- type: nauc_ndcg_at_1_max
value: 45.467200000000005
- type: nauc_ndcg_at_1_std
value: 22.6949
- type: nauc_ndcg_at_1_diff1
value: 33.3321
- type: nauc_ndcg_at_3_max
value: 45.142500000000005
- type: nauc_ndcg_at_3_std
value: 24.959500000000002
- type: nauc_ndcg_at_3_diff1
value: 26.5735
- type: nauc_ndcg_at_5_max
value: 44.4407
- type: nauc_ndcg_at_5_std
value: 24.3504
- type: nauc_ndcg_at_5_diff1
value: 25.2917
- type: nauc_ndcg_at_10_max
value: 42.011700000000005
- type: nauc_ndcg_at_10_std
value: 24.1625
- type: nauc_ndcg_at_10_diff1
value: 23.9877
- type: nauc_ndcg_at_20_max
value: 40.647299999999994
- type: nauc_ndcg_at_20_std
value: 22.7495
- type: nauc_ndcg_at_20_diff1
value: 25.496999999999996
- type: nauc_ndcg_at_100_max
value: 41.7588
- type: nauc_ndcg_at_100_std
value: 24.1511
- type: nauc_ndcg_at_100_diff1
value: 26.500400000000003
- type: nauc_ndcg_at_1000_max
value: 47.1963
- type: nauc_ndcg_at_1000_std
value: 31.6374
- type: nauc_ndcg_at_1000_diff1
value: 27.0791
- type: nauc_map_at_1_max
value: 11.7018
- type: nauc_map_at_1_std
value: -17.1298
- type: nauc_map_at_1_diff1
value: 35.74
- type: nauc_map_at_3_max
value: 16.7865
- type: nauc_map_at_3_std
value: -12.6553
- type: nauc_map_at_3_diff1
value: 32.492900000000006
- type: nauc_map_at_5_max
value: 19.8343
- type: nauc_map_at_5_std
value: -9.8376
- type: nauc_map_at_5_diff1
value: 30.326700000000002
- type: nauc_map_at_10_max
value: 23.1297
- type: nauc_map_at_10_std
value: -4.3291
- type: nauc_map_at_10_diff1
value: 27.889799999999997
- type: nauc_map_at_20_max
value: 26.009300000000003
- type: nauc_map_at_20_std
value: 0.303
- type: nauc_map_at_20_diff1
value: 27.6099
- type: nauc_map_at_100_max
value: 29.181
- type: nauc_map_at_100_std
value: 6.575400000000001
- type: nauc_map_at_100_diff1
value: 26.7868
- type: nauc_map_at_1000_max
value: 30.7289
- type: nauc_map_at_1000_std
value: 10.2937
- type: nauc_map_at_1000_diff1
value: 26.212400000000002
- type: nauc_recall_at_1_max
value: 11.7018
- type: nauc_recall_at_1_std
value: -17.1298
- type: nauc_recall_at_1_diff1
value: 35.74
- type: nauc_recall_at_3_max
value: 16.7435
- type: nauc_recall_at_3_std
value: -10.9179
- type: nauc_recall_at_3_diff1
value: 32.0388
- type: nauc_recall_at_5_max
value: 19.1904
- type: nauc_recall_at_5_std
value: -8.4857
- type: nauc_recall_at_5_diff1
value: 28.8152
- type: nauc_recall_at_10_max
value: 18.8131
- type: nauc_recall_at_10_std
value: -4.1152
- type: nauc_recall_at_10_diff1
value: 22.1207
- type: nauc_recall_at_20_max
value: 21.6186
- type: nauc_recall_at_20_std
value: 1.9934
- type: nauc_recall_at_20_diff1
value: 23.9005
- type: nauc_recall_at_100_max
value: 23.977899999999998
- type: nauc_recall_at_100_std
value: 14.8828
- type: nauc_recall_at_100_diff1
value: 17.1315
- type: nauc_recall_at_1000_max
value: 22.2311
- type: nauc_recall_at_1000_std
value: 22.386200000000002
- type: nauc_recall_at_1000_diff1
value: 6.7295
- type: nauc_precision_at_1_max
value: 46.517199999999995
- type: nauc_precision_at_1_std
value: 23.0247
- type: nauc_precision_at_1_diff1
value: 32.6597
- type: nauc_precision_at_3_max
value: 45.5606
- type: nauc_precision_at_3_std
value: 30.495100000000004
- type: nauc_precision_at_3_diff1
value: 16.7253
- type: nauc_precision_at_5_max
value: 44.226
- type: nauc_precision_at_5_std
value: 32.8318
- type: nauc_precision_at_5_diff1
value: 11.7102
- type: nauc_precision_at_10_max
value: 39.396
- type: nauc_precision_at_10_std
value: 38.0743
- type: nauc_precision_at_10_diff1
value: 6.424199999999999
- type: nauc_precision_at_20_max
value: 35.2707
- type: nauc_precision_at_20_std
value: 40.2219
- type: nauc_precision_at_20_diff1
value: 5.2245
- type: nauc_precision_at_100_max
value: 26.052799999999998
- type: nauc_precision_at_100_std
value: 42.7801
- type: nauc_precision_at_100_diff1
value: -2.0803
- type: nauc_precision_at_1000_max
value: 12.784699999999999
- type: nauc_precision_at_1000_std
value: 31.784299999999998
- type: nauc_precision_at_1000_diff1
value: -8.5489
- type: nauc_mrr_at_1_max
value: 46.517199999999995
- type: nauc_mrr_at_1_std
value: 23.0247
- type: nauc_mrr_at_1_diff1
value: 32.6597
- type: nauc_mrr_at_3_max
value: 51.1949
- type: nauc_mrr_at_3_std
value: 28.8621
- type: nauc_mrr_at_3_diff1
value: 33.4315
- type: nauc_mrr_at_5_max
value: 51.6085
- type: nauc_mrr_at_5_std
value: 29.293200000000002
- type: nauc_mrr_at_5_diff1
value: 33.6288
- type: nauc_mrr_at_10_max
value: 52.2656
- type: nauc_mrr_at_10_std
value: 30.303200000000004
- type: nauc_mrr_at_10_diff1
value: 33.036
- type: nauc_mrr_at_20_max
value: 52.237
- type: nauc_mrr_at_20_std
value: 30.351899999999997
- type: nauc_mrr_at_20_diff1
value: 33.088899999999995
- type: nauc_mrr_at_100_max
value: 52.2787
- type: nauc_mrr_at_100_std
value: 30.377900000000004
- type: nauc_mrr_at_100_diff1
value: 33.083
- type: nauc_mrr_at_1000_max
value: 52.2464
- type: nauc_mrr_at_1000_std
value: 30.337799999999998
- type: nauc_mrr_at_1000_diff1
value: 33.0673
- type: main_score
value: 36.168
- task:
type: Retrieval
dataset:
name: MTEB NQ (default)
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: ndcg_at_1
value: 32.677
- type: ndcg_at_3
value: 43.921
- type: ndcg_at_5
value: 48.459
- type: ndcg_at_10
value: 52.19200000000001
- type: ndcg_at_20
value: 54.294
- type: ndcg_at_100
value: 56.574000000000005
- type: ndcg_at_1000
value: 57.318000000000005
- type: map_at_1
value: 28.904000000000003
- type: map_at_3
value: 39.961
- type: map_at_5
value: 42.691
- type: map_at_10
value: 44.39
- type: map_at_20
value: 45.046
- type: map_at_100
value: 45.426
- type: map_at_1000
value: 45.461
- type: recall_at_1
value: 28.904000000000003
- type: recall_at_3
value: 52.105000000000004
- type: recall_at_5
value: 62.563
- type: recall_at_10
value: 73.443
- type: recall_at_20
value: 81.211
- type: recall_at_100
value: 92.50399999999999
- type: recall_at_1000
value: 97.941
- type: precision_at_1
value: 32.677
- type: precision_at_3
value: 20.268
- type: precision_at_5
value: 14.762
- type: precision_at_10
value: 8.746
- type: precision_at_20
value: 4.873
- type: precision_at_100
value: 1.121
- type: precision_at_1000
value: 0.11900000000000001
- type: mrr_at_1
value: 32.6767
- type: mrr_at_3
value: 43.2165
- type: mrr_at_5
value: 45.5108
- type: mrr_at_10
value: 46.8929
- type: mrr_at_20
value: 47.3935
- type: mrr_at_100
value: 47.6706
- type: mrr_at_1000
value: 47.6936
- type: nauc_ndcg_at_1_max
value: 26.243499999999997
- type: nauc_ndcg_at_1_std
value: 1.0733
- type: nauc_ndcg_at_1_diff1
value: 31.4245
- type: nauc_ndcg_at_3_max
value: 29.8653
- type: nauc_ndcg_at_3_std
value: 0.38
- type: nauc_ndcg_at_3_diff1
value: 28.706799999999998
- type: nauc_ndcg_at_5_max
value: 31.4938
- type: nauc_ndcg_at_5_std
value: 1.4593
- type: nauc_ndcg_at_5_diff1
value: 28.660000000000004
- type: nauc_ndcg_at_10_max
value: 33.379599999999996
- type: nauc_ndcg_at_10_std
value: 3.6429000000000005
- type: nauc_ndcg_at_10_diff1
value: 27.7694
- type: nauc_ndcg_at_20_max
value: 33.6801
- type: nauc_ndcg_at_20_std
value: 4.6917
- type: nauc_ndcg_at_20_diff1
value: 28.0358
- type: nauc_ndcg_at_100_max
value: 33.2398
- type: nauc_ndcg_at_100_std
value: 5.3007
- type: nauc_ndcg_at_100_diff1
value: 28.3204
- type: nauc_ndcg_at_1000_max
value: 32.4151
- type: nauc_ndcg_at_1000_std
value: 4.4551
- type: nauc_ndcg_at_1000_diff1
value: 28.3979
- type: nauc_map_at_1_max
value: 24.692
- type: nauc_map_at_1_std
value: -0.6727000000000001
- type: nauc_map_at_1_diff1
value: 31.570700000000002
- type: nauc_map_at_3_max
value: 28.7103
- type: nauc_map_at_3_std
value: -0.1295
- type: nauc_map_at_3_diff1
value: 29.3762
- type: nauc_map_at_5_max
value: 29.645100000000003
- type: nauc_map_at_5_std
value: 0.5023
- type: nauc_map_at_5_diff1
value: 29.371799999999997
- type: nauc_map_at_10_max
value: 30.498599999999996
- type: nauc_map_at_10_std
value: 1.4423
- type: nauc_map_at_10_diff1
value: 28.987099999999998
- type: nauc_map_at_20_max
value: 30.5749
- type: nauc_map_at_20_std
value: 1.7545000000000002
- type: nauc_map_at_20_diff1
value: 29.0642
- type: nauc_map_at_100_max
value: 30.506899999999998
- type: nauc_map_at_100_std
value: 1.8941
- type: nauc_map_at_100_diff1
value: 29.093400000000003
- type: nauc_map_at_1000_max
value: 30.4757
- type: nauc_map_at_1000_std
value: 1.8661
- type: nauc_map_at_1000_diff1
value: 29.0989
- type: nauc_recall_at_1_max
value: 24.692
- type: nauc_recall_at_1_std
value: -0.6727000000000001
- type: nauc_recall_at_1_diff1
value: 31.570700000000002
- type: nauc_recall_at_3_max
value: 31.5469
- type: nauc_recall_at_3_std
value: -0.1289
- type: nauc_recall_at_3_diff1
value: 26.3778
- type: nauc_recall_at_5_max
value: 35.625299999999996
- type: nauc_recall_at_5_std
value: 2.2864
- type: nauc_recall_at_5_diff1
value: 25.9116
- type: nauc_recall_at_10_max
value: 43.694100000000006
- type: nauc_recall_at_10_std
value: 10.399799999999999
- type: nauc_recall_at_10_diff1
value: 22.0504
- type: nauc_recall_at_20_max
value: 49.3132
- type: nauc_recall_at_20_std
value: 18.3764
- type: nauc_recall_at_20_diff1
value: 22.5169
- type: nauc_recall_at_100_max
value: 63.3036
- type: nauc_recall_at_100_std
value: 43.9544
- type: nauc_recall_at_100_diff1
value: 22.3844
- type: nauc_recall_at_1000_max
value: 71.2236
- type: nauc_recall_at_1000_std
value: 65.73740000000001
- type: nauc_recall_at_1000_diff1
value: 15.8222
- type: nauc_precision_at_1_max
value: 26.243499999999997
- type: nauc_precision_at_1_std
value: 1.0733
- type: nauc_precision_at_1_diff1
value: 31.4245
- type: nauc_precision_at_3_max
value: 30.7195
- type: nauc_precision_at_3_std
value: 3.5707999999999998
- type: nauc_precision_at_3_diff1
value: 22.1868
- type: nauc_precision_at_5_max
value: 31.107699999999998
- type: nauc_precision_at_5_std
value: 6.402900000000001
- type: nauc_precision_at_5_diff1
value: 18.8022
- type: nauc_precision_at_10_max
value: 31.1066
- type: nauc_precision_at_10_std
value: 12.9737
- type: nauc_precision_at_10_diff1
value: 11.6843
- type: nauc_precision_at_20_max
value: 28.1126
- type: nauc_precision_at_20_std
value: 17.3827
- type: nauc_precision_at_20_diff1
value: 8.0096
- type: nauc_precision_at_100_max
value: 17.5032
- type: nauc_precision_at_100_std
value: 21.9705
- type: nauc_precision_at_100_diff1
value: -0.33
- type: nauc_precision_at_1000_max
value: 6.0157
- type: nauc_precision_at_1000_std
value: 15.8443
- type: nauc_precision_at_1000_diff1
value: -5.3555
- type: nauc_mrr_at_1_max
value: 26.243499999999997
- type: nauc_mrr_at_1_std
value: 1.0733
- type: nauc_mrr_at_1_diff1
value: 31.4245
- type: nauc_mrr_at_3_max
value: 29.3899
- type: nauc_mrr_at_3_std
value: 1.8917
- type: nauc_mrr_at_3_diff1
value: 28.9903
- type: nauc_mrr_at_5_max
value: 30.105700000000002
- type: nauc_mrr_at_5_std
value: 2.4156
- type: nauc_mrr_at_5_diff1
value: 28.927500000000002
- type: nauc_mrr_at_10_max
value: 30.585600000000003
- type: nauc_mrr_at_10_std
value: 3.0894
- type: nauc_mrr_at_10_diff1
value: 28.6339
- type: nauc_mrr_at_20_max
value: 30.5819
- type: nauc_mrr_at_20_std
value: 3.2848
- type: nauc_mrr_at_20_diff1
value: 28.710599999999996
- type: nauc_mrr_at_100_max
value: 30.505900000000004
- type: nauc_mrr_at_100_std
value: 3.2804
- type: nauc_mrr_at_100_diff1
value: 28.7829
- type: nauc_mrr_at_1000_max
value: 30.479200000000002
- type: nauc_mrr_at_1000_std
value: 3.2541
- type: nauc_mrr_at_1000_diff1
value: 28.7883
- type: main_score
value: 52.19200000000001
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval (default)
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: ndcg_at_1
value: 80.72
- type: ndcg_at_3
value: 84.799
- type: ndcg_at_5
value: 86.5
- type: ndcg_at_10
value: 87.774
- type: ndcg_at_20
value: 88.492
- type: ndcg_at_100
value: 89.079
- type: ndcg_at_1000
value: 89.188
- type: map_at_1
value: 70.122
- type: map_at_3
value: 80.94200000000001
- type: map_at_5
value: 82.892
- type: map_at_10
value: 83.966
- type: map_at_20
value: 84.397
- type: map_at_100
value: 84.617
- type: map_at_1000
value: 84.636
- type: recall_at_1
value: 70.122
- type: recall_at_3
value: 86.518
- type: recall_at_5
value: 91.25999999999999
- type: recall_at_10
value: 95.029
- type: recall_at_20
value: 97.356
- type: recall_at_100
value: 99.50099999999999
- type: recall_at_1000
value: 99.981
- type: precision_at_1
value: 80.72
- type: precision_at_3
value: 37.033
- type: precision_at_5
value: 24.448
- type: precision_at_10
value: 13.313
- type: precision_at_20
value: 7.073
- type: precision_at_100
value: 1.525
- type: precision_at_1000
value: 0.157
- type: mrr_at_1
value: 80.74
- type: mrr_at_3
value: 85.9867
- type: mrr_at_5
value: 86.6992
- type: mrr_at_10
value: 87.01050000000001
- type: mrr_at_20
value: 87.10419999999999
- type: mrr_at_100
value: 87.1308
- type: mrr_at_1000
value: 87.13149999999999
- type: nauc_ndcg_at_1_max
value: 39.4296
- type: nauc_ndcg_at_1_std
value: -37.2067
- type: nauc_ndcg_at_1_diff1
value: 77.592
- type: nauc_ndcg_at_3_max
value: 35.3634
- type: nauc_ndcg_at_3_std
value: -42.268
- type: nauc_ndcg_at_3_diff1
value: 75.5758
- type: nauc_ndcg_at_5_max
value: 35.8489
- type: nauc_ndcg_at_5_std
value: -43.7405
- type: nauc_ndcg_at_5_diff1
value: 76.2101
- type: nauc_ndcg_at_10_max
value: 36.5492
- type: nauc_ndcg_at_10_std
value: -43.2419
- type: nauc_ndcg_at_10_diff1
value: 76.3768
- type: nauc_ndcg_at_20_max
value: 37.2184
- type: nauc_ndcg_at_20_std
value: -41.9107
- type: nauc_ndcg_at_20_diff1
value: 76.3328
- type: nauc_ndcg_at_100_max
value: 37.4681
- type: nauc_ndcg_at_100_std
value: -40.3898
- type: nauc_ndcg_at_100_diff1
value: 76.2178
- type: nauc_ndcg_at_1000_max
value: 37.5719
- type: nauc_ndcg_at_1000_std
value: -40.1922
- type: nauc_ndcg_at_1000_diff1
value: 76.213
- type: nauc_map_at_1_max
value: 26.8334
- type: nauc_map_at_1_std
value: -39.3359
- type: nauc_map_at_1_diff1
value: 80.2704
- type: nauc_map_at_3_max
value: 32.5583
- type: nauc_map_at_3_std
value: -44.9227
- type: nauc_map_at_3_diff1
value: 77.0651
- type: nauc_map_at_5_max
value: 34.248200000000004
- type: nauc_map_at_5_std
value: -44.9763
- type: nauc_map_at_5_diff1
value: 76.9915
- type: nauc_map_at_10_max
value: 35.2645
- type: nauc_map_at_10_std
value: -43.964
- type: nauc_map_at_10_diff1
value: 76.75
- type: nauc_map_at_20_max
value: 35.6828
- type: nauc_map_at_20_std
value: -42.9159
- type: nauc_map_at_20_diff1
value: 76.5704
- type: nauc_map_at_100_max
value: 35.7883
- type: nauc_map_at_100_std
value: -42.2245
- type: nauc_map_at_100_diff1
value: 76.47739999999999
- type: nauc_map_at_1000_max
value: 35.8215
- type: nauc_map_at_1000_std
value: -42.1751
- type: nauc_map_at_1000_diff1
value: 76.4742
- type: nauc_recall_at_1_max
value: 26.8334
- type: nauc_recall_at_1_std
value: -39.3359
- type: nauc_recall_at_1_diff1
value: 80.2704
- type: nauc_recall_at_3_max
value: 28.5888
- type: nauc_recall_at_3_std
value: -49.1538
- type: nauc_recall_at_3_diff1
value: 72.9953
- type: nauc_recall_at_5_max
value: 29.9277
- type: nauc_recall_at_5_std
value: -54.6965
- type: nauc_recall_at_5_diff1
value: 72.3317
- type: nauc_recall_at_10_max
value: 31.9235
- type: nauc_recall_at_10_std
value: -56.9474
- type: nauc_recall_at_10_diff1
value: 72.3197
- type: nauc_recall_at_20_max
value: 35.3429
- type: nauc_recall_at_20_std
value: -52.6226
- type: nauc_recall_at_20_diff1
value: 72.5483
- type: nauc_recall_at_100_max
value: 35.1811
- type: nauc_recall_at_100_std
value: -36.578500000000005
- type: nauc_recall_at_100_diff1
value: 69.4611
- type: nauc_recall_at_1000_max
value: 7.5347
- type: nauc_recall_at_1000_std
value: 19.7823
- type: nauc_recall_at_1000_diff1
value: 52.217400000000005
- type: nauc_precision_at_1_max
value: 39.4296
- type: nauc_precision_at_1_std
value: -37.2067
- type: nauc_precision_at_1_diff1
value: 77.592
- type: nauc_precision_at_3_max
value: 11.0296
- type: nauc_precision_at_3_std
value: 4.4478
- type: nauc_precision_at_3_diff1
value: -16.0148
- type: nauc_precision_at_5_max
value: 5.6739999999999995
- type: nauc_precision_at_5_std
value: 14.811
- type: nauc_precision_at_5_diff1
value: -29.308400000000002
- type: nauc_precision_at_10_max
value: 1.5417999999999998
- type: nauc_precision_at_10_std
value: 24.002299999999998
- type: nauc_precision_at_10_diff1
value: -37.5572
- type: nauc_precision_at_20_max
value: -0.7968
- type: nauc_precision_at_20_std
value: 30.3741
- type: nauc_precision_at_20_diff1
value: -41.3475
- type: nauc_precision_at_100_max
value: -3.5911999999999997
- type: nauc_precision_at_100_std
value: 36.186099999999996
- type: nauc_precision_at_100_diff1
value: -43.8219
- type: nauc_precision_at_1000_max
value: -3.7081999999999997
- type: nauc_precision_at_1000_std
value: 37.4237
- type: nauc_precision_at_1000_diff1
value: -44.0968
- type: nauc_mrr_at_1_max
value: 39.556799999999996
- type: nauc_mrr_at_1_std
value: -37.2311
- type: nauc_mrr_at_1_diff1
value: 77.5559
- type: nauc_mrr_at_3_max
value: 39.1982
- type: nauc_mrr_at_3_std
value: -38.8782
- type: nauc_mrr_at_3_diff1
value: 76.4216
- type: nauc_mrr_at_5_max
value: 39.4401
- type: nauc_mrr_at_5_std
value: -39.0877
- type: nauc_mrr_at_5_diff1
value: 76.6241
- type: nauc_mrr_at_10_max
value: 39.4302
- type: nauc_mrr_at_10_std
value: -38.798500000000004
- type: nauc_mrr_at_10_diff1
value: 76.69930000000001
- type: nauc_mrr_at_20_max
value: 39.4583
- type: nauc_mrr_at_20_std
value: -38.6556
- type: nauc_mrr_at_20_diff1
value: 76.7297
- type: nauc_mrr_at_100_max
value: 39.434799999999996
- type: nauc_mrr_at_100_std
value: -38.647999999999996
- type: nauc_mrr_at_100_diff1
value: 76.7332
- type: nauc_mrr_at_1000_max
value: 39.433299999999996
- type: nauc_mrr_at_1000_std
value: -38.6493
- type: nauc_mrr_at_1000_diff1
value: 76.7332
- type: main_score
value: 87.774
- task:
type: Clustering
dataset:
name: MTEB RedditClustering (default)
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.1172
- type: v_measure_std
value: 4.1773
- type: main_score
value: 56.1172
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P (default)
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 65.1415
- type: v_measure_std
value: 12.804499999999999
- type: main_score
value: 65.1415
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS (default)
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: ndcg_at_1
value: 24.3
- type: ndcg_at_3
value: 20.275000000000002
- type: ndcg_at_5
value: 17.712
- type: ndcg_at_10
value: 21.317
- type: ndcg_at_20
value: 24.355
- type: ndcg_at_100
value: 30.115
- type: ndcg_at_1000
value: 35.803000000000004
- type: map_at_1
value: 4.928
- type: map_at_3
value: 9.15
- type: map_at_5
value: 10.914
- type: map_at_10
value: 12.808
- type: map_at_20
value: 13.968
- type: map_at_100
value: 15.113999999999999
- type: map_at_1000
value: 15.463
- type: recall_at_1
value: 4.928
- type: recall_at_3
value: 11.643
- type: recall_at_5
value: 15.873000000000001
- type: recall_at_10
value: 22.472
- type: recall_at_20
value: 29.612
- type: recall_at_100
value: 48.323
- type: recall_at_1000
value: 75.98
- type: precision_at_1
value: 24.3
- type: precision_at_3
value: 19.167
- type: precision_at_5
value: 15.659999999999998
- type: precision_at_10
value: 11.08
- type: precision_at_20
value: 7.3
- type: precision_at_100
value: 2.382
- type: precision_at_1000
value: 0.374
- type: mrr_at_1
value: 24.3
- type: mrr_at_3
value: 32.2
- type: mrr_at_5
value: 34.25
- type: mrr_at_10
value: 35.6749
- type: mrr_at_20
value: 36.326
- type: mrr_at_100
value: 36.804199999999994
- type: mrr_at_1000
value: 36.850899999999996
- type: nauc_ndcg_at_1_max
value: 24.935499999999998
- type: nauc_ndcg_at_1_std
value: 8.174299999999999
- type: nauc_ndcg_at_1_diff1
value: 20.6462
- type: nauc_ndcg_at_3_max
value: 26.8438
- type: nauc_ndcg_at_3_std
value: 12.5593
- type: nauc_ndcg_at_3_diff1
value: 16.1086
- type: nauc_ndcg_at_5_max
value: 28.050199999999997
- type: nauc_ndcg_at_5_std
value: 15.6557
- type: nauc_ndcg_at_5_diff1
value: 14.2624
- type: nauc_ndcg_at_10_max
value: 29.183999999999997
- type: nauc_ndcg_at_10_std
value: 18.8626
- type: nauc_ndcg_at_10_diff1
value: 12.1979
- type: nauc_ndcg_at_20_max
value: 30.128
- type: nauc_ndcg_at_20_std
value: 22.264400000000002
- type: nauc_ndcg_at_20_diff1
value: 12.0184
- type: nauc_ndcg_at_100_max
value: 31.5267
- type: nauc_ndcg_at_100_std
value: 27.067000000000004
- type: nauc_ndcg_at_100_diff1
value: 12.964500000000001
- type: nauc_ndcg_at_1000_max
value: 31.4219
- type: nauc_ndcg_at_1000_std
value: 26.9349
- type: nauc_ndcg_at_1000_diff1
value: 13.322600000000001
- type: nauc_map_at_1_max
value: 24.7756
- type: nauc_map_at_1_std
value: 8.1475
- type: nauc_map_at_1_diff1
value: 20.5305
- type: nauc_map_at_3_max
value: 27.1559
- type: nauc_map_at_3_std
value: 10.626199999999999
- type: nauc_map_at_3_diff1
value: 17.2136
- type: nauc_map_at_5_max
value: 28.149800000000003
- type: nauc_map_at_5_std
value: 13.549800000000001
- type: nauc_map_at_5_diff1
value: 14.9097
- type: nauc_map_at_10_max
value: 29.041299999999996
- type: nauc_map_at_10_std
value: 16.6128
- type: nauc_map_at_10_diff1
value: 13.232199999999999
- type: nauc_map_at_20_max
value: 29.8518
- type: nauc_map_at_20_std
value: 18.9557
- type: nauc_map_at_20_diff1
value: 13.1546
- type: nauc_map_at_100_max
value: 30.426399999999997
- type: nauc_map_at_100_std
value: 20.7314
- type: nauc_map_at_100_diff1
value: 13.3874
- type: nauc_map_at_1000_max
value: 30.4659
- type: nauc_map_at_1000_std
value: 20.938200000000002
- type: nauc_map_at_1000_diff1
value: 13.3881
- type: nauc_recall_at_1_max
value: 24.7756
- type: nauc_recall_at_1_std
value: 8.1475
- type: nauc_recall_at_1_diff1
value: 20.5305
- type: nauc_recall_at_3_max
value: 27.020300000000002
- type: nauc_recall_at_3_std
value: 13.8467
- type: nauc_recall_at_3_diff1
value: 13.849400000000001
- type: nauc_recall_at_5_max
value: 27.685
- type: nauc_recall_at_5_std
value: 18.287300000000002
- type: nauc_recall_at_5_diff1
value: 10.401
- type: nauc_recall_at_10_max
value: 28.5451
- type: nauc_recall_at_10_std
value: 23.0846
- type: nauc_recall_at_10_diff1
value: 6.751500000000001
- type: nauc_recall_at_20_max
value: 28.4084
- type: nauc_recall_at_20_std
value: 28.245700000000003
- type: nauc_recall_at_20_diff1
value: 6.3271
- type: nauc_recall_at_100_max
value: 28.331200000000003
- type: nauc_recall_at_100_std
value: 37.9775
- type: nauc_recall_at_100_diff1
value: 7.408399999999999
- type: nauc_recall_at_1000_max
value: 24.3488
- type: nauc_recall_at_1000_std
value: 38.596799999999995
- type: nauc_recall_at_1000_diff1
value: 6.427099999999999
- type: nauc_precision_at_1_max
value: 24.935499999999998
- type: nauc_precision_at_1_std
value: 8.174299999999999
- type: nauc_precision_at_1_diff1
value: 20.6462
- type: nauc_precision_at_3_max
value: 27.107300000000002
- type: nauc_precision_at_3_std
value: 13.9846
- type: nauc_precision_at_3_diff1
value: 14.025199999999998
- type: nauc_precision_at_5_max
value: 27.940199999999997
- type: nauc_precision_at_5_std
value: 18.523500000000002
- type: nauc_precision_at_5_diff1
value: 10.6452
- type: nauc_precision_at_10_max
value: 28.9679
- type: nauc_precision_at_10_std
value: 23.2788
- type: nauc_precision_at_10_diff1
value: 7.0396
- type: nauc_precision_at_20_max
value: 28.799200000000003
- type: nauc_precision_at_20_std
value: 28.2269
- type: nauc_precision_at_20_diff1
value: 6.6255999999999995
- type: nauc_precision_at_100_max
value: 28.6629
- type: nauc_precision_at_100_std
value: 37.5551
- type: nauc_precision_at_100_diff1
value: 7.858999999999999
- type: nauc_precision_at_1000_max
value: 25.0545
- type: nauc_precision_at_1000_std
value: 37.301899999999996
- type: nauc_precision_at_1000_diff1
value: 7.5589
- type: nauc_mrr_at_1_max
value: 24.935499999999998
- type: nauc_mrr_at_1_std
value: 8.174299999999999
- type: nauc_mrr_at_1_diff1
value: 20.6462
- type: nauc_mrr_at_3_max
value: 26.037
- type: nauc_mrr_at_3_std
value: 13.3379
- type: nauc_mrr_at_3_diff1
value: 16.713
- type: nauc_mrr_at_5_max
value: 26.512200000000004
- type: nauc_mrr_at_5_std
value: 14.1804
- type: nauc_mrr_at_5_diff1
value: 17.1186
- type: nauc_mrr_at_10_max
value: 26.7938
- type: nauc_mrr_at_10_std
value: 14.458699999999999
- type: nauc_mrr_at_10_diff1
value: 16.531299999999998
- type: nauc_mrr_at_20_max
value: 26.628
- type: nauc_mrr_at_20_std
value: 14.593200000000001
- type: nauc_mrr_at_20_diff1
value: 16.4492
- type: nauc_mrr_at_100_max
value: 26.6627
- type: nauc_mrr_at_100_std
value: 14.5648
- type: nauc_mrr_at_100_diff1
value: 16.614
- type: nauc_mrr_at_1000_max
value: 26.6506
- type: nauc_mrr_at_1000_std
value: 14.5124
- type: nauc_mrr_at_1000_diff1
value: 16.6315
- type: main_score
value: 21.317
- task:
type: STS
dataset:
name: MTEB SICK-R (default)
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: pearson
value: 85.2236
- type: spearman
value: 79.55109999999999
- type: cosine_pearson
value: 85.2236
- type: cosine_spearman
value: 79.55109999999999
- type: manhattan_pearson
value: 82.441
- type: manhattan_spearman
value: 79.65100000000001
- type: euclidean_pearson
value: 82.461
- type: euclidean_spearman
value: 79.58460000000001
- type: main_score
value: 79.55109999999999
- task:
type: STS
dataset:
name: MTEB STS12 (default)
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: pearson
value: 82.6754
- type: spearman
value: 74.3345
- type: cosine_pearson
value: 82.6754
- type: cosine_spearman
value: 74.3336
- type: manhattan_pearson
value: 78.64330000000001
- type: manhattan_spearman
value: 74.23769999999999
- type: euclidean_pearson
value: 78.589
- type: euclidean_spearman
value: 74.1178
- type: main_score
value: 74.3336
- task:
type: STS
dataset:
name: MTEB STS13 (default)
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: pearson
value: 82.4438
- type: spearman
value: 83.5213
- type: cosine_pearson
value: 82.4438
- type: cosine_spearman
value: 83.5213
- type: manhattan_pearson
value: 83.1769
- type: manhattan_spearman
value: 83.29039999999999
- type: euclidean_pearson
value: 83.0053
- type: euclidean_spearman
value: 83.1047
- type: main_score
value: 83.5213
- task:
type: STS
dataset:
name: MTEB STS14 (default)
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: pearson
value: 82.8843
- type: spearman
value: 80.6162
- type: cosine_pearson
value: 82.8843
- type: cosine_spearman
value: 80.6161
- type: manhattan_pearson
value: 82.446
- type: manhattan_spearman
value: 80.5926
- type: euclidean_pearson
value: 82.33840000000001
- type: euclidean_spearman
value: 80.4619
- type: main_score
value: 80.6161
- task:
type: STS
dataset:
name: MTEB STS15 (default)
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: pearson
value: 85.7467
- type: spearman
value: 86.85690000000001
- type: cosine_pearson
value: 85.7467
- type: cosine_spearman
value: 86.85690000000001
- type: manhattan_pearson
value: 86.469
- type: manhattan_spearman
value: 86.751
- type: euclidean_pearson
value: 86.4531
- type: euclidean_spearman
value: 86.7053
- type: main_score
value: 86.85690000000001
- task:
type: STS
dataset:
name: MTEB STS16 (default)
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: pearson
value: 84.143
- type: spearman
value: 85.4366
- type: cosine_pearson
value: 84.143
- type: cosine_spearman
value: 85.4367
- type: manhattan_pearson
value: 84.6762
- type: manhattan_spearman
value: 85.1846
- type: euclidean_pearson
value: 84.6233
- type: euclidean_spearman
value: 85.1252
- type: main_score
value: 85.4367
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: pearson
value: 89.10820000000001
- type: spearman
value: 89.7621
- type: cosine_pearson
value: 89.10820000000001
- type: cosine_spearman
value: 89.7621
- type: manhattan_pearson
value: 89.3624
- type: manhattan_spearman
value: 89.6515
- type: euclidean_pearson
value: 89.3729
- type: euclidean_spearman
value: 89.6836
- type: main_score
value: 89.7621
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: pearson
value: 68.44019999999999
- type: spearman
value: 68.3308
- type: cosine_pearson
value: 68.4403
- type: cosine_spearman
value: 68.3308
- type: manhattan_pearson
value: 69.8199
- type: manhattan_spearman
value: 68.754
- type: euclidean_pearson
value: 69.5629
- type: euclidean_spearman
value: 68.55630000000001
- type: main_score
value: 68.3308
- task:
type: STS
dataset:
name: MTEB STSBenchmark (default)
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: pearson
value: 86.0748
- type: spearman
value: 86.2486
- type: cosine_pearson
value: 86.0748
- type: cosine_spearman
value: 86.2486
- type: manhattan_pearson
value: 86.2053
- type: manhattan_spearman
value: 86.0544
- type: euclidean_pearson
value: 86.1504
- type: euclidean_spearman
value: 85.985
- type: main_score
value: 86.2486
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR (default)
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 84.4152
- type: mrr
value: 95.4755
- type: nAUC_map_max
value: 55.565799999999996
- type: nAUC_map_std
value: 71.4307
- type: nAUC_map_diff1
value: 1.0619
- type: nAUC_mrr_max
value: 87.11959999999999
- type: nAUC_mrr_std
value: 82.7146
- type: nAUC_mrr_diff1
value: 44.2384
- type: main_score
value: 84.4152
- task:
type: Retrieval
dataset:
name: MTEB SciFact (default)
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: ndcg_at_1
value: 62.333000000000006
- type: ndcg_at_3
value: 67.297
- type: ndcg_at_5
value: 69.24900000000001
- type: ndcg_at_10
value: 72.63199999999999
- type: ndcg_at_20
value: 73.6
- type: ndcg_at_100
value: 74.888
- type: ndcg_at_1000
value: 75.452
- type: map_at_1
value: 59.733000000000004
- type: map_at_3
value: 65.194
- type: map_at_5
value: 66.616
- type: map_at_10
value: 68.158
- type: map_at_20
value: 68.464
- type: map_at_100
value: 68.685
- type: map_at_1000
value: 68.704
- type: recall_at_1
value: 59.733000000000004
- type: recall_at_3
value: 70.61699999999999
- type: recall_at_5
value: 75.5
- type: recall_at_10
value: 85.51100000000001
- type: recall_at_20
value: 89.133
- type: recall_at_100
value: 95.5
- type: recall_at_1000
value: 100
- type: precision_at_1
value: 62.333000000000006
- type: precision_at_3
value: 25.667
- type: precision_at_5
value: 16.8
- type: precision_at_10
value: 9.633
- type: precision_at_20
value: 5.050000000000001
- type: precision_at_100
value: 1.083
- type: precision_at_1000
value: 0.11299999999999999
- type: mrr_at_1
value: 62.3333
- type: mrr_at_3
value: 67.11110000000001
- type: mrr_at_5
value: 68.0111
- type: mrr_at_10
value: 69.243
- type: mrr_at_20
value: 69.4702
- type: mrr_at_100
value: 69.661
- type: mrr_at_1000
value: 69.6798
- type: nauc_ndcg_at_1_max
value: 54.49850000000001
- type: nauc_ndcg_at_1_std
value: 7.1119
- type: nauc_ndcg_at_1_diff1
value: 66.1422
- type: nauc_ndcg_at_3_max
value: 56.4064
- type: nauc_ndcg_at_3_std
value: 5.8438
- type: nauc_ndcg_at_3_diff1
value: 63.8497
- type: nauc_ndcg_at_5_max
value: 57.1304
- type: nauc_ndcg_at_5_std
value: 7.220600000000001
- type: nauc_ndcg_at_5_diff1
value: 63.31250000000001
- type: nauc_ndcg_at_10_max
value: 55.722300000000004
- type: nauc_ndcg_at_10_std
value: 9.1649
- type: nauc_ndcg_at_10_diff1
value: 60.6626
- type: nauc_ndcg_at_20_max
value: 55.945299999999996
- type: nauc_ndcg_at_20_std
value: 9.843200000000001
- type: nauc_ndcg_at_20_diff1
value: 60.113099999999996
- type: nauc_ndcg_at_100_max
value: 56.1439
- type: nauc_ndcg_at_100_std
value: 10.7972
- type: nauc_ndcg_at_100_diff1
value: 60.854200000000006
- type: nauc_ndcg_at_1000_max
value: 56.2434
- type: nauc_ndcg_at_1000_std
value: 9.697899999999999
- type: nauc_ndcg_at_1000_diff1
value: 61.719
- type: nauc_map_at_1_max
value: 50.9749
- type: nauc_map_at_1_std
value: 3.6609999999999996
- type: nauc_map_at_1_diff1
value: 67.3719
- type: nauc_map_at_3_max
value: 54.31009999999999
- type: nauc_map_at_3_std
value: 4.3235
- type: nauc_map_at_3_diff1
value: 64.8984
- type: nauc_map_at_5_max
value: 55.328599999999994
- type: nauc_map_at_5_std
value: 5.959099999999999
- type: nauc_map_at_5_diff1
value: 64.0263
- type: nauc_map_at_10_max
value: 55.1577
- type: nauc_map_at_10_std
value: 7.147399999999999
- type: nauc_map_at_10_diff1
value: 62.943000000000005
- type: nauc_map_at_20_max
value: 55.367900000000006
- type: nauc_map_at_20_std
value: 7.5756000000000006
- type: nauc_map_at_20_diff1
value: 62.7916
- type: nauc_map_at_100_max
value: 55.3427
- type: nauc_map_at_100_std
value: 7.645499999999999
- type: nauc_map_at_100_diff1
value: 62.89940000000001
- type: nauc_map_at_1000_max
value: 55.35020000000001
- type: nauc_map_at_1000_std
value: 7.6221
- type: nauc_map_at_1000_diff1
value: 62.930299999999995
- type: nauc_recall_at_1_max
value: 50.9749
- type: nauc_recall_at_1_std
value: 3.6609999999999996
- type: nauc_recall_at_1_diff1
value: 67.3719
- type: nauc_recall_at_3_max
value: 56.184
- type: nauc_recall_at_3_std
value: 1.8581
- type: nauc_recall_at_3_diff1
value: 63.3395
- type: nauc_recall_at_5_max
value: 59.3021
- type: nauc_recall_at_5_std
value: 6.4342
- type: nauc_recall_at_5_diff1
value: 61.8267
- type: nauc_recall_at_10_max
value: 51.8488
- type: nauc_recall_at_10_std
value: 13.397
- type: nauc_recall_at_10_diff1
value: 47.7098
- type: nauc_recall_at_20_max
value: 51.607499999999995
- type: nauc_recall_at_20_std
value: 17.8583
- type: nauc_recall_at_20_diff1
value: 40.2701
- type: nauc_recall_at_100_max
value: 54.1844
- type: nauc_recall_at_100_std
value: 53.411500000000004
- type: nauc_recall_at_100_diff1
value: 31.0043
- type: nauc_recall_at_1000_max
- type: nauc_recall_at_1000_std
- type: nauc_recall_at_1000_diff1
- type: nauc_precision_at_1_max
value: 54.49850000000001
- type: nauc_precision_at_1_std
value: 7.1119
- type: nauc_precision_at_1_diff1
value: 66.1422
- type: nauc_precision_at_3_max
value: 52.115
- type: nauc_precision_at_3_std
value: 16.1809
- type: nauc_precision_at_3_diff1
value: 41.6736
- type: nauc_precision_at_5_max
value: 46.3365
- type: nauc_precision_at_5_std
value: 22.7022
- type: nauc_precision_at_5_diff1
value: 25.564500000000002
- type: nauc_precision_at_10_max
value: 31.7504
- type: nauc_precision_at_10_std
value: 31.063499999999998
- type: nauc_precision_at_10_diff1
value: 0.61
- type: nauc_precision_at_20_max
value: 27.0162
- type: nauc_precision_at_20_std
value: 35.5844
- type: nauc_precision_at_20_diff1
value: -8.559899999999999
- type: nauc_precision_at_100_max
value: 17.9369
- type: nauc_precision_at_100_std
value: 45.360299999999995
- type: nauc_precision_at_100_diff1
value: -21.3734
- type: nauc_precision_at_1000_max
value: 9.6015
- type: nauc_precision_at_1000_std
value: 41.6207
- type: nauc_precision_at_1000_diff1
value: -31.4964
- type: nauc_mrr_at_1_max
value: 54.49850000000001
- type: nauc_mrr_at_1_std
value: 7.1119
- type: nauc_mrr_at_1_diff1
value: 66.1422
- type: nauc_mrr_at_3_max
value: 57.52589999999999
- type: nauc_mrr_at_3_std
value: 8.605400000000001
- type: nauc_mrr_at_3_diff1
value: 63.4207
- type: nauc_mrr_at_5_max
value: 57.809900000000006
- type: nauc_mrr_at_5_std
value: 9.2631
- type: nauc_mrr_at_5_diff1
value: 63.4016
- type: nauc_mrr_at_10_max
value: 57.02199999999999
- type: nauc_mrr_at_10_std
value: 9.542100000000001
- type: nauc_mrr_at_10_diff1
value: 62.527
- type: nauc_mrr_at_20_max
value: 56.942800000000005
- type: nauc_mrr_at_20_std
value: 9.3838
- type: nauc_mrr_at_20_diff1
value: 62.3991
- type: nauc_mrr_at_100_max
value: 56.9339
- type: nauc_mrr_at_100_std
value: 9.4351
- type: nauc_mrr_at_100_diff1
value: 62.5023
- type: nauc_mrr_at_1000_max
value: 56.943200000000004
- type: nauc_mrr_at_1000_std
value: 9.4141
- type: nauc_mrr_at_1000_diff1
value: 62.5335
- type: main_score
value: 72.63199999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions (default)
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: similarity_accuracy
value: 99.801
- type: similarity_accuracy_threshold
value: 79.4678
- type: similarity_f1
value: 89.64450000000001
- type: similarity_f1_threshold
value: 79.4678
- type: similarity_precision
value: 92.4548
- type: similarity_recall
value: 87
- type: similarity_ap
value: 95.45320000000001
- type: cosine_accuracy
value: 99.801
- type: cosine_accuracy_threshold
value: 79.4678
- type: cosine_f1
value: 89.64450000000001
- type: cosine_f1_threshold
value: 79.4678
- type: cosine_precision
value: 92.4548
- type: cosine_recall
value: 87
- type: cosine_ap
value: 95.45320000000001
- type: manhattan_accuracy
value: 99.798
- type: manhattan_accuracy_threshold
value: 23357.8888
- type: manhattan_f1
value: 89.4737
- type: manhattan_f1_threshold
value: 23387.7502
- type: manhattan_precision
value: 92.4307
- type: manhattan_recall
value: 86.7
- type: manhattan_ap
value: 95.4849
- type: euclidean_accuracy
value: 99.795
- type: euclidean_accuracy_threshold
value: 1058.165
- type: euclidean_f1
value: 89.35300000000001
- type: euclidean_f1_threshold
value: 1085.2129
- type: euclidean_precision
value: 91.0696
- type: euclidean_recall
value: 87.7
- type: euclidean_ap
value: 95.4203
- type: dot_accuracy
value: 99.79599999999999
- type: dot_accuracy_threshold
value: 22427.3651
- type: dot_f1
value: 89.6311
- type: dot_f1_threshold
value: 21924.3118
- type: dot_precision
value: 89.3638
- type: dot_recall
value: 89.9
- type: dot_ap
value: 95.2676
- type: max_accuracy
value: 99.801
- type: max_f1
value: 89.64450000000001
- type: max_precision
value: 92.4548
- type: max_recall
value: 89.9
- type: max_ap
value: 95.4849
- type: main_score
value: 95.4849
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering (default)
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 66.74849999999999
- type: v_measure_std
value: 5.3791
- type: main_score
value: 66.74849999999999
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P (default)
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.4297
- type: v_measure_std
value: 1.4817
- type: main_score
value: 36.4297
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions (default)
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.791000000000004
- type: mrr
value: 50.559200000000004
- type: nAUC_map_max
value: 12.2774
- type: nAUC_map_std
value: 8.179599999999999
- type: nAUC_map_diff1
value: 34.4755
- type: nAUC_mrr_max
value: 12.5622
- type: nAUC_mrr_std
value: 8.7019
- type: nAUC_mrr_diff1
value: 34.4394
- type: main_score
value: 49.791000000000004
- task:
type: Summarization
dataset:
name: MTEB SummEval (default)
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: pearson
value: 29.8817
- type: spearman
value: 30.4779
- type: cosine_spearman
value: 30.4779
- type: cosine_pearson
value: 29.8817
- type: dot_spearman
value: 29.801499999999997
- type: dot_pearson
value: 30.2519
- type: main_score
value: 30.4779
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID (default)
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: ndcg_at_1
value: 83
- type: ndcg_at_3
value: 79.061
- type: ndcg_at_5
value: 74.477
- type: ndcg_at_10
value: 74.02499999999999
- type: ndcg_at_20
value: 71.031
- type: ndcg_at_100
value: 56.143
- type: ndcg_at_1000
value: 52.11
- type: map_at_1
value: 0.243
- type: map_at_3
value: 0.642
- type: map_at_5
value: 0.997
- type: map_at_10
value: 1.848
- type: map_at_20
value: 3.354
- type: map_at_100
value: 10.297
- type: map_at_1000
value: 25.901999999999997
- type: recall_at_1
value: 0.243
- type: recall_at_3
value: 0.668
- type: recall_at_5
value: 1.0630000000000002
- type: recall_at_10
value: 2.09
- type: recall_at_20
value: 3.92
- type: recall_at_100
value: 13.83
- type: recall_at_1000
value: 49.292
- type: precision_at_1
value: 88
- type: precision_at_3
value: 81.333
- type: precision_at_5
value: 76.4
- type: precision_at_10
value: 77.60000000000001
- type: precision_at_20
value: 74
- type: precision_at_100
value: 57.04
- type: precision_at_1000
value: 22.962
- type: mrr_at_1
value: 86
- type: mrr_at_3
value: 92
- type: mrr_at_5
value: 92.4
- type: mrr_at_10
value: 92.4
- type: mrr_at_20
value: 92.4
- type: mrr_at_100
value: 92.4
- type: mrr_at_1000
value: 92.4
- type: nauc_ndcg_at_1_max
value: 61.9438
- type: nauc_ndcg_at_1_std
value: 47.8895
- type: nauc_ndcg_at_1_diff1
value: -24.637500000000003
- type: nauc_ndcg_at_3_max
value: 61.8429
- type: nauc_ndcg_at_3_std
value: 54.91629999999999
- type: nauc_ndcg_at_3_diff1
value: -20.4604
- type: nauc_ndcg_at_5_max
value: 57.8672
- type: nauc_ndcg_at_5_std
value: 54.7293
- type: nauc_ndcg_at_5_diff1
value: -16.473599999999998
- type: nauc_ndcg_at_10_max
value: 53.545
- type: nauc_ndcg_at_10_std
value: 56.166799999999995
- type: nauc_ndcg_at_10_diff1
value: -10.5161
- type: nauc_ndcg_at_20_max
value: 57.28529999999999
- type: nauc_ndcg_at_20_std
value: 67.5691
- type: nauc_ndcg_at_20_diff1
value: -9.9578
- type: nauc_ndcg_at_100_max
value: 55.752100000000006
- type: nauc_ndcg_at_100_std
value: 79.8469
- type: nauc_ndcg_at_100_diff1
value: -6.660099999999999
- type: nauc_ndcg_at_1000_max
value: 59.807900000000004
- type: nauc_ndcg_at_1000_std
value: 72.4075
- type: nauc_ndcg_at_1000_diff1
value: -7.3533
- type: nauc_map_at_1_max
value: 6.4536999999999995
- type: nauc_map_at_1_std
value: -7.8119
- type: nauc_map_at_1_diff1
value: -14.0471
- type: nauc_map_at_3_max
value: 16.98
- type: nauc_map_at_3_std
value: -0.3721
- type: nauc_map_at_3_diff1
value: -15.7142
- type: nauc_map_at_5_max
value: 19.6888
- type: nauc_map_at_5_std
value: 1.4467
- type: nauc_map_at_5_diff1
value: -16.999200000000002
- type: nauc_map_at_10_max
value: 21.453400000000002
- type: nauc_map_at_10_std
value: 4.1453
- type: nauc_map_at_10_diff1
value: -13.9404
- type: nauc_map_at_20_max
value: 23.8514
- type: nauc_map_at_20_std
value: 11.505899999999999
- type: nauc_map_at_20_diff1
value: -10.5448
- type: nauc_map_at_100_max
value: 46.5883
- type: nauc_map_at_100_std
value: 57.91159999999999
- type: nauc_map_at_100_diff1
value: -8.8815
- type: nauc_map_at_1000_max
value: 63.9415
- type: nauc_map_at_1000_std
value: 79.9525
- type: nauc_map_at_1000_diff1
value: 2.9305000000000003
- type: nauc_recall_at_1_max
value: 6.4536999999999995
- type: nauc_recall_at_1_std
value: -7.8119
- type: nauc_recall_at_1_diff1
value: -14.0471
- type: nauc_recall_at_3_max
value: 13.3248
- type: nauc_recall_at_3_std
value: -3.4745999999999997
- type: nauc_recall_at_3_diff1
value: -16.9174
- type: nauc_recall_at_5_max
value: 14.6892
- type: nauc_recall_at_5_std
value: -2.0025999999999997
- type: nauc_recall_at_5_diff1
value: -17.622799999999998
- type: nauc_recall_at_10_max
value: 12.6493
- type: nauc_recall_at_10_std
value: -3.3624
- type: nauc_recall_at_10_diff1
value: -14.583599999999999
- type: nauc_recall_at_20_max
value: 12.4179
- type: nauc_recall_at_20_std
value: 2.6304000000000003
- type: nauc_recall_at_20_diff1
value: -12.0154
- type: nauc_recall_at_100_max
value: 33.3924
- type: nauc_recall_at_100_std
value: 41.6643
- type: nauc_recall_at_100_diff1
value: -13.6719
- type: nauc_recall_at_1000_max
value: 54.8435
- type: nauc_recall_at_1000_std
value: 59.816199999999995
- type: nauc_recall_at_1000_diff1
value: -2.3768000000000002
- type: nauc_precision_at_1_max
value: 83.2167
- type: nauc_precision_at_1_std
value: 71.8899
- type: nauc_precision_at_1_diff1
value: -18.970699999999997
- type: nauc_precision_at_3_max
value: 70.7754
- type: nauc_precision_at_3_std
value: 60.5541
- type: nauc_precision_at_3_diff1
value: -16.8234
- type: nauc_precision_at_5_max
value: 64.384
- type: nauc_precision_at_5_std
value: 54.879999999999995
- type: nauc_precision_at_5_diff1
value: -12.5072
- type: nauc_precision_at_10_max
value: 60.5951
- type: nauc_precision_at_10_std
value: 57.330000000000005
- type: nauc_precision_at_10_diff1
value: -4.029400000000001
- type: nauc_precision_at_20_max
value: 61.1634
- type: nauc_precision_at_20_std
value: 69.7819
- type: nauc_precision_at_20_diff1
value: -6.6238
- type: nauc_precision_at_100_max
value: 57.61619999999999
- type: nauc_precision_at_100_std
value: 82.3103
- type: nauc_precision_at_100_diff1
value: 0.8824000000000001
- type: nauc_precision_at_1000_max
value: 48.0414
- type: nauc_precision_at_1000_std
value: 54.315599999999996
- type: nauc_precision_at_1000_diff1
value: 8.9054
- type: nauc_mrr_at_1_max
value: 55.3901
- type: nauc_mrr_at_1_std
value: 45.5245
- type: nauc_mrr_at_1_diff1
value: -1.6835
- type: nauc_mrr_at_3_max
value: 61.1547
- type: nauc_mrr_at_3_std
value: 52.5639
- type: nauc_mrr_at_3_diff1
value: -23.9503
- type: nauc_mrr_at_5_max
value: 59.0374
- type: nauc_mrr_at_5_std
value: 49.9784
- type: nauc_mrr_at_5_diff1
value: -15.771799999999999
- type: nauc_mrr_at_10_max
value: 59.0374
- type: nauc_mrr_at_10_std
value: 49.9784
- type: nauc_mrr_at_10_diff1
value: -15.771799999999999
- type: nauc_mrr_at_20_max
value: 59.0374
- type: nauc_mrr_at_20_std
value: 49.9784
- type: nauc_mrr_at_20_diff1
value: -15.771799999999999
- type: nauc_mrr_at_100_max
value: 59.0374
- type: nauc_mrr_at_100_std
value: 49.9784
- type: nauc_mrr_at_100_diff1
value: -15.771799999999999
- type: nauc_mrr_at_1000_max
value: 59.0374
- type: nauc_mrr_at_1000_std
value: 49.9784
- type: nauc_mrr_at_1000_diff1
value: -15.771799999999999
- type: main_score
value: 74.02499999999999
- task:
type: Retrieval
dataset:
name: MTEB Touche2020 (default)
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: ndcg_at_1
value: 26.531
- type: ndcg_at_3
value: 25.533
- type: ndcg_at_5
value: 25.846999999999998
- type: ndcg_at_10
value: 23.86
- type: ndcg_at_20
value: 25.685999999999996
- type: ndcg_at_100
value: 35.339999999999996
- type: ndcg_at_1000
value: 46.949999999999996
- type: map_at_1
value: 2.253
- type: map_at_3
value: 4.737
- type: map_at_5
value: 6.550000000000001
- type: map_at_10
value: 9.114
- type: map_at_20
value: 11.928999999999998
- type: map_at_100
value: 15.082
- type: map_at_1000
value: 16.567
- type: recall_at_1
value: 2.253
- type: recall_at_3
value: 6.067
- type: recall_at_5
value: 9.985
- type: recall_at_10
value: 15.595
- type: recall_at_20
value: 24.709
- type: recall_at_100
value: 46.075
- type: recall_at_1000
value: 81.211
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 27.346999999999998
- type: precision_at_10
value: 21.633
- type: precision_at_20
value: 17.347
- type: precision_at_100
value: 7.306
- type: precision_at_1000
value: 1.498
- type: mrr_at_1
value: 30.6122
- type: mrr_at_3
value: 41.8367
- type: mrr_at_5
value: 45.6122
- type: mrr_at_10
value: 46.827000000000005
- type: mrr_at_20
value: 47.652699999999996
- type: mrr_at_100
value: 47.9184
- type: mrr_at_1000
value: 47.9184
- type: nauc_ndcg_at_1_max
value: -39.9725
- type: nauc_ndcg_at_1_std
value: -16.2998
- type: nauc_ndcg_at_1_diff1
value: 10.729700000000001
- type: nauc_ndcg_at_3_max
value: -32.3198
- type: nauc_ndcg_at_3_std
value: -8.7066
- type: nauc_ndcg_at_3_diff1
value: 17.6297
- type: nauc_ndcg_at_5_max
value: -32.069900000000004
- type: nauc_ndcg_at_5_std
value: -0.3237
- type: nauc_ndcg_at_5_diff1
value: 6.7525
- type: nauc_ndcg_at_10_max
value: -32.9347
- type: nauc_ndcg_at_10_std
value: 0.506
- type: nauc_ndcg_at_10_diff1
value: 5.428999999999999
- type: nauc_ndcg_at_20_max
value: -30.7678
- type: nauc_ndcg_at_20_std
value: 0.2792
- type: nauc_ndcg_at_20_diff1
value: 8.7515
- type: nauc_ndcg_at_100_max
value: -30.291800000000002
- type: nauc_ndcg_at_100_std
value: 24.8031
- type: nauc_ndcg_at_100_diff1
value: 3.9330999999999996
- type: nauc_ndcg_at_1000_max
value: -27.448299999999996
- type: nauc_ndcg_at_1000_std
value: 35.2315
- type: nauc_ndcg_at_1000_diff1
value: 0.15059999999999998
- type: nauc_map_at_1_max
value: -41.288799999999995
- type: nauc_map_at_1_std
value: -24.5046
- type: nauc_map_at_1_diff1
value: 7.072100000000001
- type: nauc_map_at_3_max
value: -28.862199999999998
- type: nauc_map_at_3_std
value: -18.990299999999998
- type: nauc_map_at_3_diff1
value: 15.764700000000001
- type: nauc_map_at_5_max
value: -27.409699999999997
- type: nauc_map_at_5_std
value: -15.2501
- type: nauc_map_at_5_diff1
value: 8.8044
- type: nauc_map_at_10_max
value: -26.222099999999998
- type: nauc_map_at_10_std
value: -11.4798
- type: nauc_map_at_10_diff1
value: 7.2714
- type: nauc_map_at_20_max
value: -23.0414
- type: nauc_map_at_20_std
value: -7.985
- type: nauc_map_at_20_diff1
value: 7.704
- type: nauc_map_at_100_max
value: -21.3902
- type: nauc_map_at_100_std
value: 4.8129
- type: nauc_map_at_100_diff1
value: 6.401700000000001
- type: nauc_map_at_1000_max
value: -21.4197
- type: nauc_map_at_1000_std
value: 8.5824
- type: nauc_map_at_1000_diff1
value: 5.328
- type: nauc_recall_at_1_max
value: -41.288799999999995
- type: nauc_recall_at_1_std
value: -24.5046
- type: nauc_recall_at_1_diff1
value: 7.072100000000001
- type: nauc_recall_at_3_max
value: -24.351300000000002
- type: nauc_recall_at_3_std
value: -13.661000000000001
- type: nauc_recall_at_3_diff1
value: 14.1204
- type: nauc_recall_at_5_max
value: -22.767799999999998
- type: nauc_recall_at_5_std
value: -7.4171000000000005
- type: nauc_recall_at_5_diff1
value: 1.9924999999999997
- type: nauc_recall_at_10_max
value: -25.3874
- type: nauc_recall_at_10_std
value: -3.9967
- type: nauc_recall_at_10_diff1
value: 3.4776000000000002
- type: nauc_recall_at_20_max
value: -25.051099999999998
- type: nauc_recall_at_20_std
value: -2.0329
- type: nauc_recall_at_20_diff1
value: 2.2399
- type: nauc_recall_at_100_max
value: -20.6196
- type: nauc_recall_at_100_std
value: 39.644200000000005
- type: nauc_recall_at_100_diff1
value: -6.7455
- type: nauc_recall_at_1000_max
value: -6.2200999999999995
- type: nauc_recall_at_1000_std
value: 78.9064
- type: nauc_recall_at_1000_diff1
value: -23.044700000000002
- type: nauc_precision_at_1_max
value: -39.8407
- type: nauc_precision_at_1_std
value: -16.3352
- type: nauc_precision_at_1_diff1
value: 12.1075
- type: nauc_precision_at_3_max
value: -30.505900000000004
- type: nauc_precision_at_3_std
value: -6.6981
- type: nauc_precision_at_3_diff1
value: 22.1572
- type: nauc_precision_at_5_max
value: -26.9752
- type: nauc_precision_at_5_std
value: 9.2292
- type: nauc_precision_at_5_diff1
value: 6.7962
- type: nauc_precision_at_10_max
value: -29.9346
- type: nauc_precision_at_10_std
value: 13.3568
- type: nauc_precision_at_10_diff1
value: 6.8902
- type: nauc_precision_at_20_max
value: -22.7968
- type: nauc_precision_at_20_std
value: 21.0382
- type: nauc_precision_at_20_diff1
value: 9.033199999999999
- type: nauc_precision_at_100_max
value: -11.4519
- type: nauc_precision_at_100_std
value: 72.8881
- type: nauc_precision_at_100_diff1
value: -8.261000000000001
- type: nauc_precision_at_1000_max
value: 29.3926
- type: nauc_precision_at_1000_std
value: 44.936
- type: nauc_precision_at_1000_diff1
value: -15.2011
- type: nauc_mrr_at_1_max
value: -39.8407
- type: nauc_mrr_at_1_std
value: -16.3352
- type: nauc_mrr_at_1_diff1
value: 12.1075
- type: nauc_mrr_at_3_max
value: -37.689
- type: nauc_mrr_at_3_std
value: -8.757
- type: nauc_mrr_at_3_diff1
value: 6.916300000000001
- type: nauc_mrr_at_5_max
value: -36.2749
- type: nauc_mrr_at_5_std
value: -5.7966
- type: nauc_mrr_at_5_diff1
value: 4.8726
- type: nauc_mrr_at_10_max
value: -39.0726
- type: nauc_mrr_at_10_std
value: -6.830799999999999
- type: nauc_mrr_at_10_diff1
value: 5.1214
- type: nauc_mrr_at_20_max
value: -38.6519
- type: nauc_mrr_at_20_std
value: -8.6379
- type: nauc_mrr_at_20_diff1
value: 6.436699999999999
- type: nauc_mrr_at_100_max
value: -38.065599999999996
- type: nauc_mrr_at_100_std
value: -8.444
- type: nauc_mrr_at_100_diff1
value: 6.2007
- type: nauc_mrr_at_1000_max
value: -38.065599999999996
- type: nauc_mrr_at_1000_std
value: -8.444
- type: nauc_mrr_at_1000_diff1
value: 6.2007
- type: main_score
value: 23.86
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification (default)
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 78.21289999999999
- type: f1
value: 60.9322
- type: f1_weighted
value: 82.69539999999999
- type: ap
value: 19.0474
- type: ap_weighted
value: 19.0474
- type: main_score
value: 78.21289999999999
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification (default)
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.3865
- type: f1
value: 61.8066
- type: f1_weighted
value: 60.887
- type: main_score
value: 61.3865
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering (default)
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.3006
- type: v_measure_std
value: 0.8814000000000001
- type: main_score
value: 49.3006
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015 (default)
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: similarity_accuracy
value: 85.96289999999999
- type: similarity_accuracy_threshold
value: 81.7629
- type: similarity_f1
value: 67.1044
- type: similarity_f1_threshold
value: 77.98479999999999
- type: similarity_precision
value: 64.3497
- type: similarity_recall
value: 70.10549999999999
- type: similarity_ap
value: 73.406
- type: cosine_accuracy
value: 85.96289999999999
- type: cosine_accuracy_threshold
value: 81.7629
- type: cosine_f1
value: 67.1044
- type: cosine_f1_threshold
value: 77.98479999999999
- type: cosine_precision
value: 64.3497
- type: cosine_recall
value: 70.10549999999999
- type: cosine_ap
value: 73.406
- type: manhattan_accuracy
value: 85.8735
- type: manhattan_accuracy_threshold
value: 22606.6437
- type: manhattan_f1
value: 67.173
- type: manhattan_f1_threshold
value: 24147.3145
- type: manhattan_precision
value: 65.5857
- type: manhattan_recall
value: 68.8391
- type: manhattan_ap
value: 73.4081
- type: euclidean_accuracy
value: 85.94500000000001
- type: euclidean_accuracy_threshold
value: 1019.4165999999999
- type: euclidean_f1
value: 67.0857
- type: euclidean_f1_threshold
value: 1125.1016
- type: euclidean_precision
value: 64.07300000000001
- type: euclidean_recall
value: 70.3958
- type: euclidean_ap
value: 73.3824
- type: dot_accuracy
value: 85.3013
- type: dot_accuracy_threshold
value: 23726.3916
- type: dot_f1
value: 65.8888
- type: dot_f1_threshold
value: 22265.913399999998
- type: dot_precision
value: 62.2128
- type: dot_recall
value: 70.0264
- type: dot_ap
value: 71.5363
- type: max_accuracy
value: 85.96289999999999
- type: max_f1
value: 67.173
- type: max_precision
value: 65.5857
- type: max_recall
value: 70.3958
- type: max_ap
value: 73.4081
- type: main_score
value: 73.4081
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus (default)
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: similarity_accuracy
value: 88.82100000000001
- type: similarity_accuracy_threshold
value: 75.9351
- type: similarity_f1
value: 77.7423
- type: similarity_f1_threshold
value: 72.788
- type: similarity_precision
value: 74.2279
- type: similarity_recall
value: 81.6061
- type: similarity_ap
value: 85.4324
- type: cosine_accuracy
value: 88.82100000000001
- type: cosine_accuracy_threshold
value: 75.9351
- type: cosine_f1
value: 77.7423
- type: cosine_f1_threshold
value: 72.788
- type: cosine_precision
value: 74.2279
- type: cosine_recall
value: 81.6061
- type: cosine_ap
value: 85.4324
- type: manhattan_accuracy
value: 88.786
- type: manhattan_accuracy_threshold
value: 25561.4807
- type: manhattan_f1
value: 77.5953
- type: manhattan_f1_threshold
value: 26504.0619
- type: manhattan_precision
value: 76.1432
- type: manhattan_recall
value: 79.1038
- type: manhattan_ap
value: 85.3477
- type: euclidean_accuracy
value: 88.76859999999999
- type: euclidean_accuracy_threshold
value: 1181.4543
- type: euclidean_f1
value: 77.645
- type: euclidean_f1_threshold
value: 1224.4793
- type: euclidean_precision
value: 75.5925
- type: euclidean_recall
value: 79.8121
- type: euclidean_ap
value: 85.3781
- type: dot_accuracy
value: 88.5707
- type: dot_accuracy_threshold
value: 21387.7899
- type: dot_f1
value: 77.4888
- type: dot_f1_threshold
value: 20875.6653
- type: dot_precision
value: 75.58009999999999
- type: dot_recall
value: 79.4965
- type: dot_ap
value: 84.88550000000001
- type: max_accuracy
value: 88.82100000000001
- type: max_f1
value: 77.7423
- type: max_precision
value: 76.1432
- type: max_recall
value: 81.6061
- type: max_ap
value: 85.4324
- type: main_score
value: 85.4324
---
# Mini-GTE
<p align="center">
<img src="./qtack_logo.png" alt="QTACK Logo" style="width:33%;">
</p>
## Overview
This is the first model developed by QTACK and serves as a proof of concept for our distillation approach! Built on a DistilBERT-based architecture, Mini-GTE is distilled from GTE and, at only 66M parameters, is designed for efficiency without sacrificing accuracy. As a standalone sentence transformer, it ranks 2nd on the MTEB classic leaderboard in the <100M-parameter category and 63rd overall, which makes it a strong choice for real-time query encoding, semantic search, and similarity tasks.
## Model Details
- **Model Type:** Sentence Transformer
- **Base model:** [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) <!-- at revision 12040accade4e8a0f71eabdb258fecc2e7e948be -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
## Usage
- Optimized for quick inference
- Great at quickly generating high-quality encodings
- Easy to plug and play since it is distilled from GTE
- **We want to see how you’re using our model, so we’ll give you a free coffee/$10 gift card if you get on a call with us and show us what you’ve built!**
## Getting Started
### Installation
Mini-GTE is built on the [Sentence Transformers](https://www.sbert.net/) framework. To install the required packages, run:
```bash
pip install -U sentence-transformers
```
### Quick Start
Here's a quick example to get you started:
```python
from sentence_transformers import SentenceTransformer
# Download directly from Hugging Face (replace the placeholder with this model's repo id)
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'The weather is lovely today.',
"It's so sunny outside!",
'He drove to the stadium.',
]
embeddings = model.encode(sentences)
print(embeddings.shape) # Expected: [3, 768]
# Compute the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape) # Expected: [3, 3]
```
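Since the overview positions Mini-GTE for semantic search, the following is a minimal retrieval sketch using the same placeholder model id as above; the corpus and query strings are invented purely for illustration:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder: use this model's repo id

# Toy corpus and query, purely illustrative
corpus = [
    "Mini-GTE is a 66M-parameter sentence transformer distilled from GTE.",
    "He drove to the stadium to watch the match.",
    "The recipe calls for garlic butter and olive oil.",
]
query = "Which model is a small distilled embedding model?"

corpus_embeddings = model.encode(corpus)   # shape: [3, 768]
query_embedding = model.encode([query])    # shape: [1, 768]

# Cosine similarity between the query and every corpus sentence
scores = model.similarity(query_embedding, corpus_embeddings)  # shape: [1, 3]
best = scores.argmax().item()
print(corpus[best], float(scores[0, best]))
```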
## Training Details
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.48.0.dev0
- PyTorch: 2.1.0a0+32f93b1
- Accelerate: 1.2.0
- Datasets: 2.21.0
- Tokenizers: 0.21.0
## Getting Help
For any questions, suggestions, or issues, please contact the QTACK team directly through our [contact page](https://www.qtack.com/contact).
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
Dizex/FoodBaseBERT-NER | Dizex | token-classification | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"FoodBase",
"NER",
"en",
"dataset:Dizex/FoodBase",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-10-31T09:00:15 | 2023-05-14T19:31:01 | 1,241 | 19 | ---
datasets:
- Dizex/FoodBase
language: en
license: mit
tags:
- FoodBase
- NER
widget:
- text: 'Today''s meal: Fresh olive poké bowl topped with chia seeds. Very delicious!'
example_title: Food example 1
- text: Tartufo Pasta with garlic flavoured butter and olive oil, egg yolk, parmigiano
and pasta water.
example_title: Food example 2
---
# FoodBaseBERT
## Model description
**FoodBaseBERT** is a fine-tuned BERT model that is ready to use for **Named Entity Recognition** of Food entities. It has been trained to recognize one entity: food (FOOD).
Specifically, this model is a *bert-base-cased* model that was fine-tuned on the [FoodBase NER](https://academic.oup.com/database/article/doi/10.1093/database/baz121/5611291) dataset.
## Intended uses
#### How to use
You can use this model with the Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
tokenizer = AutoTokenizer.from_pretrained("Dizex/FoodBaseBERT")
model = AutoModelForTokenClassification.from_pretrained("Dizex/FoodBaseBERT")
pipe = pipeline("ner", model=model, tokenizer=tokenizer)
example = "Today's meal: Fresh olive poké bowl topped with chia seeds. Very delicious!"
ner_entity_results = pipe(example)
print(ner_entity_results)
```
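The raw pipeline output above is reported per (sub)token. To merge sub-tokens into whole food spans, the standard `aggregation_strategy` option of the Transformers token-classification pipeline can be used; this is a minimal sketch (the chosen strategy is one reasonable setting, not something prescribed by the model authors):
```python
from transformers import pipeline

pipe = pipeline(
    "ner",
    model="Dizex/FoodBaseBERT",
    tokenizer="Dizex/FoodBaseBERT",
    aggregation_strategy="simple",  # group sub-token predictions into entity spans
)

example = "Today's meal: Fresh olive poké bowl topped with chia seeds. Very delicious!"
for entity in pipe(example):
    # Each entry now carries a whole span plus an aggregated score
    print(entity["word"], entity["entity_group"], round(float(entity["score"]), 3))
```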
| [
"NAMED_ENTITY_RECOGNITION"
] | [
"CHIA"
] |
PlanTL-GOB-ES/roberta-base-biomedical-clinical-es | PlanTL-GOB-ES | fill-mask | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"biomedical",
"clinical",
"spanish",
"es",
"arxiv:2109.03570",
"arxiv:2109.07765",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04 | 2022-11-15T15:22:45 | 1,222 | 18 | ---
language:
- es
license: apache-2.0
metrics:
- ppl
tags:
- biomedical
- clinical
- spanish
widget:
- text: El único antecedente personal a reseñar era la <mask> arterial.
- text: Las radiologías óseas de cuerpo entero no detectan alteraciones <mask>, ni
alteraciones vertebrales.
- text: En el <mask> toraco-abdómino-pélvico no se encontraron hallazgos patológicos
de interés.
---
# Biomedical-clinical language model for Spanish
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citation information](#citation-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
Biomedical pretrained language model for Spanish. This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a **biomedical-clinical** corpus in Spanish collected from several sources.
## Intended uses and limitations
The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification.
## How to use
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "PlanTL-GOB-ES/roberta-base-biomedical-clinical-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

unmasker = pipeline("fill-mask", model=model, tokenizer=tokenizer)
unmasker("El único antecedente personal a reseñar era la <mask> arterial.")
```
```
# Output
[
{
"sequence": " El único antecedente personal a reseñar era la hipertensión arterial.",
"score": 0.9855039715766907,
"token": 3529,
"token_str": " hipertensión"
},
{
"sequence": " El único antecedente personal a reseñar era la diabetes arterial.",
"score": 0.0039140828885138035,
"token": 1945,
"token_str": " diabetes"
},
{
"sequence": " El único antecedente personal a reseñar era la hipotensión arterial.",
"score": 0.002484665485098958,
"token": 11483,
"token_str": " hipotensión"
},
{
"sequence": " El único antecedente personal a reseñar era la Hipertensión arterial.",
"score": 0.0023484621196985245,
"token": 12238,
"token_str": " Hipertensión"
},
{
"sequence": " El único antecedente personal a reseñar era la presión arterial.",
"score": 0.0008009297889657319,
"token": 2267,
"token_str": " presión"
}
]
```
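Since the model is intended to be fine-tuned on downstream tasks such as NER, the following is a minimal, illustrative starting point (the label set here is a placeholder, not the authors' setup; the randomly initialised classification head still has to be trained on a labelled Spanish clinical NER corpus before it is useful):

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "PlanTL-GOB-ES/roberta-base-biomedical-clinical-es"
labels = ["O", "B-ENFERMEDAD", "I-ENFERMEDAD"]  # placeholder label set for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# The token-classification head is newly initialised; fine-tune it (e.g. with the Trainer API)
# on word-aligned labels before using the predictions.
inputs = tokenizer("El paciente presenta hipertensión arterial.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)
```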
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2)
used in the original [RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model with a vocabulary size of 52,000 tokens. The pretraining consists of masked language model training at the subword level, following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using the Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.
The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers, and a real-world clinical corpus collected from more than 278K clinical documents and notes. To obtain a high-quality training corpus while retaining the idiosyncrasies of the clinical language, a cleaning pipeline has been applied only to the biomedical corpora, keeping the clinical corpus uncleaned. Essentially, the cleaning operations used are:
- data parsing in different formats
- sentence splitting
- language detection
- filtering of ill-formed sentences
- deduplication of repetitive contents
- preservation of the original document boundaries
Then, the biomedical corpora are concatenated, and a further global deduplication across the biomedical corpora has been applied.
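For illustration only, a minimal sketch of the kind of sentence-level cleaning described above (hypothetical helper functions; the authors' actual pipeline, including format parsing and language detection, is not reproduced here):

```python
import re

def split_sentences(text: str) -> list[str]:
    # Naive sentence splitting on terminal punctuation.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def is_well_formed(sentence: str) -> bool:
    # Filter very short or symbol-heavy fragments.
    letters = sum(c.isalpha() for c in sentence)
    return len(sentence.split()) >= 4 and letters / max(len(sentence), 1) > 0.5

def clean_corpus(documents: list[str]) -> list[list[str]]:
    seen = set()
    cleaned = []
    for doc in documents:                 # document boundaries are preserved
        kept = []
        for sent in split_sentences(doc):
            if not is_well_formed(sent):  # drop ill-formed sentences
                continue
            if sent in seen:              # deduplicate repeated content
                continue
            seen.add(sent)
            kept.append(sent)
        if kept:
            cleaned.append(kept)
    return cleaned
```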
Eventually, the clinical corpus is concatenated to the cleaned biomedical corpus resulting in a medium-size biomedical-clinical corpus for Spanish composed of more than 1B tokens. The table below shows some basic statistics of the individual cleaned corpora:
| Name | No. tokens | Description |
|-----------------------------------------------------------------------------------------|-------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Medical crawler](https://zenodo.org/record/4561970) | 745,705,946 | Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. |
| Clinical cases misc. | 102,855,267 | A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. |
| Clinical notes/documents | 91,250,080 | Collection of more than 278K clinical documents, including discharge reports, clinical course notes and X-ray reports, for a total of 91M tokens. |
| [Scielo](https://github.com/PlanTL-SANIDAD/SciELO-Spain-Crawler) | 60,007,289 | Publications written in Spanish crawled from the Spanish SciELO server in 2017. |
| [BARR2_background](https://temu.bsc.es/BARR2/downloads/background_set.raw_text.tar.bz2) | 24,516,442 | Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. |
| Wikipedia_life_sciences | 13,890,501 | Wikipedia articles crawled 04/01/2021 with the [Wikipedia API python library](https://pypi.org/project/Wikipedia-API/) starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. |
| Patents | 13,463,387 | Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". |
| [EMEA](http://opus.nlpl.eu/download.php?f=EMEA/v3/moses/en-es.txt.zip) | 5,377,448 | Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. |
| [mespen_Medline](https://zenodo.org/record/3562536#.YTt1fH2xXbR) | 4,166,077 | Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source. |
| PubMed | 1,858,966 | Open-access articles from the PubMed repository crawled in 2017. |
## Evaluation
The model has been evaluated on Named Entity Recognition (NER) using the following datasets:
- [PharmaCoNER](https://zenodo.org/record/4270158): a track on chemical and drug mention recognition from Spanish medical texts (for more info see: https://temu.bsc.es/pharmaconer/).
- [CANTEMIST](https://zenodo.org/record/3978041#.YTt5qH2xXbQ): a shared task specifically focusing on named entity recognition of tumor morphology in Spanish (for more info see: https://zenodo.org/record/3978041#.YTt5qH2xXbQ).
- ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.
The evaluation results are compared against the [mBERT](https://huggingface.co/bert-base-multilingual-cased) and [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) models:
| F1 - Precision - Recall | roberta-base-biomedical-clinical-es | mBERT | BETO |
|---------------------------|----------------------------|-------------------------------|-------------------------|
| PharmaCoNER | **90.04** - **88.92** - **91.18** | 87.46 - 86.50 - 88.46 | 88.18 - 87.12 - 89.28 |
| CANTEMIST | **83.34** - **81.48** - **85.30** | 82.61 - 81.12 - 84.15 | 82.42 - 80.91 - 84.00 |
| ICTUSnet | **88.08** - **84.92** - **91.50** | 86.75 - 83.53 - 90.23 | 85.95 - 83.10 - 89.02 |
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citation information
If you use our models, please cite our latest preprint:
```bibtex
@misc{carrino2021biomedical,
title={Biomedical and Clinical Language Models for Spanish: On the Benefits of Domain-Specific Pretraining in a Mid-Resource Scenario},
author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Asier Gutiérrez-Fandiño and Joan Llop-Palao and Marc Pàmies and Aitor Gonzalez-Agirre and Marta Villegas},
year={2021},
eprint={2109.03570},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
If you use our Medical Crawler corpus, please cite the preprint:
```bibtex
@misc{carrino2021spanish,
title={Spanish Biomedical Crawled Corpus: A Large, Diverse Dataset for Spanish Biomedical Language Models},
author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Ona de Gibert Bonet and Asier Gutiérrez-Fandiño and Aitor Gonzalez-Agirre and Martin Krallinger and Marta Villegas},
year={2021},
eprint={2109.07765},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details> | [
"NAMED_ENTITY_RECOGNITION",
"TEXT_CLASSIFICATION"
] | [
"CANTEMIST",
"PHARMACONER",
"SCIELO"
] |
TheBloke/meditron-70B-GGUF | TheBloke | text-generation | [
"transformers",
"gguf",
"llama",
"medical",
"health",
"llama2",
"text-generation",
"en",
"dataset:bigbio/med_qa",
"dataset:medmcqa",
"dataset:bigbio/pubmed_qa",
"dataset:epfl-llm/guidelines",
"arxiv:2311.16079",
"base_model:epfl-llm/meditron-70b",
"base_model:quantized:epfl-llm/meditron-70b",
"license:llama2",
"region:us"
] | 2023-11-30T17:10:33 | 2023-11-30T17:54:45 | 1,186 | 20 | ---
base_model: epfl-llm/meditron-70b
datasets:
- bigbio/med_qa
- medmcqa
- bigbio/pubmed_qa
- epfl-llm/guidelines
language:
- en
license: llama2
metrics:
- accuracy
- perplexity
model_name: Meditron 70B
pipeline_tag: text-generation
tags:
- medical
- health
- llama2
inference: false
model_creator: EPFL LLM Team
model_type: llama
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Meditron 70B - GGUF
- Model creator: [EPFL LLM Team](https://huggingface.co/epfl-llm)
- Original model: [Meditron 70B](https://huggingface.co/epfl-llm/meditron-70b)
<!-- description start -->
## Description
This repo contains GGUF format model files for [EPFL LLM Team's Meditron 70B](https://huggingface.co/epfl-llm/meditron-70b).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/meditron-70B-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/meditron-70B-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/meditron-70B-GGUF)
* [EPFL LLM Team's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/epfl-llm/meditron-70b)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
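For example, the template can be filled from Python before being passed to any of the runtimes listed below (a small illustrative sketch; the system message and question are placeholders):

```python
# Fill the ChatML template shown above with concrete values.
system_message = "You are a helpful medical assistant."   # placeholder
prompt = "What are common symptoms of iron-deficiency anaemia?"  # placeholder

chatml_prompt = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{prompt}<|im_end|>\n"
    "<|im_start|>assistant\n"
)
print(chatml_prompt)
```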
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [meditron-70b.Q2_K.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q2_K.gguf) | Q2_K | 2 | 29.28 GB| 31.78 GB | smallest, significant quality loss - not recommended for most purposes |
| [meditron-70b.Q3_K_S.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q3_K_S.gguf) | Q3_K_S | 3 | 29.92 GB| 32.42 GB | very small, high quality loss |
| [meditron-70b.Q3_K_M.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q3_K_M.gguf) | Q3_K_M | 3 | 33.19 GB| 35.69 GB | very small, high quality loss |
| [meditron-70b.Q3_K_L.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q3_K_L.gguf) | Q3_K_L | 3 | 36.15 GB| 38.65 GB | small, substantial quality loss |
| [meditron-70b.Q4_0.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q4_0.gguf) | Q4_0 | 4 | 38.87 GB| 41.37 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [meditron-70b.Q4_K_S.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q4_K_S.gguf) | Q4_K_S | 4 | 39.07 GB| 41.57 GB | small, greater quality loss |
| [meditron-70b.Q4_K_M.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q4_K_M.gguf) | Q4_K_M | 4 | 41.42 GB| 43.92 GB | medium, balanced quality - recommended |
| [meditron-70b.Q5_0.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q5_0.gguf) | Q5_0 | 5 | 47.46 GB| 49.96 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [meditron-70b.Q5_K_S.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q5_K_S.gguf) | Q5_K_S | 5 | 47.46 GB| 49.96 GB | large, low quality loss - recommended |
| [meditron-70b.Q5_K_M.gguf](https://huggingface.co/TheBloke/meditron-70B-GGUF/blob/main/meditron-70b.Q5_K_M.gguf) | Q5_K_M | 5 | 48.75 GB| 51.25 GB | large, very low quality loss - recommended |
| meditron-70b.Q6_K.gguf | Q6_K | 6 | 56.59 GB| 59.09 GB | very large, extremely low quality loss |
| meditron-70b.Q8_0.gguf | Q8_0 | 8 | 73.29 GB| 75.79 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
### Q6_K and Q8_0 files are split and require joining
**Note:** HF does not support uploading files larger than 50GB. Therefore I have uploaded the Q6_K and Q8_0 files as split files.
<details>
<summary>Click for instructions regarding Q6_K and Q8_0 files</summary>
### q6_K
Please download:
* `meditron-70b.Q6_K.gguf-split-a`
* `meditron-70b.Q6_K.gguf-split-b`
### q8_0
Please download:
* `meditron-70b.Q8_0.gguf-split-a`
* `meditron-70b.Q8_0.gguf-split-b`
To join the files, do the following:
Linux and macOS:
```
cat meditron-70b.Q6_K.gguf-split-* > meditron-70b.Q6_K.gguf && rm meditron-70b.Q6_K.gguf-split-*
cat meditron-70b.Q8_0.gguf-split-* > meditron-70b.Q8_0.gguf && rm meditron-70b.Q8_0.gguf-split-*
```
Windows command line:
```
COPY /B meditron-70b.Q6_K.gguf-split-a + meditron-70b.Q6_K.gguf-split-b meditron-70b.Q6_K.gguf
del meditron-70b.Q6_K.gguf-split-a meditron-70b.Q6_K.gguf-split-b
COPY /B meditron-70b.Q8_0.gguf-split-a + meditron-70b.Q8_0.gguf-split-b meditron-70b.Q8_0.gguf
del meditron-70b.Q8_0.gguf-split-a meditron-70b.Q8_0.gguf-split-b
```
</details>
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/meditron-70B-GGUF and below it, a specific filename to download, such as: meditron-70b.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/meditron-70B-GGUF meditron-70b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/meditron-70B-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/meditron-70B-GGUF meditron-70b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m meditron-70b.Q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 4096` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./meditron-70b.Q4_K_M.gguf", # Download the model file first
n_ctx=4096, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./meditron-70b.Q4_K_M.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Brandon Frisco, LangChain4j, Spiking Neurons AB, transmissions 11, Joseph William Delisle, Nitin Borwankar, Willem Michiel, Michael Dempsey, vamX, Jeffrey Morgan, zynix, jjj, Omer Bin Jawed, Sean Connelly, jinyuan sun, Jeromy Smith, Shadi, Pawan Osman, Chadd, Elijah Stavena, Illia Dulskyi, Sebastain Graf, Stephen Murray, terasurfer, Edmond Seymore, Celu Ramasamy, Mandus, Alex, biorpg, Ajan Kanaga, Clay Pascal, Raven Klaugh, 阿明, K, ya boyyy, usrbinkat, Alicia Loh, John Villwock, ReadyPlayerEmma, Chris Smitley, Cap'n Zoog, fincy, GodLy, S_X, sidney chen, Cory Kujawski, OG, Mano Prime, AzureBlack, Pieter, Kalila, Spencer Kim, Tom X Nguyen, Stanislav Ovsiannikov, Michael Levine, Andrey, Trailburnt, Vadim, Enrico Ros, Talal Aujan, Brandon Phillips, Jack West, Eugene Pentland, Michael Davis, Will Dee, webtim, Jonathan Leane, Alps Aficionado, Rooh Singh, Tiffany J. Kim, theTransient, Luke @flexchar, Elle, Caitlyn Gatomon, Ari Malik, subjectnull, Johann-Peter Hartmann, Trenton Dambrowitz, Imad Khwaja, Asp the Wyvern, Emad Mostaque, Rainer Wilmers, Alexandros Triantafyllidis, Nicholas, Pedro Madruga, SuperWojo, Harry Royden McLaughlin, James Bentley, Olakabola, David Ziegler, Ai Maven, Jeff Scroggin, Nikolai Manek, Deo Leter, Matthew Berman, Fen Risland, Ken Nordquist, Manuel Alberto Morcote, Luke Pendergrass, TL, Fred von Graf, Randy H, Dan Guido, NimbleBox.ai, Vitor Caleffi, Gabriel Tamborski, knownsqashed, Lone Striker, Erik Bjäreholt, John Detwiler, Leonard Tan, Iucharbius
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: EPFL LLM Team's Meditron 70B
<img width=50% src="meditron_LOGO.png" alt="Alt text" title="Meditron-logo">
# Model Card for Meditron-70B-v1.0
Meditron is a suite of open-source medical Large Language Models (LLMs).
Meditron-70B is a 70 billion parameters model adapted to the medical domain from Llama-2-70B through continued pretraining on a comprehensively curated medical corpus, including selected PubMed articles, abstracts, a [new dataset](https://huggingface.co/datasets/epfl-llm/guidelines) of internationally-recognized medical guidelines, and general domain data from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T).
Meditron-70B, finetuned on relevant training data, outperforms Llama-2-70B, GPT-3.5 (`text-davinci-003`, 8-shot), and Flan-PaLM on multiple medical reasoning tasks.
<!--# Table of Contents
[Model Card for Meditron 70B](#model-card-for--meditron-70b-v1.0)
- [Table of Contents](#table-of-contents)
- [Model Details](#model-details)
- [Model Description](#model-description)
- [Uses](#uses)
- [Downstream Use](#downstream-use)
- [Out-of-Scope Use](#out-of-scope-use)
- [Bias, Risks, and Limitations](#bias-risks-and-limitations)
- [Recommendations](#recommendations)
- [Training Details](#training-details)
- [Training Data](#training-data)
- [Training Procedure](#training-procedure)
- [Preprocessing](#preprocessing)
- [Evaluation](#evaluation)
- [Testing Data & Metrics](#testing-data-&-metrics)
- [Testing Data](#testing-data)
- [Metrics](#metrics)
- [Results](#results)
- [Environmental Impact](#environmental-impact)
- [Citation](#citation)-->
<details open>
<summary><strong>Advisory Notice</strong></summary>
<blockquote style="padding: 10px; margin: 0 0 10px; border-left: 5px solid #ddd;">
While Meditron is designed to encode medical knowledge from sources of high-quality evidence, it is not yet adapted to deliver this knowledge appropriately, safely, or within professional actionable constraints.
We recommend against deploying Meditron in medical applications without extensive use-case alignment, as well as additional testing, specifically including randomized controlled trials in real-world practice settings.
</blockquote>
</details>
## Model Details
- **Developed by:** [EPFL LLM Team](https://huggingface.co/epfl-llm)
- **Model type:** Causal decoder-only transformer language model
- **Language(s):** English (mainly)
- **Model License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Code License:** [APACHE 2.0 LICENSE](LICENSE)
- **Continue-pretrained from model:** [Llama-2-70B](https://huggingface.co/meta-llama/Llama-2-70b)
- **Context length:** 4K tokens
- **Input:** Text-only data
- **Output:** Model generates text only
- **Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
- **Knowledge Cutoff:** August 2023
### Model Sources
- **Repository:** [epflLLM/meditron](https://github.com/epfLLM/meditron)
- **Trainer:** [epflLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Paper:** *[MediTron-70B: Scaling Medical Pretraining for Large Language Models](https://arxiv.org/abs/2311.16079)*
## Uses
Meditron-70B is being made available for further testing and assessment as an AI assistant to enhance clinical decision-making and broaden access to LLMs for healthcare use. Potential use cases may include but are not limited to:
- Medical exam question answering
- Supporting differential diagnosis
- Disease information (symptoms, cause, treatment) query
- General health information query
### Direct Use
It is possible to use this model to generate text, which is useful for experimentation and understanding its capabilities.
It should not be used directly for production or work that may impact people.
### Downstream Use
Meditron-70B is a foundation model that can be finetuned, instruction-tuned, or RLHF-tuned for specific downstream tasks and applications.
The main way we have used this model is finetuning for downstream question-answering tasks, but we encourage using this model for additional applications.
Specific formatting needs to be followed to prompt our finetuned models, including the `<|im_start|>`, `<|im_end|>` tags, and `system`, `question`, `answer` identifiers.
"""
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>question
{prompt}<|im_end|>
<|im_start|>answer
"""
**Note 1**: The above formatting is not required for running the base model (this repository)
**Note 2**: the above formatting is just an example of a finetuning template. This format is not a requirement if you use your own formatting option for the finetuning of the model.
To run proper generation with this base model, we recommend using a high-throughput and memory-efficient inference engine, such as [vLLM](https://github.com/vllm-project/vllm), with a UI that supports chat and text generation, such as [BetterChatGPT](https://github.com/ztjhz/BetterChatGPT)
To see more details about model deployment and generation, please see our [documentation](https://github.com/epfLLM/meditron/blob/main/deployment/README.md).
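As a concrete starting point for the vLLM route mentioned above, here is a minimal sketch. It assumes the original fp16 checkpoint `epfl-llm/meditron-70b` and a node with 8 GPUs; the GGUF files in this repository are instead intended for llama.cpp-based runtimes.

```python
from vllm import LLM, SamplingParams

# Loads the original fp16 weights (not the GGUF files from this repo) across 8 GPUs.
llm = LLM(model="epfl-llm/meditron-70b", tensor_parallel_size=8)

# The base model needs no special chat formatting, so a plain prompt is used here.
prompt = "Common causes of microcytic anaemia include"
outputs = llm.generate([prompt], SamplingParams(temperature=0.7, max_tokens=256))
print(outputs[0].outputs[0].text)
```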
### Out-of-Scope Use
We do not recommend using this model for natural language generation in a production environment, finetuned or otherwise.
## Truthfulness, Helpfulness, Risk, and Bias
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
We did an initial assessment of Meditron models' **Truthfulness** against baseline models and consumer-level medical models.
We use TruthfulQA (multiple choice) as the main evaluation benchmark.
We only focus on the categories that are relevant to the medical domain, including Health, Nutrition, Psychology, and Science.
For 7B models, we perform one-shot evaluations for consistent answer generation.
For 70B models, the evaluations are under the zero-shot setting.
Below, we report the detailed truthfulness performance of each category.
| Category | meditron-70b | llama-2-70b | med42-70b* | meditron-7b | llama-2-7b | PMC-llama-7b |
| --- | --- | --- | --- | --- | --- | --- |
| Health | 81.8 | 69.1 | 83.6 | 27.3 | 16.4 | 3.6 |
| Nutrition | 77.9 | 68.8 | 62.5 | 31.1 | 12.5 | 6.3 |
| Psychology | 47.4 | 36.8 | 52.6 | 21.1 | 10.5 | 0.0 |
| Science | 77.8 | 44.4 | 33.3 | 33.3 | 11.1 | 0.0 |
| Avg | 71.2 | 54.8 | 58.0 | 28.3 | 12.6 | 2.5 |
For a more detailed performance analysis, please see our paper.
For **Helpfulness**, **Risk** and **Bias**, we provide a comprehensive qualitative generation report of Meditron-70B on queries designed by medical experts.
Each query targets specific aspects of helpfulness (medical accuracy, up-to-date information, etc.), risk (public health, medical ethics, etc.) and bias (gender, age, race, etc.).
Please see the detailed generations in our paper. We compare our generations to Llama-2-70B and ChatGPT-3.5 (version of Nov 27, 2023).
Significant research is still required to fully explore potential bias, fairness, and safety issues with this language model.
### Recommendations
**IMPORTANT!**
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.
While this model is capable of generating natural language text, we have only begun to explore this capability and its limitations.
Understanding these limitations is especially important in a domain like medicine.
Therefore, we strongly recommend against using this model in production for natural language generation or for professional purposes related to health and medicine without comprehensive testing for your application.
## Training Details
### Training Data
Meditron’s domain-adaptive pre-training corpus GAP-Replay combines 48.1B tokens from four corpora:
- [**Clinical Guidelines**](https://huggingface.co/datasets/epfl-llm/guidelines): a new dataset of 46K internationally-recognized clinical practice guidelines from various healthcare-related sources, including hospitals and international organizations.
- **Medical Paper Abstracts**: 16.1M abstracts extracted from closed-access PubMed and PubMed Central papers.
- **Medical Papers**: full-text articles extracted from 5M publicly available PubMed and PubMed Central papers.
- **Replay Data**: 400M tokens of general domain pretraining data sampled from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
<img width="60%" src="gap-replay.png" alt="Alt text" title="Meditron-logo">
#### Data Preprocessing
Please see the detailed preprocessing procedure in our paper.
### Training Procedure
We used the [Megatron-LLM](https://github.com/epfLLM/Megatron-LLM) distributed training library, a derivative of Nvidia's Megatron LM project, to optimize training efficiency.
Hardware consists of 16 nodes of 8x NVIDIA A100 (80GB) SXM GPUs connected by NVLink and NVSwitch with a single Nvidia ConnectX-6 DX network card and equipped with 2 x AMD EPYC 7543 32-Core Processors and 512 GB of RAM.
The nodes are connected via RDMA over Converged Ethernet.
Our three-way parallelism scheme uses:
- Data Parallelism (DP -- different GPUs process different subsets of the batches) of 2,
- Pipeline Parallelism (PP -- different GPUs process different layers) of 8,
- Tensor Parallelism (TP -- different GPUs process different subtensors for matrix multiplication) of 8.
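As a quick sanity check, these factors multiply to the full world size: DP × PP × TP = 2 × 8 × 8 = 128 GPUs, which matches the 16 nodes of 8 A100s described above.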
#### Training Hyperparameters
| Hyperparameter | Value |
| --- | --- |
| bf16 | true |
| lr | 1.5e-4 |
| eps | 1e-5 |
| betas | \[0.9, 0.95\] |
| clip_grad | 1 |
| weight decay | 0.1 |
| DP size | 2 |
| TP size | 8 |
| PP size | 8 |
| seq length | 4096 |
| lr scheduler | cosine |
| min lr | 1e-6 |
| warmup iteration | 2000 |
| micro batch size | 2 |
| global batch size | 512 |
#### Speeds, Sizes, Times
The model was trained in September and October 2023.
The model architecture is exactly that of Llama 2:

| Parameter | Value |
| --- | --- |
| Model size | 70B |
| Hidden dimension | 8192 |
| Num. attention heads | 64 |
| Num. layers | 80 |
We train the 70B model on 48e9 tokens, at a throughput of about 40,200 tokens / second.
This amounts to a bfloat16 model flops utilization of roughly 42.3\%.
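These figures can be roughly cross-checked with the common 6·N FLOPs-per-token estimate and the A100's ≈312 TFLOP/s bf16 dense peak (a back-of-the-envelope check added here, not taken from the paper):

```python
# Approximate model-FLOPs-utilisation check for the numbers quoted above.
params = 70e9            # model parameters
tokens_per_s = 40_200    # reported training throughput
gpus = 128               # 16 nodes x 8 A100s
peak_flops = 312e12      # A100 bf16 dense peak, FLOP/s per GPU

achieved = 6 * params * tokens_per_s      # ~6 FLOPs per parameter per token
mfu = achieved / (peak_flops * gpus)
print(f"MFU ≈ {mfu:.1%}")                 # ≈ 42.3%
```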
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data & Metrics
#### Testing Data
- [MedQA (USMLE)](https://huggingface.co/datasets/bigbio/med_qa)
- [MedMCQA](https://huggingface.co/datasets/medmcqa)
- [PubMedQA](https://huggingface.co/datasets/bigbio/pubmed_qa)
- [MMLU-Medical](https://huggingface.co/datasets/lukaemon/mmlu)
- [MedQA-4-Option](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
#### Metrics
- Accuracy: suited to the evaluation of multiple-choice question-answering tasks.
### Results
We finetune meditron-70b and llama-2-70b individually on the training data of each benchmark (PubMedQA, MedMCQA, MedQA).
We report the finetuned models' performance with self-consistency chain-of-thought as the inference mode.
For MMLU-Medical, models finetuned on MedMCQA are used for inference.
For MedQA-4-Option, models finetuned on MedQA are used for inference.
For a more detailed performance analysis, please see our paper.
| Dataset | meditron-70b | llama-2-70b | med42-70b* | clinical-camel-70b* |
| --- | --- | --- | --- | --- |
| MMLU-Medical | 77.6 | 77.9 | 74.5 | 65.7 |
| PubMedQA | 81.6 | 80.0 | 61.2 | 67.0 |
| MedMCQA | 66.0 | 62.6 | 59.2 | 46.7 |
| MedQA | 64.4 | 61.5 | 59.1 | 50.8 |
| MedQA-4-Option | 70.2 | 63.8 | 63.9 | 56.8 |
| Avg | 72.0 | 69.2 | 63.6 | 57.4 |
**Note**: models with * are already instruction-tuned, so we exclude them from further finetuning on any training data.
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
- **Hardware Type:** 128 x NVIDIA A100 (80GB) SXM
- **Total GPU hours:** 42,496
- **Hardware Provider:** EPFL Research Computing Platform
- **Compute Region:** Switzerland
- **Carbon Emitted:** Switzerland has a carbon efficiency of 0.016 kgCO2/kWh (https://www.carbonfootprint.com/docs/2018_8_electricity_factors_august_2018_-_online_sources.pdf). 332 hours of 128 A100s means 42496 hours at a TDP of 400W. Assuming a Power Usage effectiveness of 1.8, total emissions are estimated to be:
(400W / 1000W/kWh / GPU * 0.016 kgCO2/kWh * 332 h * 128 GPU) * 1.8 PUE = 486 kgCO2.
## Citation
**BibTeX:**
If you use Meditron or its training data, please cite our work:
```
@misc{chen2023meditron70b,
title={MEDITRON-70B: Scaling Medical Pretraining for Large Language Models},
author={Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
year={2023},
eprint={2311.16079},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@software{epfmedtrn,
author = {Zeming Chen and Alejandro Hernández Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
title = {MediTron-70B: Scaling Medical Pretraining for Large Language Models},
month = November,
year = 2023,
url = {https://github.com/epfLLM/meditron}
}
```
<!-- original-model-card end -->
| [
"QUESTION_ANSWERING"
] | [
"MEDQA",
"PUBMEDQA"
] |
McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-supervised | McGill-NLP | sentence-similarity | [
"peft",
"safetensors",
"text-embedding",
"embeddings",
"information-retrieval",
"beir",
"text-classification",
"language-model",
"text-clustering",
"text-semantic-similarity",
"text-evaluation",
"text-reranking",
"feature-extraction",
"sentence-similarity",
"Sentence Similarity",
"natural_questions",
"ms_marco",
"fever",
"hotpot_qa",
"mteb",
"en",
"arxiv:2404.05961",
"license:mit",
"model-index",
"region:us"
] | 2024-04-04T14:12:46 | 2024-04-11T20:09:40 | 1,163 | 4 | ---
language:
- en
library_name: peft
license: mit
pipeline_tag: sentence-similarity
tags:
- text-embedding
- embeddings
- information-retrieval
- beir
- text-classification
- language-model
- text-clustering
- text-semantic-similarity
- text-evaluation
- text-reranking
- feature-extraction
- sentence-similarity
- Sentence Similarity
- natural_questions
- ms_marco
- fever
- hotpot_qa
- mteb
model-index:
- name: LLM2Vec-Sheared-LLaMA-supervised
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.41791044776119
- type: ap
value: 41.45458580415683
- type: f1
value: 71.63305447032735
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 82.0527
- type: ap
value: 77.3222852456055
- type: f1
value: 81.97981459031165
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 40.806000000000004
- type: f1
value: 40.3299129176701
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.391000000000002
- type: map_at_10
value: 41.919000000000004
- type: map_at_100
value: 42.846000000000004
- type: map_at_1000
value: 42.851
- type: map_at_3
value: 36.260999999999996
- type: map_at_5
value: 39.528999999999996
- type: mrr_at_1
value: 26.245
- type: mrr_at_10
value: 42.215
- type: mrr_at_100
value: 43.135
- type: mrr_at_1000
value: 43.14
- type: mrr_at_3
value: 36.546
- type: mrr_at_5
value: 39.782000000000004
- type: ndcg_at_1
value: 25.391000000000002
- type: ndcg_at_10
value: 51.663000000000004
- type: ndcg_at_100
value: 55.419
- type: ndcg_at_1000
value: 55.517
- type: ndcg_at_3
value: 39.96
- type: ndcg_at_5
value: 45.909
- type: precision_at_1
value: 25.391000000000002
- type: precision_at_10
value: 8.3
- type: precision_at_100
value: 0.989
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.904
- type: precision_at_5
value: 13.058
- type: recall_at_1
value: 25.391000000000002
- type: recall_at_10
value: 83.001
- type: recall_at_100
value: 98.933
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 50.711
- type: recall_at_5
value: 65.292
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 43.472186058302285
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 39.846039374129546
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.713811638804174
- type: mrr
value: 73.38906476718111
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_spearman
value: 85.88328221005123
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 86.00974025974025
- type: f1
value: 85.97349359388288
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.102075665637685
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 34.27583239919031
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: cqadupstack/android
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.043
- type: map_at_10
value: 44.515
- type: map_at_100
value: 45.967999999999996
- type: map_at_1000
value: 46.098
- type: map_at_3
value: 40.285
- type: map_at_5
value: 42.841
- type: mrr_at_1
value: 40.2
- type: mrr_at_10
value: 50.233000000000004
- type: mrr_at_100
value: 50.938
- type: mrr_at_1000
value: 50.978
- type: mrr_at_3
value: 47.353
- type: mrr_at_5
value: 49.034
- type: ndcg_at_1
value: 40.2
- type: ndcg_at_10
value: 51.096
- type: ndcg_at_100
value: 56.267999999999994
- type: ndcg_at_1000
value: 58.092999999999996
- type: ndcg_at_3
value: 45.09
- type: ndcg_at_5
value: 48.198
- type: precision_at_1
value: 40.2
- type: precision_at_10
value: 9.843
- type: precision_at_100
value: 1.546
- type: precision_at_1000
value: 0.20400000000000001
- type: precision_at_3
value: 21.507
- type: precision_at_5
value: 15.966
- type: recall_at_1
value: 33.043
- type: recall_at_10
value: 63.871
- type: recall_at_100
value: 85.527
- type: recall_at_1000
value: 96.936
- type: recall_at_3
value: 46.859
- type: recall_at_5
value: 55.116
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: cqadupstack/english
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.924000000000003
- type: map_at_10
value: 42.298
- type: map_at_100
value: 43.589
- type: map_at_1000
value: 43.724000000000004
- type: map_at_3
value: 39.739999999999995
- type: map_at_5
value: 41.131
- type: mrr_at_1
value: 40.064
- type: mrr_at_10
value: 48.4
- type: mrr_at_100
value: 49.07
- type: mrr_at_1000
value: 49.113
- type: mrr_at_3
value: 46.635
- type: mrr_at_5
value: 47.549
- type: ndcg_at_1
value: 40.064
- type: ndcg_at_10
value: 47.686
- type: ndcg_at_100
value: 52.054
- type: ndcg_at_1000
value: 54.151
- type: ndcg_at_3
value: 44.57
- type: ndcg_at_5
value: 45.727000000000004
- type: precision_at_1
value: 40.064
- type: precision_at_10
value: 8.770999999999999
- type: precision_at_100
value: 1.422
- type: precision_at_1000
value: 0.19
- type: precision_at_3
value: 21.741
- type: precision_at_5
value: 14.790000000000001
- type: recall_at_1
value: 31.924000000000003
- type: recall_at_10
value: 56.603
- type: recall_at_100
value: 74.82900000000001
- type: recall_at_1000
value: 88.176
- type: recall_at_3
value: 46.11
- type: recall_at_5
value: 50.273999999999994
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: cqadupstack/gaming
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.721000000000004
- type: map_at_10
value: 53.053
- type: map_at_100
value: 54.103
- type: map_at_1000
value: 54.157999999999994
- type: map_at_3
value: 49.854
- type: map_at_5
value: 51.547
- type: mrr_at_1
value: 46.833999999999996
- type: mrr_at_10
value: 56.61000000000001
- type: mrr_at_100
value: 57.286
- type: mrr_at_1000
value: 57.312
- type: mrr_at_3
value: 54.17999999999999
- type: mrr_at_5
value: 55.503
- type: ndcg_at_1
value: 46.833999999999996
- type: ndcg_at_10
value: 58.928000000000004
- type: ndcg_at_100
value: 62.939
- type: ndcg_at_1000
value: 63.970000000000006
- type: ndcg_at_3
value: 53.599
- type: ndcg_at_5
value: 55.96600000000001
- type: precision_at_1
value: 46.833999999999996
- type: precision_at_10
value: 9.48
- type: precision_at_100
value: 1.2349999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 24.032999999999998
- type: precision_at_5
value: 16.213
- type: recall_at_1
value: 40.721000000000004
- type: recall_at_10
value: 72.653
- type: recall_at_100
value: 89.91900000000001
- type: recall_at_1000
value: 97.092
- type: recall_at_3
value: 58.135999999999996
- type: recall_at_5
value: 64.156
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: cqadupstack/gis
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.938
- type: map_at_10
value: 34.027
- type: map_at_100
value: 34.999
- type: map_at_1000
value: 35.083
- type: map_at_3
value: 31.154
- type: map_at_5
value: 32.767
- type: mrr_at_1
value: 27.006000000000004
- type: mrr_at_10
value: 36.192
- type: mrr_at_100
value: 36.989
- type: mrr_at_1000
value: 37.053999999999995
- type: mrr_at_3
value: 33.503
- type: mrr_at_5
value: 34.977000000000004
- type: ndcg_at_1
value: 27.006000000000004
- type: ndcg_at_10
value: 39.297
- type: ndcg_at_100
value: 44.078
- type: ndcg_at_1000
value: 46.162
- type: ndcg_at_3
value: 33.695
- type: ndcg_at_5
value: 36.401
- type: precision_at_1
value: 27.006000000000004
- type: precision_at_10
value: 6.181
- type: precision_at_100
value: 0.905
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 14.426
- type: precision_at_5
value: 10.215
- type: recall_at_1
value: 24.938
- type: recall_at_10
value: 53.433
- type: recall_at_100
value: 75.558
- type: recall_at_1000
value: 91.096
- type: recall_at_3
value: 38.421
- type: recall_at_5
value: 44.906
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: cqadupstack/mathematica
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.565999999999999
- type: map_at_10
value: 23.419999999999998
- type: map_at_100
value: 24.678
- type: map_at_1000
value: 24.801000000000002
- type: map_at_3
value: 20.465
- type: map_at_5
value: 21.979000000000003
- type: mrr_at_1
value: 19.652
- type: mrr_at_10
value: 27.929
- type: mrr_at_100
value: 28.92
- type: mrr_at_1000
value: 28.991
- type: mrr_at_3
value: 25.249
- type: mrr_at_5
value: 26.66
- type: ndcg_at_1
value: 19.652
- type: ndcg_at_10
value: 28.869
- type: ndcg_at_100
value: 34.675
- type: ndcg_at_1000
value: 37.577
- type: ndcg_at_3
value: 23.535
- type: ndcg_at_5
value: 25.807999999999996
- type: precision_at_1
value: 19.652
- type: precision_at_10
value: 5.659
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 11.401
- type: precision_at_5
value: 8.581999999999999
- type: recall_at_1
value: 15.565999999999999
- type: recall_at_10
value: 41.163
- type: recall_at_100
value: 66.405
- type: recall_at_1000
value: 87.071
- type: recall_at_3
value: 26.478
- type: recall_at_5
value: 32.217
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: cqadupstack/physics
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.834
- type: map_at_10
value: 41.49
- type: map_at_100
value: 42.897999999999996
- type: map_at_1000
value: 43.004
- type: map_at_3
value: 38.151
- type: map_at_5
value: 40.157
- type: mrr_at_1
value: 38.306000000000004
- type: mrr_at_10
value: 47.371
- type: mrr_at_100
value: 48.265
- type: mrr_at_1000
value: 48.304
- type: mrr_at_3
value: 44.915
- type: mrr_at_5
value: 46.516999999999996
- type: ndcg_at_1
value: 38.306000000000004
- type: ndcg_at_10
value: 47.394999999999996
- type: ndcg_at_100
value: 53.086999999999996
- type: ndcg_at_1000
value: 54.94799999999999
- type: ndcg_at_3
value: 42.384
- type: ndcg_at_5
value: 45.055
- type: precision_at_1
value: 38.306000000000004
- type: precision_at_10
value: 8.624
- type: precision_at_100
value: 1.325
- type: precision_at_1000
value: 0.165
- type: precision_at_3
value: 20.18
- type: precision_at_5
value: 14.418000000000001
- type: recall_at_1
value: 30.834
- type: recall_at_10
value: 58.977000000000004
- type: recall_at_100
value: 82.78
- type: recall_at_1000
value: 94.825
- type: recall_at_3
value: 44.954
- type: recall_at_5
value: 51.925
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: cqadupstack/programmers
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.549000000000003
- type: map_at_10
value: 38.796
- type: map_at_100
value: 40.085
- type: map_at_1000
value: 40.198
- type: map_at_3
value: 35.412
- type: map_at_5
value: 37.116
- type: mrr_at_1
value: 35.388
- type: mrr_at_10
value: 44.626
- type: mrr_at_100
value: 45.445
- type: mrr_at_1000
value: 45.491
- type: mrr_at_3
value: 41.952
- type: mrr_at_5
value: 43.368
- type: ndcg_at_1
value: 35.388
- type: ndcg_at_10
value: 44.894
- type: ndcg_at_100
value: 50.166999999999994
- type: ndcg_at_1000
value: 52.308
- type: ndcg_at_3
value: 39.478
- type: ndcg_at_5
value: 41.608000000000004
- type: precision_at_1
value: 35.388
- type: precision_at_10
value: 8.322000000000001
- type: precision_at_100
value: 1.2670000000000001
- type: precision_at_1000
value: 0.164
- type: precision_at_3
value: 18.836
- type: precision_at_5
value: 13.333
- type: recall_at_1
value: 28.549000000000003
- type: recall_at_10
value: 57.229
- type: recall_at_100
value: 79.541
- type: recall_at_1000
value: 93.887
- type: recall_at_3
value: 42.056
- type: recall_at_5
value: 47.705999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.897333333333336
- type: map_at_10
value: 36.28758333333334
- type: map_at_100
value: 37.480083333333326
- type: map_at_1000
value: 37.59683333333333
- type: map_at_3
value: 33.3485
- type: map_at_5
value: 34.98283333333334
- type: mrr_at_1
value: 31.98916666666667
- type: mrr_at_10
value: 40.61116666666666
- type: mrr_at_100
value: 41.42133333333333
- type: mrr_at_1000
value: 41.476333333333336
- type: mrr_at_3
value: 38.19366666666667
- type: mrr_at_5
value: 39.53125
- type: ndcg_at_1
value: 31.98916666666667
- type: ndcg_at_10
value: 41.73475
- type: ndcg_at_100
value: 46.72291666666666
- type: ndcg_at_1000
value: 48.94916666666666
- type: ndcg_at_3
value: 36.883833333333335
- type: ndcg_at_5
value: 39.114
- type: precision_at_1
value: 31.98916666666667
- type: precision_at_10
value: 7.364083333333335
- type: precision_at_100
value: 1.1604166666666667
- type: precision_at_1000
value: 0.15433333333333335
- type: precision_at_3
value: 17.067500000000003
- type: precision_at_5
value: 12.091916666666666
- type: recall_at_1
value: 26.897333333333336
- type: recall_at_10
value: 53.485749999999996
- type: recall_at_100
value: 75.38716666666666
- type: recall_at_1000
value: 90.75841666666666
- type: recall_at_3
value: 39.86725
- type: recall_at_5
value: 45.683416666666666
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: cqadupstack/stats
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.544
- type: map_at_10
value: 30.85
- type: map_at_100
value: 31.674000000000003
- type: map_at_1000
value: 31.778000000000002
- type: map_at_3
value: 28.451999999999998
- type: map_at_5
value: 29.797
- type: mrr_at_1
value: 26.687
- type: mrr_at_10
value: 33.725
- type: mrr_at_100
value: 34.439
- type: mrr_at_1000
value: 34.512
- type: mrr_at_3
value: 31.493
- type: mrr_at_5
value: 32.735
- type: ndcg_at_1
value: 26.687
- type: ndcg_at_10
value: 35.207
- type: ndcg_at_100
value: 39.406
- type: ndcg_at_1000
value: 42.021
- type: ndcg_at_3
value: 30.842000000000002
- type: ndcg_at_5
value: 32.882
- type: precision_at_1
value: 26.687
- type: precision_at_10
value: 5.66
- type: precision_at_100
value: 0.836
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 13.395000000000001
- type: precision_at_5
value: 9.386999999999999
- type: recall_at_1
value: 23.544
- type: recall_at_10
value: 45.769
- type: recall_at_100
value: 65.33200000000001
- type: recall_at_1000
value: 84.82499999999999
- type: recall_at_3
value: 33.665
- type: recall_at_5
value: 38.795
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: cqadupstack/tex
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.524
- type: map_at_10
value: 23.65
- type: map_at_100
value: 24.654999999999998
- type: map_at_1000
value: 24.786
- type: map_at_3
value: 21.441
- type: map_at_5
value: 22.664
- type: mrr_at_1
value: 20.372
- type: mrr_at_10
value: 27.548000000000002
- type: mrr_at_100
value: 28.37
- type: mrr_at_1000
value: 28.449
- type: mrr_at_3
value: 25.291999999999998
- type: mrr_at_5
value: 26.596999999999998
- type: ndcg_at_1
value: 20.372
- type: ndcg_at_10
value: 28.194000000000003
- type: ndcg_at_100
value: 32.955
- type: ndcg_at_1000
value: 35.985
- type: ndcg_at_3
value: 24.212
- type: ndcg_at_5
value: 26.051000000000002
- type: precision_at_1
value: 20.372
- type: precision_at_10
value: 5.237
- type: precision_at_100
value: 0.8909999999999999
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 11.643
- type: precision_at_5
value: 8.424
- type: recall_at_1
value: 16.524
- type: recall_at_10
value: 37.969
- type: recall_at_100
value: 59.48
- type: recall_at_1000
value: 81.04599999999999
- type: recall_at_3
value: 26.647
- type: recall_at_5
value: 31.558999999999997
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: cqadupstack/unix
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.273000000000003
- type: map_at_10
value: 35.176
- type: map_at_100
value: 36.367
- type: map_at_1000
value: 36.473
- type: map_at_3
value: 32.583
- type: map_at_5
value: 33.977000000000004
- type: mrr_at_1
value: 30.97
- type: mrr_at_10
value: 39.31
- type: mrr_at_100
value: 40.225
- type: mrr_at_1000
value: 40.284
- type: mrr_at_3
value: 37.111
- type: mrr_at_5
value: 38.296
- type: ndcg_at_1
value: 30.97
- type: ndcg_at_10
value: 40.323
- type: ndcg_at_100
value: 45.725
- type: ndcg_at_1000
value: 48.022
- type: ndcg_at_3
value: 35.772
- type: ndcg_at_5
value: 37.741
- type: precision_at_1
value: 30.97
- type: precision_at_10
value: 6.819
- type: precision_at_100
value: 1.061
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 16.387
- type: precision_at_5
value: 11.437
- type: recall_at_1
value: 26.273000000000003
- type: recall_at_10
value: 51.772
- type: recall_at_100
value: 75.362
- type: recall_at_1000
value: 91.232
- type: recall_at_3
value: 39.172000000000004
- type: recall_at_5
value: 44.147999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: cqadupstack/webmasters
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.326
- type: map_at_10
value: 37.97
- type: map_at_100
value: 39.602
- type: map_at_1000
value: 39.812999999999995
- type: map_at_3
value: 34.838
- type: map_at_5
value: 36.582
- type: mrr_at_1
value: 33.992
- type: mrr_at_10
value: 42.875
- type: mrr_at_100
value: 43.78
- type: mrr_at_1000
value: 43.827
- type: mrr_at_3
value: 40.481
- type: mrr_at_5
value: 41.657
- type: ndcg_at_1
value: 33.992
- type: ndcg_at_10
value: 44.122
- type: ndcg_at_100
value: 49.652
- type: ndcg_at_1000
value: 51.919000000000004
- type: ndcg_at_3
value: 39.285
- type: ndcg_at_5
value: 41.449999999999996
- type: precision_at_1
value: 33.992
- type: precision_at_10
value: 8.32
- type: precision_at_100
value: 1.617
- type: precision_at_1000
value: 0.245
- type: precision_at_3
value: 18.445
- type: precision_at_5
value: 13.281
- type: recall_at_1
value: 28.326
- type: recall_at_10
value: 55.822
- type: recall_at_100
value: 80.352
- type: recall_at_1000
value: 94.441
- type: recall_at_3
value: 41.704
- type: recall_at_5
value: 47.513
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: cqadupstack/wordpress
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.526
- type: map_at_10
value: 30.206
- type: map_at_100
value: 31.142999999999997
- type: map_at_1000
value: 31.246000000000002
- type: map_at_3
value: 27.807
- type: map_at_5
value: 29.236
- type: mrr_at_1
value: 24.399
- type: mrr_at_10
value: 32.515
- type: mrr_at_100
value: 33.329
- type: mrr_at_1000
value: 33.400999999999996
- type: mrr_at_3
value: 30.159999999999997
- type: mrr_at_5
value: 31.482
- type: ndcg_at_1
value: 24.399
- type: ndcg_at_10
value: 34.806
- type: ndcg_at_100
value: 39.669
- type: ndcg_at_1000
value: 42.234
- type: ndcg_at_3
value: 30.144
- type: ndcg_at_5
value: 32.481
- type: precision_at_1
value: 24.399
- type: precision_at_10
value: 5.453
- type: precision_at_100
value: 0.8410000000000001
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 12.815999999999999
- type: precision_at_5
value: 9.057
- type: recall_at_1
value: 22.526
- type: recall_at_10
value: 46.568
- type: recall_at_100
value: 69.56099999999999
- type: recall_at_1000
value: 88.474
- type: recall_at_3
value: 34.205000000000005
- type: recall_at_5
value: 39.885999999999996
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 14.363000000000001
- type: map_at_10
value: 24.101
- type: map_at_100
value: 26.240000000000002
- type: map_at_1000
value: 26.427
- type: map_at_3
value: 20.125
- type: map_at_5
value: 22.128
- type: mrr_at_1
value: 32.182
- type: mrr_at_10
value: 44.711
- type: mrr_at_100
value: 45.523
- type: mrr_at_1000
value: 45.551
- type: mrr_at_3
value: 41.443999999999996
- type: mrr_at_5
value: 43.473
- type: ndcg_at_1
value: 32.182
- type: ndcg_at_10
value: 33.495000000000005
- type: ndcg_at_100
value: 41.192
- type: ndcg_at_1000
value: 44.346000000000004
- type: ndcg_at_3
value: 27.651999999999997
- type: ndcg_at_5
value: 29.634
- type: precision_at_1
value: 32.182
- type: precision_at_10
value: 10.391
- type: precision_at_100
value: 1.8679999999999999
- type: precision_at_1000
value: 0.246
- type: precision_at_3
value: 20.586
- type: precision_at_5
value: 15.648000000000001
- type: recall_at_1
value: 14.363000000000001
- type: recall_at_10
value: 39.706
- type: recall_at_100
value: 65.763
- type: recall_at_1000
value: 83.296
- type: recall_at_3
value: 25.064999999999998
- type: recall_at_5
value: 31.085
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.698
- type: map_at_10
value: 20.237
- type: map_at_100
value: 28.534
- type: map_at_1000
value: 30.346
- type: map_at_3
value: 14.097999999999999
- type: map_at_5
value: 16.567999999999998
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 76.35
- type: mrr_at_100
value: 76.676
- type: mrr_at_1000
value: 76.68
- type: mrr_at_3
value: 74.792
- type: mrr_at_5
value: 75.717
- type: ndcg_at_1
value: 56.25
- type: ndcg_at_10
value: 43.578
- type: ndcg_at_100
value: 47.928
- type: ndcg_at_1000
value: 55.312
- type: ndcg_at_3
value: 47.744
- type: ndcg_at_5
value: 45.257
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 35.275
- type: precision_at_100
value: 10.985
- type: precision_at_1000
value: 2.235
- type: precision_at_3
value: 52.0
- type: precision_at_5
value: 44.45
- type: recall_at_1
value: 8.698
- type: recall_at_10
value: 26.661
- type: recall_at_100
value: 54.686
- type: recall_at_1000
value: 77.795
- type: recall_at_3
value: 15.536
- type: recall_at_5
value: 19.578
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.385000000000005
- type: f1
value: 43.818784352804165
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 75.399
- type: map_at_10
value: 83.02199999999999
- type: map_at_100
value: 83.204
- type: map_at_1000
value: 83.217
- type: map_at_3
value: 81.86
- type: map_at_5
value: 82.677
- type: mrr_at_1
value: 81.233
- type: mrr_at_10
value: 88.10900000000001
- type: mrr_at_100
value: 88.17099999999999
- type: mrr_at_1000
value: 88.172
- type: mrr_at_3
value: 87.289
- type: mrr_at_5
value: 87.897
- type: ndcg_at_1
value: 81.233
- type: ndcg_at_10
value: 86.80600000000001
- type: ndcg_at_100
value: 87.492
- type: ndcg_at_1000
value: 87.71600000000001
- type: ndcg_at_3
value: 84.975
- type: ndcg_at_5
value: 86.158
- type: precision_at_1
value: 81.233
- type: precision_at_10
value: 10.299999999999999
- type: precision_at_100
value: 1.085
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 32.178000000000004
- type: precision_at_5
value: 20.069
- type: recall_at_1
value: 75.399
- type: recall_at_10
value: 93.533
- type: recall_at_100
value: 96.32300000000001
- type: recall_at_1000
value: 97.695
- type: recall_at_3
value: 88.61099999999999
- type: recall_at_5
value: 91.617
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.564
- type: map_at_10
value: 33.162000000000006
- type: map_at_100
value: 35.146
- type: map_at_1000
value: 35.32
- type: map_at_3
value: 28.786
- type: map_at_5
value: 31.22
- type: mrr_at_1
value: 40.278000000000006
- type: mrr_at_10
value: 48.577
- type: mrr_at_100
value: 49.385
- type: mrr_at_1000
value: 49.423
- type: mrr_at_3
value: 46.116
- type: mrr_at_5
value: 47.305
- type: ndcg_at_1
value: 40.278000000000006
- type: ndcg_at_10
value: 40.998000000000005
- type: ndcg_at_100
value: 48.329
- type: ndcg_at_1000
value: 51.148
- type: ndcg_at_3
value: 36.852000000000004
- type: ndcg_at_5
value: 38.146
- type: precision_at_1
value: 40.278000000000006
- type: precision_at_10
value: 11.466
- type: precision_at_100
value: 1.9120000000000001
- type: precision_at_1000
value: 0.242
- type: precision_at_3
value: 24.383
- type: precision_at_5
value: 18.179000000000002
- type: recall_at_1
value: 20.564
- type: recall_at_10
value: 48.327999999999996
- type: recall_at_100
value: 75.89
- type: recall_at_1000
value: 92.826
- type: recall_at_3
value: 33.517
- type: recall_at_5
value: 39.46
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 34.294000000000004
- type: map_at_10
value: 55.435
- type: map_at_100
value: 56.507
- type: map_at_1000
value: 56.57600000000001
- type: map_at_3
value: 51.654999999999994
- type: map_at_5
value: 54.086
- type: mrr_at_1
value: 68.589
- type: mrr_at_10
value: 75.837
- type: mrr_at_100
value: 76.142
- type: mrr_at_1000
value: 76.155
- type: mrr_at_3
value: 74.50099999999999
- type: mrr_at_5
value: 75.339
- type: ndcg_at_1
value: 68.589
- type: ndcg_at_10
value: 63.846000000000004
- type: ndcg_at_100
value: 67.65
- type: ndcg_at_1000
value: 69.015
- type: ndcg_at_3
value: 58.355999999999995
- type: ndcg_at_5
value: 61.489000000000004
- type: precision_at_1
value: 68.589
- type: precision_at_10
value: 13.738
- type: precision_at_100
value: 1.67
- type: precision_at_1000
value: 0.185
- type: precision_at_3
value: 37.736
- type: precision_at_5
value: 25.11
- type: recall_at_1
value: 34.294000000000004
- type: recall_at_10
value: 68.69
- type: recall_at_100
value: 83.477
- type: recall_at_1000
value: 92.465
- type: recall_at_3
value: 56.604
- type: recall_at_5
value: 62.775000000000006
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 75.332
- type: ap
value: 69.58548013224627
- type: f1
value: 75.19505914957745
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 19.373
- type: map_at_10
value: 31.377
- type: map_at_100
value: 32.635
- type: map_at_1000
value: 32.688
- type: map_at_3
value: 27.337
- type: map_at_5
value: 29.608
- type: mrr_at_1
value: 19.900000000000002
- type: mrr_at_10
value: 31.928
- type: mrr_at_100
value: 33.14
- type: mrr_at_1000
value: 33.184999999999995
- type: mrr_at_3
value: 27.955999999999996
- type: mrr_at_5
value: 30.209999999999997
- type: ndcg_at_1
value: 19.900000000000002
- type: ndcg_at_10
value: 38.324000000000005
- type: ndcg_at_100
value: 44.45
- type: ndcg_at_1000
value: 45.728
- type: ndcg_at_3
value: 30.099999999999998
- type: ndcg_at_5
value: 34.157
- type: precision_at_1
value: 19.900000000000002
- type: precision_at_10
value: 6.246
- type: precision_at_100
value: 0.932
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 12.937000000000001
- type: precision_at_5
value: 9.817
- type: recall_at_1
value: 19.373
- type: recall_at_10
value: 59.82300000000001
- type: recall_at_100
value: 88.252
- type: recall_at_1000
value: 97.962
- type: recall_at_3
value: 37.480999999999995
- type: recall_at_5
value: 47.215
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.08800729594162
- type: f1
value: 93.6743110282188
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 77.04742362061104
- type: f1
value: 59.62885599991211
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 75.58170813718897
- type: f1
value: 73.57458347240402
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 79.15601882985877
- type: f1
value: 79.08126473478004
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.551020623875196
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.110159113704523
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.960982592404424
- type: mrr
value: 33.106781262600435
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.679
- type: map_at_10
value: 13.922
- type: map_at_100
value: 17.949
- type: map_at_1000
value: 19.573999999999998
- type: map_at_3
value: 10.061
- type: map_at_5
value: 11.931
- type: mrr_at_1
value: 47.678
- type: mrr_at_10
value: 56.701
- type: mrr_at_100
value: 57.221
- type: mrr_at_1000
value: 57.260999999999996
- type: mrr_at_3
value: 54.334
- type: mrr_at_5
value: 55.85099999999999
- type: ndcg_at_1
value: 45.975
- type: ndcg_at_10
value: 37.117
- type: ndcg_at_100
value: 34.633
- type: ndcg_at_1000
value: 43.498
- type: ndcg_at_3
value: 42.475
- type: ndcg_at_5
value: 40.438
- type: precision_at_1
value: 47.678
- type: precision_at_10
value: 27.647
- type: precision_at_100
value: 9.08
- type: precision_at_1000
value: 2.218
- type: precision_at_3
value: 39.938
- type: precision_at_5
value: 35.17
- type: recall_at_1
value: 5.679
- type: recall_at_10
value: 18.552
- type: recall_at_100
value: 35.799
- type: recall_at_1000
value: 68.029
- type: recall_at_3
value: 11.43
- type: recall_at_5
value: 14.71
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.055999999999997
- type: map_at_10
value: 45.547
- type: map_at_100
value: 46.591
- type: map_at_1000
value: 46.615
- type: map_at_3
value: 40.81
- type: map_at_5
value: 43.673
- type: mrr_at_1
value: 32.763999999999996
- type: mrr_at_10
value: 47.937999999999995
- type: mrr_at_100
value: 48.691
- type: mrr_at_1000
value: 48.705
- type: mrr_at_3
value: 43.984
- type: mrr_at_5
value: 46.467999999999996
- type: ndcg_at_1
value: 32.763999999999996
- type: ndcg_at_10
value: 53.891999999999996
- type: ndcg_at_100
value: 58.167
- type: ndcg_at_1000
value: 58.67099999999999
- type: ndcg_at_3
value: 45.007999999999996
- type: ndcg_at_5
value: 49.805
- type: precision_at_1
value: 32.763999999999996
- type: precision_at_10
value: 9.186
- type: precision_at_100
value: 1.1560000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 21.012
- type: precision_at_5
value: 15.348
- type: recall_at_1
value: 29.055999999999997
- type: recall_at_10
value: 76.864
- type: recall_at_100
value: 95.254
- type: recall_at_1000
value: 98.914
- type: recall_at_3
value: 53.911
- type: recall_at_5
value: 64.982
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.393
- type: map_at_10
value: 83.408
- type: map_at_100
value: 84.071
- type: map_at_1000
value: 84.086
- type: map_at_3
value: 80.372
- type: map_at_5
value: 82.245
- type: mrr_at_1
value: 80.06
- type: mrr_at_10
value: 86.546
- type: mrr_at_100
value: 86.661
- type: mrr_at_1000
value: 86.66199999999999
- type: mrr_at_3
value: 85.56700000000001
- type: mrr_at_5
value: 86.215
- type: ndcg_at_1
value: 80.07
- type: ndcg_at_10
value: 87.372
- type: ndcg_at_100
value: 88.683
- type: ndcg_at_1000
value: 88.78
- type: ndcg_at_3
value: 84.384
- type: ndcg_at_5
value: 85.978
- type: precision_at_1
value: 80.07
- type: precision_at_10
value: 13.345
- type: precision_at_100
value: 1.5350000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 36.973
- type: precision_at_5
value: 24.334
- type: recall_at_1
value: 69.393
- type: recall_at_10
value: 94.994
- type: recall_at_100
value: 99.523
- type: recall_at_1000
value: 99.97399999999999
- type: recall_at_3
value: 86.459
- type: recall_at_5
value: 90.962
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 53.02365304347829
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 60.4722130918676
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.233
- type: map_at_10
value: 10.333
- type: map_at_100
value: 12.286
- type: map_at_1000
value: 12.594
- type: map_at_3
value: 7.514
- type: map_at_5
value: 8.774
- type: mrr_at_1
value: 20.9
- type: mrr_at_10
value: 31.232
- type: mrr_at_100
value: 32.287
- type: mrr_at_1000
value: 32.352
- type: mrr_at_3
value: 27.766999999999996
- type: mrr_at_5
value: 29.487000000000002
- type: ndcg_at_1
value: 20.9
- type: ndcg_at_10
value: 17.957
- type: ndcg_at_100
value: 25.526
- type: ndcg_at_1000
value: 31.097
- type: ndcg_at_3
value: 16.915
- type: ndcg_at_5
value: 14.579
- type: precision_at_1
value: 20.9
- type: precision_at_10
value: 9.41
- type: precision_at_100
value: 2.032
- type: precision_at_1000
value: 0.337
- type: precision_at_3
value: 15.767000000000001
- type: precision_at_5
value: 12.659999999999998
- type: recall_at_1
value: 4.233
- type: recall_at_10
value: 19.067999999999998
- type: recall_at_100
value: 41.257
- type: recall_at_1000
value: 68.487
- type: recall_at_3
value: 9.618
- type: recall_at_5
value: 12.853
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_spearman
value: 82.25303886615637
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_spearman
value: 78.27678362978094
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_spearman
value: 85.5228883863618
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_spearman
value: 82.48847836687274
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_spearman
value: 88.76235312662311
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_spearman
value: 87.10893533398001
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_spearman
value: 90.10224405448504
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_spearman
value: 68.25088774601221
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_spearman
value: 87.15751321128134
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.23418699664575
- type: mrr
value: 93.72032288698955
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 56.511
- type: map_at_10
value: 67.062
- type: map_at_100
value: 67.537
- type: map_at_1000
value: 67.553
- type: map_at_3
value: 63.375
- type: map_at_5
value: 65.828
- type: mrr_at_1
value: 59.333000000000006
- type: mrr_at_10
value: 67.95
- type: mrr_at_100
value: 68.284
- type: mrr_at_1000
value: 68.30000000000001
- type: mrr_at_3
value: 65.0
- type: mrr_at_5
value: 66.93299999999999
- type: ndcg_at_1
value: 59.333000000000006
- type: ndcg_at_10
value: 72.08099999999999
- type: ndcg_at_100
value: 74.232
- type: ndcg_at_1000
value: 74.657
- type: ndcg_at_3
value: 65.72200000000001
- type: ndcg_at_5
value: 69.395
- type: precision_at_1
value: 59.333000000000006
- type: precision_at_10
value: 9.8
- type: precision_at_100
value: 1.097
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.444
- type: precision_at_5
value: 17.533
- type: recall_at_1
value: 56.511
- type: recall_at_10
value: 86.63300000000001
- type: recall_at_100
value: 96.667
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 70.217
- type: recall_at_5
value: 78.806
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83861386138614
- type: cos_sim_ap
value: 96.24728474711715
- type: cos_sim_f1
value: 91.76351692774129
- type: cos_sim_precision
value: 92.74770173646579
- type: cos_sim_recall
value: 90.8
- type: dot_accuracy
value: 99.62475247524752
- type: dot_ap
value: 88.12302791709324
- type: dot_f1
value: 81.0187409899087
- type: dot_precision
value: 77.98334875115633
- type: dot_recall
value: 84.3
- type: euclidean_accuracy
value: 99.83465346534653
- type: euclidean_ap
value: 95.79574410387337
- type: euclidean_f1
value: 91.56139464375947
- type: euclidean_precision
value: 92.54341164453524
- type: euclidean_recall
value: 90.60000000000001
- type: manhattan_accuracy
value: 99.84059405940594
- type: manhattan_ap
value: 95.81230332276807
- type: manhattan_f1
value: 91.80661577608143
- type: manhattan_precision
value: 93.47150259067357
- type: manhattan_recall
value: 90.2
- type: max_accuracy
value: 99.84059405940594
- type: max_ap
value: 96.24728474711715
- type: max_f1
value: 91.80661577608143
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 63.035694955649866
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.00935398440242
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.61138657342161
- type: mrr
value: 50.26590749936338
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.994071916424655
- type: cos_sim_spearman
value: 30.010135460886296
- type: dot_pearson
value: 27.03290596322524
- type: dot_spearman
value: 28.824264579690357
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.247
- type: map_at_10
value: 2.01
- type: map_at_100
value: 12.912
- type: map_at_1000
value: 32.35
- type: map_at_3
value: 0.6859999999999999
- type: map_at_5
value: 1.089
- type: mrr_at_1
value: 92.0
- type: mrr_at_10
value: 95.25
- type: mrr_at_100
value: 95.25
- type: mrr_at_1000
value: 95.25
- type: mrr_at_3
value: 95.0
- type: mrr_at_5
value: 95.0
- type: ndcg_at_1
value: 88.0
- type: ndcg_at_10
value: 80.411
- type: ndcg_at_100
value: 63.871
- type: ndcg_at_1000
value: 58.145
- type: ndcg_at_3
value: 84.75399999999999
- type: ndcg_at_5
value: 82.372
- type: precision_at_1
value: 92.0
- type: precision_at_10
value: 84.8
- type: precision_at_100
value: 65.84
- type: precision_at_1000
value: 25.874000000000002
- type: precision_at_3
value: 90.0
- type: precision_at_5
value: 88.0
- type: recall_at_1
value: 0.247
- type: recall_at_10
value: 2.185
- type: recall_at_100
value: 16.051000000000002
- type: recall_at_1000
value: 55.18300000000001
- type: recall_at_3
value: 0.701
- type: recall_at_5
value: 1.1360000000000001
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.094
- type: map_at_10
value: 9.078
- type: map_at_100
value: 15.152
- type: map_at_1000
value: 16.773
- type: map_at_3
value: 4.67
- type: map_at_5
value: 6.111
- type: mrr_at_1
value: 24.490000000000002
- type: mrr_at_10
value: 39.989000000000004
- type: mrr_at_100
value: 41.248000000000005
- type: mrr_at_1000
value: 41.248000000000005
- type: mrr_at_3
value: 37.075
- type: mrr_at_5
value: 38.503
- type: ndcg_at_1
value: 21.429000000000002
- type: ndcg_at_10
value: 22.312
- type: ndcg_at_100
value: 35.077999999999996
- type: ndcg_at_1000
value: 46.903
- type: ndcg_at_3
value: 24.241
- type: ndcg_at_5
value: 21.884
- type: precision_at_1
value: 24.490000000000002
- type: precision_at_10
value: 20.816000000000003
- type: precision_at_100
value: 7.673000000000001
- type: precision_at_1000
value: 1.569
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 22.857
- type: recall_at_1
value: 2.094
- type: recall_at_10
value: 15.546
- type: recall_at_100
value: 47.764
- type: recall_at_1000
value: 84.461
- type: recall_at_3
value: 5.994
- type: recall_at_5
value: 8.967
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.92240000000001
- type: ap
value: 14.16088899225379
- type: f1
value: 54.04609416028299
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 60.764006791171475
- type: f1
value: 61.06042158638947
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.37015403955057
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.8510460749836
- type: cos_sim_ap
value: 76.13675917697662
- type: cos_sim_f1
value: 69.72121212121213
- type: cos_sim_precision
value: 64.48430493273543
- type: cos_sim_recall
value: 75.8839050131926
- type: dot_accuracy
value: 82.2793109614353
- type: dot_ap
value: 61.68231214221829
- type: dot_f1
value: 59.873802290254716
- type: dot_precision
value: 53.73322147651006
- type: dot_recall
value: 67.59894459102902
- type: euclidean_accuracy
value: 86.78548012159504
- type: euclidean_ap
value: 75.72625794456354
- type: euclidean_f1
value: 70.13506753376687
- type: euclidean_precision
value: 66.66666666666666
- type: euclidean_recall
value: 73.98416886543535
- type: manhattan_accuracy
value: 86.78548012159504
- type: manhattan_ap
value: 75.68264053123454
- type: manhattan_f1
value: 70.11952191235059
- type: manhattan_precision
value: 66.38378123526638
- type: manhattan_recall
value: 74.30079155672823
- type: max_accuracy
value: 86.8510460749836
- type: max_ap
value: 76.13675917697662
- type: max_f1
value: 70.13506753376687
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.20712539294446
- type: cos_sim_ap
value: 86.227146559573
- type: cos_sim_f1
value: 78.8050795036932
- type: cos_sim_precision
value: 74.7085201793722
- type: cos_sim_recall
value: 83.37696335078533
- type: dot_accuracy
value: 86.59525749990297
- type: dot_ap
value: 79.7714972191685
- type: dot_f1
value: 73.45451896105789
- type: dot_precision
value: 69.70891239715135
- type: dot_recall
value: 77.62550046196489
- type: euclidean_accuracy
value: 88.92575775216362
- type: euclidean_ap
value: 85.58942167175054
- type: euclidean_f1
value: 78.03423522915516
- type: euclidean_precision
value: 74.76193835084996
- type: euclidean_recall
value: 81.60609793655682
- type: manhattan_accuracy
value: 88.92769821865176
- type: manhattan_ap
value: 85.58316068024254
- type: manhattan_f1
value: 78.03337843933242
- type: manhattan_precision
value: 76.23384253819037
- type: manhattan_recall
value: 79.91992608561749
- type: max_accuracy
value: 89.20712539294446
- type: max_ap
value: 86.227146559573
- type: max_f1
value: 78.8050795036932
---
# LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders
> LLM2Vec is a simple recipe to convert decoder-only LLMs into text encoders. It consists of three steps: 1) enabling bidirectional attention, 2) masked next token prediction, and 3) unsupervised contrastive learning. The model can be further fine-tuned to achieve state-of-the-art performance.
- **Repository:** https://github.com/McGill-NLP/llm2vec
- **Paper:** https://arxiv.org/abs/2404.05961
## Installation
```bash
pip install llm2vec
```
## Usage
```python
from llm2vec import LLM2Vec
import torch
from transformers import AutoTokenizer, AutoModel, AutoConfig
from peft import PeftModel
# Loading the base Sheared-LLaMA model, along with custom code that enables bidirectional attention in decoder-only LLMs. The MNTP LoRA weights are merged into the base model.
tokenizer = AutoTokenizer.from_pretrained(
"McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp"
)
config = AutoConfig.from_pretrained(
"McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp", trust_remote_code=True
)
model = AutoModel.from_pretrained(
"McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp",
trust_remote_code=True,
config=config,
torch_dtype=torch.bfloat16,
device_map="cuda" if torch.cuda.is_available() else "cpu",
)
model = PeftModel.from_pretrained(
model,
"McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp",
)
model = model.merge_and_unload() # This can take several minutes on cpu
# Loading the supervised model. This loads the trained LoRA weights on top of the MNTP model, so the final weights are: base model + MNTP (LoRA) + supervised (LoRA).
model = PeftModel.from_pretrained(
model, "McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-supervised"
)
# Wrapper for encoding and pooling operations
l2v = LLM2Vec(model, tokenizer, pooling_mode="mean", max_length=512)
# Encoding queries using instructions
instruction = (
"Given a web search query, retrieve relevant passages that answer the query:"
)
queries = [
[instruction, "how much protein should a female eat"],
[instruction, "summit define"],
]
q_reps = l2v.encode(queries)
# Encoding documents. Instructions are not required for documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
d_reps = l2v.encode(documents)
# Compute cosine similarity
q_reps_norm = torch.nn.functional.normalize(q_reps, p=2, dim=1)
d_reps_norm = torch.nn.functional.normalize(d_reps, p=2, dim=1)
cos_sim = torch.mm(q_reps_norm, d_reps_norm.transpose(0, 1))
print(cos_sim)
"""
tensor([[0.6500, 0.1291],
[0.0916, 0.4733]])
"""
```
## Questions
If you have any questions about the code, feel free to email Parishad (`[email protected]`) and Vaibhav (`[email protected]`). | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
TheBloke/meditron-7B-GGUF | TheBloke | null | [
"transformers",
"gguf",
"llama",
"en",
"dataset:epfl-llm/guidelines",
"arxiv:2311.16079",
"base_model:epfl-llm/meditron-7b",
"base_model:quantized:epfl-llm/meditron-7b",
"license:llama2",
"region:us"
] | 2023-11-30T22:11:31 | 2023-11-30T22:15:54 | 1,130 | 21 | ---
base_model: epfl-llm/meditron-7b
datasets:
- epfl-llm/guidelines
language:
- en
license: llama2
metrics:
- accuracy
- perplexity
model_name: Meditron 7B
inference: false
model_creator: EPFL LLM Team
model_type: llama
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Meditron 7B - GGUF
- Model creator: [EPFL LLM Team](https://huggingface.co/epfl-llm)
- Original model: [Meditron 7B](https://huggingface.co/epfl-llm/meditron-7b)
<!-- description start -->
## Description
This repo contains GGUF format model files for [EPFL LLM Team's Meditron 7B](https://huggingface.co/epfl-llm/meditron-7b).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note: as of the time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/meditron-7B-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/meditron-7B-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/meditron-7B-GGUF)
* [EPFL LLM Team's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/epfl-llm/meditron-7b)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
<!-- prompt-template end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
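As a rough sanity check on those bits-per-weight figures, the sketch below (an editorial illustration, not part of llama.cpp) recomputes the effective bpw from the block layouts described above, assuming each 256-weight super-block also stores one fp16 scale, plus one fp16 min for the "type-1" variants (Q2_K packs its metadata slightly differently, so it is omitted here):

```python
# Back-of-envelope bpw check for the k-quant layouts described above.
# Assumption (not from this card): each 256-weight super-block also stores one
# fp16 scale, plus one fp16 min for the "type-1" variants.

def bpw(weight_bits, blocks, block_size, sub_scale_bits, type1):
    weights = blocks * block_size                           # weights per super-block
    bits = weights * weight_bits                            # quantised weights
    bits += blocks * sub_scale_bits * (2 if type1 else 1)   # per-block scales (+ mins)
    bits += 16 * (2 if type1 else 1)                        # fp16 super-block scale (+ min)
    return bits / weights

print(bpw(3, 16, 16, 6, type1=False))  # Q3_K -> 3.4375
print(bpw(4, 8, 32, 6, type1=True))    # Q4_K -> 4.5
print(bpw(5, 8, 32, 6, type1=True))    # Q5_K -> 5.5
print(bpw(6, 16, 16, 8, type1=False))  # Q6_K -> 6.5625
```

Running it reproduces the 3.4375, 4.5, 5.5 and 6.5625 bpw figures quoted above.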
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [meditron-7b.Q2_K.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q2_K.gguf) | Q2_K | 2 | 2.83 GB| 5.33 GB | smallest, significant quality loss - not recommended for most purposes |
| [meditron-7b.Q3_K_S.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q3_K_S.gguf) | Q3_K_S | 3 | 2.95 GB| 5.45 GB | very small, high quality loss |
| [meditron-7b.Q3_K_M.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q3_K_M.gguf) | Q3_K_M | 3 | 3.30 GB| 5.80 GB | very small, high quality loss |
| [meditron-7b.Q3_K_L.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q3_K_L.gguf) | Q3_K_L | 3 | 3.60 GB| 6.10 GB | small, substantial quality loss |
| [meditron-7b.Q4_0.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q4_0.gguf) | Q4_0 | 4 | 3.83 GB| 6.33 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [meditron-7b.Q4_K_S.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q4_K_S.gguf) | Q4_K_S | 4 | 3.86 GB| 6.36 GB | small, greater quality loss |
| [meditron-7b.Q4_K_M.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q4_K_M.gguf) | Q4_K_M | 4 | 4.08 GB| 6.58 GB | medium, balanced quality - recommended |
| [meditron-7b.Q5_0.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q5_0.gguf) | Q5_0 | 5 | 4.65 GB| 7.15 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [meditron-7b.Q5_K_S.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q5_K_S.gguf) | Q5_K_S | 5 | 4.65 GB| 7.15 GB | large, low quality loss - recommended |
| [meditron-7b.Q5_K_M.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q5_K_M.gguf) | Q5_K_M | 5 | 4.78 GB| 7.28 GB | large, very low quality loss - recommended |
| [meditron-7b.Q6_K.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q6_K.gguf) | Q6_K | 6 | 5.53 GB| 8.03 GB | very large, extremely low quality loss |
| [meditron-7b.Q8_0.gguf](https://huggingface.co/TheBloke/meditron-7B-GGUF/blob/main/meditron-7b.Q8_0.gguf) | Q8_0 | 8 | 7.16 GB| 9.66 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/meditron-7B-GGUF and below it, a specific filename to download, such as: meditron-7b.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/meditron-7B-GGUF meditron-7b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/meditron-7B-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/meditron-7B-GGUF meditron-7b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m meditron-7b.Q4_K_M.gguf --color -c 2048 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 2048` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; eg for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set n_gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./meditron-7b.Q4_K_M.gguf", # Download the model file first
n_ctx=2048, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./meditron-7b.Q4_K_M.gguf", chat_format="chatml") # Set chat_format according to the model you are using; ChatML matches the prompt template for this model
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
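As a quick-start illustration (not taken from the guides above), a minimal LangChain + llama-cpp-python sketch might look like this; the import path, file path and parameters are assumptions to adapt to your setup:

```python
# Minimal LangChain + llama-cpp-python sketch (illustrative; adapt paths/params).
from langchain_community.llms import LlamaCpp  # older releases: from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./meditron-7b.Q4_K_M.gguf",  # download the GGUF file first
    n_ctx=2048,        # context length, as in the llama.cpp example above
    n_gpu_layers=35,   # layers to offload to GPU; set to 0 for CPU-only
    temperature=0.7,
    max_tokens=256,
)

prompt = (
    "<|im_start|>system\nYou are a helpful medical assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat are common symptoms of anaemia?<|im_end|>\n"
    "<|im_start|>assistant"
)
print(llm.invoke(prompt))
```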
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Brandon Frisco, LangChain4j, Spiking Neurons AB, transmissions 11, Joseph William Delisle, Nitin Borwankar, Willem Michiel, Michael Dempsey, vamX, Jeffrey Morgan, zynix, jjj, Omer Bin Jawed, Sean Connelly, jinyuan sun, Jeromy Smith, Shadi, Pawan Osman, Chadd, Elijah Stavena, Illia Dulskyi, Sebastain Graf, Stephen Murray, terasurfer, Edmond Seymore, Celu Ramasamy, Mandus, Alex, biorpg, Ajan Kanaga, Clay Pascal, Raven Klaugh, 阿明, K, ya boyyy, usrbinkat, Alicia Loh, John Villwock, ReadyPlayerEmma, Chris Smitley, Cap'n Zoog, fincy, GodLy, S_X, sidney chen, Cory Kujawski, OG, Mano Prime, AzureBlack, Pieter, Kalila, Spencer Kim, Tom X Nguyen, Stanislav Ovsiannikov, Michael Levine, Andrey, Trailburnt, Vadim, Enrico Ros, Talal Aujan, Brandon Phillips, Jack West, Eugene Pentland, Michael Davis, Will Dee, webtim, Jonathan Leane, Alps Aficionado, Rooh Singh, Tiffany J. Kim, theTransient, Luke @flexchar, Elle, Caitlyn Gatomon, Ari Malik, subjectnull, Johann-Peter Hartmann, Trenton Dambrowitz, Imad Khwaja, Asp the Wyvern, Emad Mostaque, Rainer Wilmers, Alexandros Triantafyllidis, Nicholas, Pedro Madruga, SuperWojo, Harry Royden McLaughlin, James Bentley, Olakabola, David Ziegler, Ai Maven, Jeff Scroggin, Nikolai Manek, Deo Leter, Matthew Berman, Fen Risland, Ken Nordquist, Manuel Alberto Morcote, Luke Pendergrass, TL, Fred von Graf, Randy H, Dan Guido, NimbleBox.ai, Vitor Caleffi, Gabriel Tamborski, knownsqashed, Lone Striker, Erik Bjäreholt, John Detwiler, Leonard Tan, Iucharbius
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: EPFL LLM Team's Meditron 7B
<img width=50% src="meditron_LOGO.png" alt="Alt text" title="Meditron-logo">
# Model Card for Meditron-7B-v1.0
Meditron is a suite of open-source medical Large Language Models (LLMs).
Meditron-7B is a 7 billion parameters model adapted to the medical domain from Llama-2-7B through continued pretraining on a comprehensively curated medical corpus, including selected PubMed articles, abstracts, a [new dataset](https://huggingface.co/datasets/epfl-llm/guidelines) of internationally-recognized medical guidelines, and general domain data from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T).
Meditron-7B, finetuned on relevant training data, outperforms Llama-2-7B and PMC-Llama on multiple medical reasoning tasks.
<details open>
<summary><strong>Advisory Notice</strong></summary>
<blockquote style="padding: 10px; margin: 0 0 10px; border-left: 5px solid #ddd;">
While Meditron is designed to encode medical knowledge from sources of high-quality evidence, it is not yet adapted to deliver this knowledge appropriately, safely, or within professional actionable constraints.
We recommend against deploying Meditron in medical applications without extensive use-case alignment, as well as additional testing, specifically including randomized controlled trials in real-world practice settings.
</blockquote>
</details>
## Model Details
- **Developed by:** [EPFL LLM Team](https://huggingface.co/epfl-llm)
- **Model type:** Causal decoder-only transformer language model
- **Language(s):** English (mainly)
- **Model License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Code License:** [APACHE 2.0 LICENSE](LICENSE)
- **Continue-pretrained from model:** [Llama-2-7B](https://huggingface.co/meta-llama/Llama-2-7b)
- **Context length:** 2K tokens
- **Input:** Text-only data
- **Output:** Model generates text only
- **Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
- **Knowledge Cutoff:** August 2023
### Model Sources
- **Repository:** [epflLLM/meditron](https://github.com/epfLLM/meditron)
- **Trainer:** [epflLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Paper:** *[MediTron-70B: Scaling Medical Pretraining for Large Language Models](https://arxiv.org/abs/2311.16079)*
## Uses
Meditron-7B is being made available for further testing and assessment as an AI assistant to enhance clinical decision-making and to broaden access to an LLM for healthcare use. Potential use cases may include, but are not limited to:
- Medical exam question answering
- Supporting differential diagnosis
- Disease information (symptoms, cause, treatment) query
- General health information query
### Direct Use
It is possible to use this model to generate text, which is useful for experimentation and understanding its capabilities.
It should not be used directly for production or work that may impact people.
### Downstream Use
Meditron-7B is a foundation model that can be finetuned, instruction-tuned, or RLHF-tuned for specific downstream tasks and applications.
The main way we have used this model is finetuning for downstream question-answering tasks, but we encourage using this model for additional applications.
Specific formatting needs to be followed to prompt our finetuned models, including the `<|im_start|>`, `<|im_end|>` tags, and `system`, `question`, `answer` identifiers.
"""
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>question
{prompt}<|im_end|>
<|im_start|>answer
"""
**Note 1**: The above formatting is not required for running the base model (this repository).
**Note 2**: The above formatting is just an example of a finetuning template; it is not a requirement if you use your own formatting when finetuning the model.
To run proper generation with this base model, we recommend using a high-throughput and memory-efficient inference engine, such as [vLLM](https://github.com/vllm-project/vllm), with a UI that supports chat and text generation, such as [BetterChatGPT](https://github.com/ztjhz/BetterChatGPT)
To see more details about model deployment and generation, please see our [documentation](https://github.com/epfLLM/meditron/blob/main/deployment/README.md).
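As a rough illustration of the vLLM route, a minimal generation sketch might look like the following (the model id, prompt, and sampling settings are assumptions, and this sketch uses the original unquantised `epfl-llm/meditron-7b` weights rather than quantised files):

```python
from vllm import LLM, SamplingParams

# Sketch only: assumes access to the epfl-llm/meditron-7b weights on the Hub or locally.
llm = LLM(model="epfl-llm/meditron-7b")
sampling_params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=256)

prompts = ["What are the common symptoms of type 2 diabetes?"]
for output in llm.generate(prompts, sampling_params):
    print(output.outputs[0].text)
```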
### Out-of-Scope Use
We do not recommend using this model for natural language generation in a production environment, finetuned or otherwise.
## Truthfulness, Helpfulness, Risk, and Bias
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
We did an initial assessment of Meditron models' **Truthfulness** against baseline models and consumer-level medical models.
We use TruthfulQA (multiple choice) as the main evaluation benchmark.
We only focus on the categories that are relevant to the medical domain, including Health, Nutrition, Psychology, and Science.
For 7B models, we perform one-shot evaluations for consistent answer generation.
For 70B models, the evaluations are under the zero-shot setting.
Below, we report the detailed truthfulness performance of each category.
| Category | meditron-70b | llama-2-70b | med42-70b* | meditron-7b | llama-2-7b | PMC-llama-7b |
| --- | --- | --- | --- | --- | --- | --- |
| Health | 81.8 | 69.1 | 83.6 | 27.3 | 16.4 | 3.6 |
| Nutrition | 77.9 | 68.8 | 62.5 | 31.1 | 12.5 | 6.3 |
| Psychology | 47.4 | 36.8 | 52.6 | 21.1 | 10.5 | 0.0 |
| Science | 77.8 | 44.4 | 33.3 | 33.3 | 11.1 | 0.0 |
| Avg | 71.2 | 54.8 | 58.0 | 28.3 | 12.6 | 2.5 |
For a more detailed performance analysis, please see our paper.
Significant research is still required to fully explore potential bias, fairness, and safety issues with this language model.
Please recognize that our evaluation of Meditron-7B's helpfulness, risk, and bias is highly limited.
Thus, as we noted in the safety notice, we strongly advise against any deployment in medical applications without a further alignment process and rigorous evaluation!
### Recommendations
**IMPORTANT!**
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.
While this model is capable of generating natural language text, we have only begun to explore this capability and its limitations.
Understanding these limitations is especially important in a domain like medicine.
Therefore, we strongly recommend against using this model in production for natural language generation or for professional purposes related to health and medicine.
## Training Details
### Training Data
Meditron’s domain-adaptive pre-training corpus GAP-Replay combines 48.1B tokens from four corpora:
- [**Clinical Guidelines**](https://huggingface.co/datasets/epfl-llm/guidelines): a new dataset of 46K internationally-recognized clinical practice guidelines from various healthcare-related sources, including hospitals and international organizations.
- **Medical Paper Abstracts**: 16.1M abstracts extracted from closed-access PubMed and PubMed Central papers.
- **Medical Papers**: full-text articles extracted from 5M publicly available PubMed and PubMed Central papers.
- **Replay Data**: 400M tokens of general domain pretraining data sampled from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
<img width=75% src="gap-replay.png" alt="Alt text" title="Meditron-logo">
#### Data Preprocessing
Please see the detailed preprocessing procedure in our paper.
### Training Procedure
We used the [Megatron-LLM](https://github.com/epfLLM/Megatron-LLM) distributed training library, a derivative of Nvidia's Megatron LM project, to optimize training efficiency.
Hardware consists of 1 node of 8x NVIDIA A100 (80GB) SXM GPUs connected by NVLink and NVSwitch with a single Nvidia ConnectX-6 DX network card and equipped with 2 x AMD EPYC 7543 32-Core Processors and 512 GB of RAM.
Our three-way parallelism scheme uses:
- Data Parallelism (DP -- different GPUs process different subsets of the batches) of 2,
- Pipeline Parallelism (PP -- different GPUs process different layers) of 4,
- Tensor Parallelism (TP -- different GPUs process different subtensors for matrix multiplication) of 1.
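As a quick sanity check, the three degrees listed above multiply to the eight GPUs of the node described earlier:

```python
# Illustration only: the product of the parallelism degrees equals the GPU count.
dp, pp, tp = 2, 4, 1          # data, pipeline, and tensor parallel degrees from the list above
world_size = dp * pp * tp
assert world_size == 8        # one node of 8x A100, as described above
```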
#### Training Hyperparameters
| Hyperparameter | Value |
| --- | --- |
| bf16 | true |
| lr | 3e-4 |
| eps | 1e-5 |
| betas | \[0.9, 0.95\] |
| clip_grad | 1 |
| weight decay | 0.1 |
| DP size | 16 |
| TP size | 4 |
| PP size | 1 |
| seq length | 2048 |
| lr scheduler | cosine |
| min lr | 1e-6 |
| warmup iteration | 2000 |
| micro batch size | 10 |
| global batch size | 1600 |
#### Sizes
The model was trained in September 2023.
The model architecture is exactly that of Llama 2:
| Architecture parameter | Value |
| --- | --- |
| Model size | 7B |
| Hidden dimension | 4096 |
| Num. attention heads | 32 |
| Num. layers | 32 |
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data & Metrics
#### Testing Data
- [MedQA (USMLE)](https://huggingface.co/datasets/bigbio/med_qa)
- [MedMCQA](https://huggingface.co/datasets/medmcqa)
- [PubMedQA](https://huggingface.co/datasets/bigbio/pubmed_qa)
- [MMLU-Medical](https://huggingface.co/datasets/lukaemon/mmlu)
- [MedQA-4-Option](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
#### Metrics
- Accuracy: suited to the evaluation of multiple-choice question-answering tasks.
### Results
We finetune meditron-7b, llama-2-7b, and pmc-llama-7b individually on the training data of each benchmark (pubmedqa, medmcqa, medqa).
We report the finetuned models' performance with top token selection as the inference mode.
For MMLU-Medical, models finetuned on MedMCQA are used for inference.
For MedQA-4-Option, models finetuned on MedQA are used for inference.
For a more detailed performance analysis, please see our paper.
| Dataset | meditron-7b | llama-2-7b | pmc-llama-7b | Zephyr-7B-beta* | Mistral-7B-instruct* |
| --- | --- | --- | --- | --- | --- |
| MMLU-Medical | 54.2 | 53.7 | 56.4 | 63.3 | 60.0 |
| PubMedQA | 74.4 | 61.8 | 59.2 | 46.0 | 17.8 |
| MedMCQA | 59.2 | 54.4 | 57.6 | 43.0 | 40.2 |
| MedQA | 47.9 | 44.0 | 42.4 | 42.8 | 32.4 |
| MedQA-4-Option | 52.0 | 49.6 | 49.2 | 48.5 | 41.1 |
| Avg | 57.5 | 52.7 | 53.0 | 48.7 | 38.3 |
**Note**: models with * are already instruction-tuned, so we exclude them from further finetuning on any training data.
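The "top token selection" inference mode mentioned above amounts to picking the answer option whose token receives the highest next-token score; a minimal sketch of that idea (the prompt format, option letters, and model id here are assumptions, not the paper's exact evaluation code) is:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Sketch of top-token selection for multiple-choice QA (assumed protocol).
model_name = "epfl-llm/meditron-7b"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

def predict_choice(question_with_options: str, letters=("A", "B", "C", "D")) -> str:
    # Score each answer letter by its logit at the next position and return the argmax.
    input_ids = tokenizer(question_with_options + "\nAnswer:", return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        next_token_logits = model(input_ids).logits[0, -1]
    letter_ids = [tokenizer.encode(" " + letter, add_special_tokens=False)[-1] for letter in letters]
    best = max(range(len(letters)), key=lambda i: next_token_logits[letter_ids[i]].item())
    return letters[best]
```

Accuracy is then simply the fraction of questions for which the selected letter matches the gold answer.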
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
- **Hardware Type:** 8 x NVIDIA A100 (80GB) SXM
- **Total GPU hours:** 588.8
- **Hardware Provider:** EPFL Research Computing Platform
- **Compute Region:** Switzerland
- **Carbon Emitted:** Switzerland has a carbon efficiency of 0.016 kgCO2/kWh (https://www.carbonfootprint.com/docs/2018_8_electricity_factors_august_2018_-_online_sources.pdf). 73.6 hours of 8 A100s means 588.8 GPU-hours at a TDP of 400W. Assuming a Power Usage Effectiveness (PUE) of 1.8, total emissions are estimated to be:
(0.4 kW/GPU * 0.016 kgCO2/kWh * 73.6 h * 8 GPUs) * 1.8 PUE ≈ 6.8 kgCO2.
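The same estimate can be reproduced numerically, for example (constants taken directly from the text above):

```python
# Per-GPU power x carbon intensity x wall-clock hours x GPU count x PUE.
kw_per_gpu = 400 / 1000          # 400 W TDP per A100, in kW
carbon_intensity = 0.016         # kgCO2 per kWh in Switzerland
hours, gpus, pue = 73.6, 8, 1.8
emissions_kg = kw_per_gpu * carbon_intensity * hours * gpus * pue
print(f"{emissions_kg:.1f} kgCO2")   # ~6.8 kgCO2
```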
## Citation
**BibTeX:**
If you use Meditron or its training data, please cite our work:
```
@misc{chen2023meditron70b,
title={MEDITRON-70B: Scaling Medical Pretraining for Large Language Models},
author={Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
year={2023},
eprint={2311.16079},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@software{epfmedtrn,
author = {Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
title = {MediTron-70B: Scaling Medical Pretraining for Large Language Models},
month = {November},
year = 2023,
url = {https://github.com/epfLLM/meditron}
}
```
<!-- original-model-card end -->
| [
"QUESTION_ANSWERING"
] | [
"MEDQA",
"PUBMEDQA"
] |
TheBloke/med42-70B-GGUF | TheBloke | text-generation | [
"transformers",
"gguf",
"llama",
"m42",
"health",
"healthcare",
"clinical-llm",
"text-generation",
"en",
"base_model:m42-health/med42-70b",
"base_model:quantized:m42-health/med42-70b",
"license:other",
"region:us"
] | 2023-10-27T21:16:14 | 2023-10-27T23:04:48 | 1,125 | 21 | ---
base_model: m42-health/med42-70b
language:
- en
license: other
license_name: med42
model_name: Med42 70B
pipeline_tag: text-generation
tags:
- m42
- health
- healthcare
- clinical-llm
inference: false
model_creator: M42 Health
model_type: llama
prompt_template: '<|system|>: You are a helpful medical assistant created by M42 Health
in the UAE.
<|prompter|>:{prompt}
<|assistant|>:
'
quantized_by: TheBloke
---
<!-- markdownlint-disable MD041 -->
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Med42 70B - GGUF
- Model creator: [M42 Health](https://huggingface.co/m42-health)
- Original model: [Med42 70B](https://huggingface.co/m42-health/med42-70b)
<!-- description start -->
## Description
This repo contains GGUF format model files for [M42 Health's Med42 70B](https://huggingface.co/m42-health/med42-70b).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/med42-70B-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/med42-70B-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/med42-70B-GGUF)
* [M42 Health's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/m42-health/med42-70b)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Med42
```
<|system|>: You are a helpful medical assistant created by M42 Health in the UAE.
<|prompter|>:{prompt}
<|assistant|>:
```
<!-- prompt-template end -->
<!-- licensing start -->
## Licensing
The creator of the source model has listed its license as `other`, and this quantization has therefore used that same license.
As this model is based on Llama 2, it is also subject to the Meta Llama 2 license terms, and the license files for that are additionally included. It should therefore be considered as being claimed to be licensed under both licenses. I contacted Hugging Face for clarification on dual licensing but they do not yet have an official position. Should this change, or should Meta provide any feedback on this situation, I will update this section accordingly.
In the meantime, any questions regarding licensing, and in particular how these two licenses might interact, should be directed to the original model repository: [M42 Health's Med42 70B](https://huggingface.co/m42-health/med42-70b).
<!-- licensing end -->
<!-- compatibility_gguf start -->
## Compatibility
These quantised GGUFv2 files are compatible with llama.cpp from August 27th onwards, as of commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221)
They are also compatible with many third party UIs and libraries - please see the list at the top of this README.
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw
Refer to the Provided Files table below to see what files use which methods, and how.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [med42-70b.Q2_K.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q2_K.gguf) | Q2_K | 2 | 29.28 GB| 31.78 GB | smallest, significant quality loss - not recommended for most purposes |
| [med42-70b.Q3_K_S.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q3_K_S.gguf) | Q3_K_S | 3 | 29.92 GB| 32.42 GB | very small, high quality loss |
| [med42-70b.Q3_K_M.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q3_K_M.gguf) | Q3_K_M | 3 | 33.19 GB| 35.69 GB | very small, high quality loss |
| [med42-70b.Q3_K_L.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q3_K_L.gguf) | Q3_K_L | 3 | 36.15 GB| 38.65 GB | small, substantial quality loss |
| [med42-70b.Q4_0.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q4_0.gguf) | Q4_0 | 4 | 38.87 GB| 41.37 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [med42-70b.Q4_K_S.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q4_K_S.gguf) | Q4_K_S | 4 | 39.07 GB| 41.57 GB | small, greater quality loss |
| [med42-70b.Q4_K_M.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q4_K_M.gguf) | Q4_K_M | 4 | 41.42 GB| 43.92 GB | medium, balanced quality - recommended |
| [med42-70b.Q5_0.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q5_0.gguf) | Q5_0 | 5 | 47.46 GB| 49.96 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [med42-70b.Q5_K_S.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q5_K_S.gguf) | Q5_K_S | 5 | 47.46 GB| 49.96 GB | large, low quality loss - recommended |
| [med42-70b.Q5_K_M.gguf](https://huggingface.co/TheBloke/med42-70B-GGUF/blob/main/med42-70b.Q5_K_M.gguf) | Q5_K_M | 5 | 48.75 GB| 51.25 GB | large, very low quality loss - recommended |
| med42-70b.Q6_K.gguf | Q6_K | 6 | 56.59 GB| 59.09 GB | very large, extremely low quality loss |
| med42-70b.Q8_0.gguf | Q8_0 | 8 | 73.29 GB| 75.79 GB | very large, extremely low quality loss - not recommended |
**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
### Q6_K and Q8_0 files are split and require joining
**Note:** HF does not support uploading files larger than 50GB. Therefore I have uploaded the Q6_K and Q8_0 files as split files.
<details>
<summary>Click for instructions regarding Q6_K and Q8_0 files</summary>
### q6_K
Please download:
* `med42-70b.Q6_K.gguf-split-a`
* `med42-70b.Q6_K.gguf-split-b`
### q8_0
Please download:
* `med42-70b.Q8_0.gguf-split-a`
* `med42-70b.Q8_0.gguf-split-b`
To join the files, do the following:
Linux and macOS:
```
cat med42-70b.Q6_K.gguf-split-* > med42-70b.Q6_K.gguf && rm med42-70b.Q6_K.gguf-split-*
cat med42-70b.Q8_0.gguf-split-* > med42-70b.Q8_0.gguf && rm med42-70b.Q8_0.gguf-split-*
```
Windows command line:
```
COPY /B med42-70b.Q6_K.gguf-split-a + med42-70b.Q6_K.gguf-split-b med42-70b.Q6_K.gguf
del med42-70b.Q6_K.gguf-split-a med42-70b.Q6_K.gguf-split-b
COPY /B med42-70b.Q8_0.gguf-split-a + med42-70b.Q8_0.gguf-split-b med42-70b.Q8_0.gguf
del med42-70b.Q8_0.gguf-split-a med42-70b.Q8_0.gguf-split-b
```
</details>
<!-- README_GGUF.md-provided-files end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: TheBloke/med42-70B-GGUF and below it, a specific filename to download, such as: med42-70b.Q4_K_M.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download TheBloke/med42-70B-GGUF med42-70b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download TheBloke/med42-70B-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/med42-70B-GGUF med42-70b.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 32 -m med42-70b.Q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|system|>: You are a helpful medical assistant created by M42 Health in the UAE.\n<|prompter|>:{prompt}\n<|assistant|>:"
```
Change `-ngl 32` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 4096` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
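For example, the command above would then look something like this:

```shell
./main -ngl 32 -m med42-70b.Q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -i -ins
```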
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions here: [text-generation-webui/docs/llama.cpp.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/llama.cpp.md).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries.
### How to load this model in Python code, using ctransformers
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base ctransformers with no GPU acceleration
pip install ctransformers
# Or with CUDA GPU acceleration
pip install ctransformers[cuda]
# Or with AMD ROCm GPU acceleration (Linux only)
CT_HIPBLAS=1 pip install ctransformers --no-binary ctransformers
# Or with Metal GPU acceleration for macOS systems only
CT_METAL=1 pip install ctransformers --no-binary ctransformers
```
#### Simple ctransformers example code
```python
from ctransformers import AutoModelForCausalLM
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = AutoModelForCausalLM.from_pretrained("TheBloke/med42-70B-GGUF", model_file="med42-70b.Q4_K_M.gguf", model_type="llama", gpu_layers=50)
print(llm("AI is going to"))
```
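llama-cpp-python, mentioned above as an alternative, can be used in a similar way; a minimal sketch (install with `pip install llama-cpp-python`; hardware-accelerated builds are documented in that project) might look like:

```python
from llama_cpp import Llama

# Set n_gpu_layers to the number of layers to offload to GPU, or 0 for CPU-only inference.
llm = Llama(model_path="med42-70b.Q4_K_M.gguf", n_ctx=4096, n_gpu_layers=50)

prompt = (
    "<|system|>: You are a helpful medical assistant created by M42 Health in the UAE.\n"
    "<|prompter|>:What are the symptoms of diabetes?\n"
    "<|assistant|>:"
)
output = llm(prompt, max_tokens=512, temperature=0.7)
print(output["choices"][0]["text"])
```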
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
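As an illustrative sketch of the llama-cpp-python route with LangChain (import paths and the exact call differ between LangChain versions, so treat this as an assumption rather than the guide's exact code):

```python
from langchain.llms import LlamaCpp  # newer releases: from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="med42-70b.Q4_K_M.gguf",  # path to a downloaded GGUF file
    n_ctx=4096,
    n_gpu_layers=50,                     # set to 0 if no GPU acceleration is available
    temperature=0.7,
)
prompt = (
    "<|system|>: You are a helpful medical assistant created by M42 Health in the UAE.\n"
    "<|prompter|>:What are the symptoms of diabetes?\n"
    "<|assistant|>:"
)
print(llm.invoke(prompt))  # on older LangChain releases, call llm(prompt) instead
```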
<!-- README_GGUF.md-how-to-run end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Pierre Kircher, Stanislav Ovsiannikov, Michael Levine, Eugene Pentland, Andrey, 준교 김, Randy H, Fred von Graf, Artur Olbinski, Caitlyn Gatomon, terasurfer, Jeff Scroggin, James Bentley, Vadim, Gabriel Puliatti, Harry Royden McLaughlin, Sean Connelly, Dan Guido, Edmond Seymore, Alicia Loh, subjectnull, AzureBlack, Manuel Alberto Morcote, Thomas Belote, Lone Striker, Chris Smitley, Vitor Caleffi, Johann-Peter Hartmann, Clay Pascal, biorpg, Brandon Frisco, sidney chen, transmissions 11, Pedro Madruga, jinyuan sun, Ajan Kanaga, Emad Mostaque, Trenton Dambrowitz, Jonathan Leane, Iucharbius, usrbinkat, vamX, George Stoitzev, Luke Pendergrass, theTransient, Olakabola, Swaroop Kallakuri, Cap'n Zoog, Brandon Phillips, Michael Dempsey, Nikolai Manek, danny, Matthew Berman, Gabriel Tamborski, alfie_i, Raymond Fosdick, Tom X Nguyen, Raven Klaugh, LangChain4j, Magnesian, Illia Dulskyi, David Ziegler, Mano Prime, Luis Javier Navarrete Lozano, Erik Bjäreholt, 阿明, Nathan Dryer, Alex, Rainer Wilmers, zynix, TL, Joseph William Delisle, John Villwock, Nathan LeClaire, Willem Michiel, Joguhyik, GodLy, OG, Alps Aficionado, Jeffrey Morgan, ReadyPlayerEmma, Tiffany J. Kim, Sebastain Graf, Spencer Kim, Michael Davis, webtim, Talal Aujan, knownsqashed, John Detwiler, Imad Khwaja, Deo Leter, Jerry Meng, Elijah Stavena, Rooh Singh, Pieter, SuperWojo, Alexandros Triantafyllidis, Stephen Murray, Ai Maven, ya boyyy, Enrico Ros, Ken Nordquist, Deep Realms, Nicholas, Spiking Neurons AB, Elle, Will Dee, Jack West, RoA, Luke @flexchar, Viktor Bowallius, Derek Yates, Subspace Studios, jjj, Toran Billups, Asp the Wyvern, Fen Risland, Ilya, NimbleBox.ai, Chadd, Nitin Borwankar, Emre, Mandus, Leonard Tan, Kalila, K, Trailburnt, S_X, Cory Kujawski
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: M42 Health's Med42 70B
# **Med42 - Clinical Large Language Model**
Med42 is an open-access clinical large language model (LLM) developed by M42 to expand access to medical knowledge. Built off LLaMA-2 and comprising 70 billion parameters, this generative AI system provides high-quality answers to medical questions.
## Model Details
*Note: Use of this model is governed by the M42 Health license. In order to download the model weights (and tokenizer), please read the [Med42 License](https://huggingface.co/spaces/m42-health/License) and accept our License by requesting access here.*
Beginning with the base LLaMa-2 model, Med42 was instruction-tuned on a dataset of ~250M tokens compiled from different open-access sources, including medical flashcards, exam questions, and open-domain dialogues.
**Model Developers:** M42 Health AI Team
**Finetuned from model:** Llama-2 - 70B
**Context length:** 4k tokens
**Input:** Text only data
**Output:** Model generates text only
**Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
**License:** A custom license is available [here](https://huggingface.co/spaces/m42-health/License)
**Research Paper:** TBA
## Intended Use
Med42 is being made available for further testing and assessment as an AI assistant to enhance clinical decision-making and to broaden access to an LLM for healthcare use. Potential use cases include:
- Medical question answering
- Patient record summarization
- Aiding medical diagnosis
- General health Q&A
To get the expected features and performance from the model, specific formatting needs to be followed, including the `<|system|>`, `<|prompter|>` and `<|assistant|>` tags.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name_or_path = "m42-health/med42-70b"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
prompt = "What are the symptoms of diabetes ?"
prompt_template=f'''
<|system|>: You are a helpful medical assistant created by M42 Health in the UAE.
<|prompter|>:{prompt}
<|assistant|>:
'''
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, eos_token_id=tokenizer.eos_token_id, pad_token_id=tokenizer.pad_token_id, max_new_tokens=512)
print(tokenizer.decode(output[0]))
```
## Hardware and Software
The training process was performed on the Condor Galaxy 1 (CG-1) supercomputer platform.
## Evaluation Results
Med42 achieves competitive performance on various medical benchmarks, including MedQA, MedMCQA, PubMedQA, HeadQA, and Measuring Massive Multitask Language Understanding (MMLU) clinical topics. For all evaluations reported so far, we use [EleutherAI's evaluation harness library](https://github.com/EleutherAI/lm-evaluation-harness) and report zero-shot accuracies (unless otherwise stated). We compare the performance with that reported for other models (ClinicalCamel-70B, GPT-3.5, GPT-4.0, Med-PaLM 2).
|Dataset|Med42|ClinicalCamel-70B|GPT-3.5|GPT-4.0|Med-PaLM-2 (5-shot)*|
|---|---|---|---|---|---|
|MMLU Clinical Knowledge|74.3|69.8|69.8|86.0|88.3|
|MMLU College Biology|84.0|79.2|72.2|95.1|94.4|
|MMLU College Medicine|68.8|67.0|61.3|76.9|80.9|
|MMLU Medical Genetics|86.0|69.0|70.0|91.0|90.0|
|MMLU Professional Medicine|79.8|71.3|70.2|93.0|95.2|
|MMLU Anatomy|67.4|62.2|56.3|80.0|77.8|
|MedMCQA|60.9|47.0|50.1|69.5|71.3|
|MedQA|61.5|53.4|50.8|78.9|79.7|
|USMLE Self-Assessment|71.7|-|49.1|83.8|-|
|USMLE Sample Exam|72.0|54.3|56.9|84.3|-|
**We note that 0-shot performance is not reported for Med-PaLM 2. Further details can be found at [https://github.com/m42health/med42](https://github.com/m42health/med42)*.
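As a rough, hedged sketch of such a zero-shot run with the evaluation harness (the entry point, task names, and flags differ between harness versions, so treat everything here as an assumption):

```shell
pip install lm-eval
# Zero-shot evaluation sketch; task names (e.g. pubmedqa, medmcqa) and flags
# may differ depending on the harness version.
lm_eval --model hf \
    --model_args pretrained=m42-health/med42-70b,dtype=float16 \
    --tasks pubmedqa,medmcqa \
    --num_fewshot 0 \
    --batch_size 1
```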
### Key performance metrics:
- Med42 achieves a 72% accuracy on the US Medical Licensing Examination (USMLE) sample exam, surpassing the prior state of the art among openly available medical LLMs.
- 61.5% on MedQA dataset (compared to 50.8% for GPT-3.5)
- Consistently higher performance on MMLU clinical topics compared to GPT-3.5.
## Limitations & Safe Use
- Med42 is not ready for real clinical use. Extensive human evaluation is still underway, as it is required to ensure safety.
- Potential for generating incorrect or harmful information.
- Risk of perpetuating biases in training data.
Use this model responsibly! Do not rely on it for medical usage without rigorous safety testing.
## Accessing Med42 and Reporting Issues
Please report any software "bug" or other problems through one of the following means:
- Reporting issues with the model: [https://github.com/m42health/med42](https://github.com/m42health/med42)
- Reporting risky content generated by the model, bugs and/or any security concerns: [https://forms.office.com/r/YMJu3kcKat](https://forms.office.com/r/YMJu3kcKat)
- M42’s privacy policy available at [https://m42.ae/privacy-policy/](https://m42.ae/privacy-policy/)
- Reporting violations of the Acceptable Use Policy or unlicensed uses of Med42: <[email protected]>
<!-- original-model-card end -->
| [
"QUESTION_ANSWERING",
"SUMMARIZATION"
] | [
"MEDQA",
"PUBMEDQA"
] |
gpustack/jina-embeddings-v2-base-en-GGUF | gpustack | feature-extraction | [
"sentence-transformers",
"gguf",
"feature-extraction",
"sentence-similarity",
"mteb",
"en",
"dataset:allenai/c4",
"arxiv:2108.12409",
"arxiv:2310.19923",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"region:us"
] | 2024-11-01T01:35:36 | 2024-11-01T02:01:38 | 1,121 | 0 | ---
datasets:
- allenai/c4
language: en
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
inference: false
model-index:
- name: jina-embedding-b-en-v2
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 74.73134328358209
- type: ap
value: 37.765427081831035
- type: f1
value: 68.79367444339518
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 88.544275
- type: ap
value: 84.61328675662887
- type: f1
value: 88.51879035862375
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 45.263999999999996
- type: f1
value: 43.778759656699435
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.693
- type: map_at_10
value: 35.487
- type: map_at_100
value: 36.862
- type: map_at_1000
value: 36.872
- type: map_at_3
value: 30.049999999999997
- type: map_at_5
value: 32.966
- type: mrr_at_1
value: 21.977
- type: mrr_at_10
value: 35.565999999999995
- type: mrr_at_100
value: 36.948
- type: mrr_at_1000
value: 36.958
- type: mrr_at_3
value: 30.121
- type: mrr_at_5
value: 33.051
- type: ndcg_at_1
value: 21.693
- type: ndcg_at_10
value: 44.181
- type: ndcg_at_100
value: 49.982
- type: ndcg_at_1000
value: 50.233000000000004
- type: ndcg_at_3
value: 32.830999999999996
- type: ndcg_at_5
value: 38.080000000000005
- type: precision_at_1
value: 21.693
- type: precision_at_10
value: 7.248
- type: precision_at_100
value: 0.9769999999999999
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 13.632
- type: precision_at_5
value: 10.725
- type: recall_at_1
value: 21.693
- type: recall_at_10
value: 72.475
- type: recall_at_100
value: 97.653
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 40.896
- type: recall_at_5
value: 53.627
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.39242428696777
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.675626784714
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.247725694904034
- type: mrr
value: 74.91359978894604
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.68003802970496
- type: cos_sim_spearman
value: 81.23438110096286
- type: euclidean_pearson
value: 81.87462986142582
- type: euclidean_spearman
value: 81.23438110096286
- type: manhattan_pearson
value: 81.61162566600755
- type: manhattan_spearman
value: 81.11329400456184
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.01298701298701
- type: f1
value: 83.31690714969382
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.050108150972086
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 30.15731442819715
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.391999999999996
- type: map_at_10
value: 42.597
- type: map_at_100
value: 44.07
- type: map_at_1000
value: 44.198
- type: map_at_3
value: 38.957
- type: map_at_5
value: 40.961
- type: mrr_at_1
value: 37.196
- type: mrr_at_10
value: 48.152
- type: mrr_at_100
value: 48.928
- type: mrr_at_1000
value: 48.964999999999996
- type: mrr_at_3
value: 45.446
- type: mrr_at_5
value: 47.205999999999996
- type: ndcg_at_1
value: 37.196
- type: ndcg_at_10
value: 49.089
- type: ndcg_at_100
value: 54.471000000000004
- type: ndcg_at_1000
value: 56.385
- type: ndcg_at_3
value: 43.699
- type: ndcg_at_5
value: 46.22
- type: precision_at_1
value: 37.196
- type: precision_at_10
value: 9.313
- type: precision_at_100
value: 1.478
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 20.839
- type: precision_at_5
value: 14.936
- type: recall_at_1
value: 31.391999999999996
- type: recall_at_10
value: 61.876
- type: recall_at_100
value: 84.214
- type: recall_at_1000
value: 95.985
- type: recall_at_3
value: 46.6
- type: recall_at_5
value: 53.588
- type: map_at_1
value: 29.083
- type: map_at_10
value: 38.812999999999995
- type: map_at_100
value: 40.053
- type: map_at_1000
value: 40.188
- type: map_at_3
value: 36.111
- type: map_at_5
value: 37.519000000000005
- type: mrr_at_1
value: 36.497
- type: mrr_at_10
value: 44.85
- type: mrr_at_100
value: 45.546
- type: mrr_at_1000
value: 45.593
- type: mrr_at_3
value: 42.686
- type: mrr_at_5
value: 43.909
- type: ndcg_at_1
value: 36.497
- type: ndcg_at_10
value: 44.443
- type: ndcg_at_100
value: 48.979
- type: ndcg_at_1000
value: 51.154999999999994
- type: ndcg_at_3
value: 40.660000000000004
- type: ndcg_at_5
value: 42.193000000000005
- type: precision_at_1
value: 36.497
- type: precision_at_10
value: 8.433
- type: precision_at_100
value: 1.369
- type: precision_at_1000
value: 0.185
- type: precision_at_3
value: 19.894000000000002
- type: precision_at_5
value: 13.873
- type: recall_at_1
value: 29.083
- type: recall_at_10
value: 54.313
- type: recall_at_100
value: 73.792
- type: recall_at_1000
value: 87.629
- type: recall_at_3
value: 42.257
- type: recall_at_5
value: 47.066
- type: map_at_1
value: 38.556000000000004
- type: map_at_10
value: 50.698
- type: map_at_100
value: 51.705
- type: map_at_1000
value: 51.768
- type: map_at_3
value: 47.848
- type: map_at_5
value: 49.358000000000004
- type: mrr_at_1
value: 43.95
- type: mrr_at_10
value: 54.191
- type: mrr_at_100
value: 54.852999999999994
- type: mrr_at_1000
value: 54.885
- type: mrr_at_3
value: 51.954
- type: mrr_at_5
value: 53.13
- type: ndcg_at_1
value: 43.95
- type: ndcg_at_10
value: 56.516
- type: ndcg_at_100
value: 60.477000000000004
- type: ndcg_at_1000
value: 61.746
- type: ndcg_at_3
value: 51.601
- type: ndcg_at_5
value: 53.795
- type: precision_at_1
value: 43.95
- type: precision_at_10
value: 9.009
- type: precision_at_100
value: 1.189
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 22.989
- type: precision_at_5
value: 15.473
- type: recall_at_1
value: 38.556000000000004
- type: recall_at_10
value: 70.159
- type: recall_at_100
value: 87.132
- type: recall_at_1000
value: 96.16
- type: recall_at_3
value: 56.906
- type: recall_at_5
value: 62.332
- type: map_at_1
value: 24.238
- type: map_at_10
value: 32.5
- type: map_at_100
value: 33.637
- type: map_at_1000
value: 33.719
- type: map_at_3
value: 30.026999999999997
- type: map_at_5
value: 31.555
- type: mrr_at_1
value: 26.328000000000003
- type: mrr_at_10
value: 34.44
- type: mrr_at_100
value: 35.455999999999996
- type: mrr_at_1000
value: 35.521
- type: mrr_at_3
value: 32.034
- type: mrr_at_5
value: 33.565
- type: ndcg_at_1
value: 26.328000000000003
- type: ndcg_at_10
value: 37.202
- type: ndcg_at_100
value: 42.728
- type: ndcg_at_1000
value: 44.792
- type: ndcg_at_3
value: 32.368
- type: ndcg_at_5
value: 35.008
- type: precision_at_1
value: 26.328000000000003
- type: precision_at_10
value: 5.7059999999999995
- type: precision_at_100
value: 0.8880000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 13.672
- type: precision_at_5
value: 9.74
- type: recall_at_1
value: 24.238
- type: recall_at_10
value: 49.829
- type: recall_at_100
value: 75.21
- type: recall_at_1000
value: 90.521
- type: recall_at_3
value: 36.867
- type: recall_at_5
value: 43.241
- type: map_at_1
value: 15.378
- type: map_at_10
value: 22.817999999999998
- type: map_at_100
value: 23.977999999999998
- type: map_at_1000
value: 24.108
- type: map_at_3
value: 20.719
- type: map_at_5
value: 21.889
- type: mrr_at_1
value: 19.03
- type: mrr_at_10
value: 27.022000000000002
- type: mrr_at_100
value: 28.011999999999997
- type: mrr_at_1000
value: 28.096
- type: mrr_at_3
value: 24.855
- type: mrr_at_5
value: 26.029999999999998
- type: ndcg_at_1
value: 19.03
- type: ndcg_at_10
value: 27.526
- type: ndcg_at_100
value: 33.040000000000006
- type: ndcg_at_1000
value: 36.187000000000005
- type: ndcg_at_3
value: 23.497
- type: ndcg_at_5
value: 25.334
- type: precision_at_1
value: 19.03
- type: precision_at_10
value: 4.963
- type: precision_at_100
value: 0.893
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 11.360000000000001
- type: precision_at_5
value: 8.134
- type: recall_at_1
value: 15.378
- type: recall_at_10
value: 38.061
- type: recall_at_100
value: 61.754
- type: recall_at_1000
value: 84.259
- type: recall_at_3
value: 26.788
- type: recall_at_5
value: 31.326999999999998
- type: map_at_1
value: 27.511999999999997
- type: map_at_10
value: 37.429
- type: map_at_100
value: 38.818000000000005
- type: map_at_1000
value: 38.924
- type: map_at_3
value: 34.625
- type: map_at_5
value: 36.064
- type: mrr_at_1
value: 33.300999999999995
- type: mrr_at_10
value: 43.036
- type: mrr_at_100
value: 43.894
- type: mrr_at_1000
value: 43.936
- type: mrr_at_3
value: 40.825
- type: mrr_at_5
value: 42.028
- type: ndcg_at_1
value: 33.300999999999995
- type: ndcg_at_10
value: 43.229
- type: ndcg_at_100
value: 48.992000000000004
- type: ndcg_at_1000
value: 51.02100000000001
- type: ndcg_at_3
value: 38.794000000000004
- type: ndcg_at_5
value: 40.65
- type: precision_at_1
value: 33.300999999999995
- type: precision_at_10
value: 7.777000000000001
- type: precision_at_100
value: 1.269
- type: precision_at_1000
value: 0.163
- type: precision_at_3
value: 18.351
- type: precision_at_5
value: 12.762
- type: recall_at_1
value: 27.511999999999997
- type: recall_at_10
value: 54.788000000000004
- type: recall_at_100
value: 79.105
- type: recall_at_1000
value: 92.49199999999999
- type: recall_at_3
value: 41.924
- type: recall_at_5
value: 47.026
- type: map_at_1
value: 24.117
- type: map_at_10
value: 33.32
- type: map_at_100
value: 34.677
- type: map_at_1000
value: 34.78
- type: map_at_3
value: 30.233999999999998
- type: map_at_5
value: 31.668000000000003
- type: mrr_at_1
value: 29.566
- type: mrr_at_10
value: 38.244
- type: mrr_at_100
value: 39.245000000000005
- type: mrr_at_1000
value: 39.296
- type: mrr_at_3
value: 35.864000000000004
- type: mrr_at_5
value: 36.919999999999995
- type: ndcg_at_1
value: 29.566
- type: ndcg_at_10
value: 39.127
- type: ndcg_at_100
value: 44.989000000000004
- type: ndcg_at_1000
value: 47.189
- type: ndcg_at_3
value: 34.039
- type: ndcg_at_5
value: 35.744
- type: precision_at_1
value: 29.566
- type: precision_at_10
value: 7.385999999999999
- type: precision_at_100
value: 1.204
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 16.286
- type: precision_at_5
value: 11.484
- type: recall_at_1
value: 24.117
- type: recall_at_10
value: 51.559999999999995
- type: recall_at_100
value: 77.104
- type: recall_at_1000
value: 91.79899999999999
- type: recall_at_3
value: 36.82
- type: recall_at_5
value: 41.453
- type: map_at_1
value: 25.17625
- type: map_at_10
value: 34.063916666666664
- type: map_at_100
value: 35.255500000000005
- type: map_at_1000
value: 35.37275
- type: map_at_3
value: 31.351666666666667
- type: map_at_5
value: 32.80608333333333
- type: mrr_at_1
value: 29.59783333333333
- type: mrr_at_10
value: 38.0925
- type: mrr_at_100
value: 38.957249999999995
- type: mrr_at_1000
value: 39.01608333333333
- type: mrr_at_3
value: 35.77625
- type: mrr_at_5
value: 37.04991666666667
- type: ndcg_at_1
value: 29.59783333333333
- type: ndcg_at_10
value: 39.343666666666664
- type: ndcg_at_100
value: 44.488249999999994
- type: ndcg_at_1000
value: 46.83358333333334
- type: ndcg_at_3
value: 34.69708333333333
- type: ndcg_at_5
value: 36.75075
- type: precision_at_1
value: 29.59783333333333
- type: precision_at_10
value: 6.884083333333332
- type: precision_at_100
value: 1.114
- type: precision_at_1000
value: 0.15108333333333332
- type: precision_at_3
value: 15.965250000000003
- type: precision_at_5
value: 11.246500000000001
- type: recall_at_1
value: 25.17625
- type: recall_at_10
value: 51.015999999999984
- type: recall_at_100
value: 73.60174999999998
- type: recall_at_1000
value: 89.849
- type: recall_at_3
value: 37.88399999999999
- type: recall_at_5
value: 43.24541666666666
- type: map_at_1
value: 24.537
- type: map_at_10
value: 31.081999999999997
- type: map_at_100
value: 32.042
- type: map_at_1000
value: 32.141
- type: map_at_3
value: 29.137
- type: map_at_5
value: 30.079
- type: mrr_at_1
value: 27.454
- type: mrr_at_10
value: 33.694
- type: mrr_at_100
value: 34.579
- type: mrr_at_1000
value: 34.649
- type: mrr_at_3
value: 32.004
- type: mrr_at_5
value: 32.794000000000004
- type: ndcg_at_1
value: 27.454
- type: ndcg_at_10
value: 34.915
- type: ndcg_at_100
value: 39.641
- type: ndcg_at_1000
value: 42.105
- type: ndcg_at_3
value: 31.276
- type: ndcg_at_5
value: 32.65
- type: precision_at_1
value: 27.454
- type: precision_at_10
value: 5.337
- type: precision_at_100
value: 0.8250000000000001
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 13.241
- type: precision_at_5
value: 8.895999999999999
- type: recall_at_1
value: 24.537
- type: recall_at_10
value: 44.324999999999996
- type: recall_at_100
value: 65.949
- type: recall_at_1000
value: 84.017
- type: recall_at_3
value: 33.857
- type: recall_at_5
value: 37.316
- type: map_at_1
value: 17.122
- type: map_at_10
value: 24.32
- type: map_at_100
value: 25.338
- type: map_at_1000
value: 25.462
- type: map_at_3
value: 22.064
- type: map_at_5
value: 23.322000000000003
- type: mrr_at_1
value: 20.647
- type: mrr_at_10
value: 27.858
- type: mrr_at_100
value: 28.743999999999996
- type: mrr_at_1000
value: 28.819
- type: mrr_at_3
value: 25.769
- type: mrr_at_5
value: 26.964
- type: ndcg_at_1
value: 20.647
- type: ndcg_at_10
value: 28.849999999999998
- type: ndcg_at_100
value: 33.849000000000004
- type: ndcg_at_1000
value: 36.802
- type: ndcg_at_3
value: 24.799
- type: ndcg_at_5
value: 26.682
- type: precision_at_1
value: 20.647
- type: precision_at_10
value: 5.2170000000000005
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 11.769
- type: precision_at_5
value: 8.486
- type: recall_at_1
value: 17.122
- type: recall_at_10
value: 38.999
- type: recall_at_100
value: 61.467000000000006
- type: recall_at_1000
value: 82.716
- type: recall_at_3
value: 27.601
- type: recall_at_5
value: 32.471
- type: map_at_1
value: 24.396
- type: map_at_10
value: 33.415
- type: map_at_100
value: 34.521
- type: map_at_1000
value: 34.631
- type: map_at_3
value: 30.703999999999997
- type: map_at_5
value: 32.166
- type: mrr_at_1
value: 28.825
- type: mrr_at_10
value: 37.397000000000006
- type: mrr_at_100
value: 38.286
- type: mrr_at_1000
value: 38.346000000000004
- type: mrr_at_3
value: 35.028
- type: mrr_at_5
value: 36.32
- type: ndcg_at_1
value: 28.825
- type: ndcg_at_10
value: 38.656
- type: ndcg_at_100
value: 43.856
- type: ndcg_at_1000
value: 46.31
- type: ndcg_at_3
value: 33.793
- type: ndcg_at_5
value: 35.909
- type: precision_at_1
value: 28.825
- type: precision_at_10
value: 6.567
- type: precision_at_100
value: 1.0330000000000001
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 15.516
- type: precision_at_5
value: 10.914
- type: recall_at_1
value: 24.396
- type: recall_at_10
value: 50.747
- type: recall_at_100
value: 73.477
- type: recall_at_1000
value: 90.801
- type: recall_at_3
value: 37.1
- type: recall_at_5
value: 42.589
- type: map_at_1
value: 25.072
- type: map_at_10
value: 34.307
- type: map_at_100
value: 35.725
- type: map_at_1000
value: 35.943999999999996
- type: map_at_3
value: 30.906
- type: map_at_5
value: 32.818000000000005
- type: mrr_at_1
value: 29.644
- type: mrr_at_10
value: 38.673
- type: mrr_at_100
value: 39.459
- type: mrr_at_1000
value: 39.527
- type: mrr_at_3
value: 35.771
- type: mrr_at_5
value: 37.332
- type: ndcg_at_1
value: 29.644
- type: ndcg_at_10
value: 40.548
- type: ndcg_at_100
value: 45.678999999999995
- type: ndcg_at_1000
value: 48.488
- type: ndcg_at_3
value: 34.887
- type: ndcg_at_5
value: 37.543
- type: precision_at_1
value: 29.644
- type: precision_at_10
value: 7.688000000000001
- type: precision_at_100
value: 1.482
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 16.206
- type: precision_at_5
value: 12.016
- type: recall_at_1
value: 25.072
- type: recall_at_10
value: 53.478
- type: recall_at_100
value: 76.07300000000001
- type: recall_at_1000
value: 93.884
- type: recall_at_3
value: 37.583
- type: recall_at_5
value: 44.464
- type: map_at_1
value: 20.712
- type: map_at_10
value: 27.467999999999996
- type: map_at_100
value: 28.502
- type: map_at_1000
value: 28.610000000000003
- type: map_at_3
value: 24.887999999999998
- type: map_at_5
value: 26.273999999999997
- type: mrr_at_1
value: 22.736
- type: mrr_at_10
value: 29.553
- type: mrr_at_100
value: 30.485
- type: mrr_at_1000
value: 30.56
- type: mrr_at_3
value: 27.078999999999997
- type: mrr_at_5
value: 28.401
- type: ndcg_at_1
value: 22.736
- type: ndcg_at_10
value: 32.023
- type: ndcg_at_100
value: 37.158
- type: ndcg_at_1000
value: 39.823
- type: ndcg_at_3
value: 26.951999999999998
- type: ndcg_at_5
value: 29.281000000000002
- type: precision_at_1
value: 22.736
- type: precision_at_10
value: 5.213
- type: precision_at_100
value: 0.832
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 11.459999999999999
- type: precision_at_5
value: 8.244
- type: recall_at_1
value: 20.712
- type: recall_at_10
value: 44.057
- type: recall_at_100
value: 67.944
- type: recall_at_1000
value: 87.925
- type: recall_at_3
value: 30.305
- type: recall_at_5
value: 36.071999999999996
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.181999999999999
- type: map_at_10
value: 16.66
- type: map_at_100
value: 18.273
- type: map_at_1000
value: 18.45
- type: map_at_3
value: 14.141
- type: map_at_5
value: 15.455
- type: mrr_at_1
value: 22.15
- type: mrr_at_10
value: 32.062000000000005
- type: mrr_at_100
value: 33.116
- type: mrr_at_1000
value: 33.168
- type: mrr_at_3
value: 28.827
- type: mrr_at_5
value: 30.892999999999997
- type: ndcg_at_1
value: 22.15
- type: ndcg_at_10
value: 23.532
- type: ndcg_at_100
value: 30.358
- type: ndcg_at_1000
value: 33.783
- type: ndcg_at_3
value: 19.222
- type: ndcg_at_5
value: 20.919999999999998
- type: precision_at_1
value: 22.15
- type: precision_at_10
value: 7.185999999999999
- type: precision_at_100
value: 1.433
- type: precision_at_1000
value: 0.207
- type: precision_at_3
value: 13.941
- type: precision_at_5
value: 10.906
- type: recall_at_1
value: 10.181999999999999
- type: recall_at_10
value: 28.104000000000003
- type: recall_at_100
value: 51.998999999999995
- type: recall_at_1000
value: 71.311
- type: recall_at_3
value: 17.698
- type: recall_at_5
value: 22.262999999999998
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.669
- type: map_at_10
value: 15.552
- type: map_at_100
value: 21.865000000000002
- type: map_at_1000
value: 23.268
- type: map_at_3
value: 11.309
- type: map_at_5
value: 13.084000000000001
- type: mrr_at_1
value: 55.50000000000001
- type: mrr_at_10
value: 66.46600000000001
- type: mrr_at_100
value: 66.944
- type: mrr_at_1000
value: 66.956
- type: mrr_at_3
value: 64.542
- type: mrr_at_5
value: 65.717
- type: ndcg_at_1
value: 44.75
- type: ndcg_at_10
value: 35.049
- type: ndcg_at_100
value: 39.073
- type: ndcg_at_1000
value: 46.208
- type: ndcg_at_3
value: 39.525
- type: ndcg_at_5
value: 37.156
- type: precision_at_1
value: 55.50000000000001
- type: precision_at_10
value: 27.800000000000004
- type: precision_at_100
value: 9.013
- type: precision_at_1000
value: 1.8800000000000001
- type: precision_at_3
value: 42.667
- type: precision_at_5
value: 36.0
- type: recall_at_1
value: 6.669
- type: recall_at_10
value: 21.811
- type: recall_at_100
value: 45.112
- type: recall_at_1000
value: 67.806
- type: recall_at_3
value: 13.373
- type: recall_at_5
value: 16.615
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.769999999999996
- type: f1
value: 42.91448356376592
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 54.013
- type: map_at_10
value: 66.239
- type: map_at_100
value: 66.62599999999999
- type: map_at_1000
value: 66.644
- type: map_at_3
value: 63.965
- type: map_at_5
value: 65.45400000000001
- type: mrr_at_1
value: 58.221000000000004
- type: mrr_at_10
value: 70.43700000000001
- type: mrr_at_100
value: 70.744
- type: mrr_at_1000
value: 70.75099999999999
- type: mrr_at_3
value: 68.284
- type: mrr_at_5
value: 69.721
- type: ndcg_at_1
value: 58.221000000000004
- type: ndcg_at_10
value: 72.327
- type: ndcg_at_100
value: 73.953
- type: ndcg_at_1000
value: 74.312
- type: ndcg_at_3
value: 68.062
- type: ndcg_at_5
value: 70.56400000000001
- type: precision_at_1
value: 58.221000000000004
- type: precision_at_10
value: 9.521
- type: precision_at_100
value: 1.045
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 27.348
- type: precision_at_5
value: 17.794999999999998
- type: recall_at_1
value: 54.013
- type: recall_at_10
value: 86.957
- type: recall_at_100
value: 93.911
- type: recall_at_1000
value: 96.38
- type: recall_at_3
value: 75.555
- type: recall_at_5
value: 81.671
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.254
- type: map_at_10
value: 33.723
- type: map_at_100
value: 35.574
- type: map_at_1000
value: 35.730000000000004
- type: map_at_3
value: 29.473
- type: map_at_5
value: 31.543
- type: mrr_at_1
value: 41.358
- type: mrr_at_10
value: 49.498
- type: mrr_at_100
value: 50.275999999999996
- type: mrr_at_1000
value: 50.308
- type: mrr_at_3
value: 47.016000000000005
- type: mrr_at_5
value: 48.336
- type: ndcg_at_1
value: 41.358
- type: ndcg_at_10
value: 41.579
- type: ndcg_at_100
value: 48.455
- type: ndcg_at_1000
value: 51.165000000000006
- type: ndcg_at_3
value: 37.681
- type: ndcg_at_5
value: 38.49
- type: precision_at_1
value: 41.358
- type: precision_at_10
value: 11.543000000000001
- type: precision_at_100
value: 1.87
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 24.743000000000002
- type: precision_at_5
value: 17.994
- type: recall_at_1
value: 21.254
- type: recall_at_10
value: 48.698
- type: recall_at_100
value: 74.588
- type: recall_at_1000
value: 91.00200000000001
- type: recall_at_3
value: 33.939
- type: recall_at_5
value: 39.367000000000004
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 35.922
- type: map_at_10
value: 52.32599999999999
- type: map_at_100
value: 53.18000000000001
- type: map_at_1000
value: 53.245
- type: map_at_3
value: 49.294
- type: map_at_5
value: 51.202999999999996
- type: mrr_at_1
value: 71.843
- type: mrr_at_10
value: 78.24600000000001
- type: mrr_at_100
value: 78.515
- type: mrr_at_1000
value: 78.527
- type: mrr_at_3
value: 77.17500000000001
- type: mrr_at_5
value: 77.852
- type: ndcg_at_1
value: 71.843
- type: ndcg_at_10
value: 61.379
- type: ndcg_at_100
value: 64.535
- type: ndcg_at_1000
value: 65.888
- type: ndcg_at_3
value: 56.958
- type: ndcg_at_5
value: 59.434
- type: precision_at_1
value: 71.843
- type: precision_at_10
value: 12.686
- type: precision_at_100
value: 1.517
- type: precision_at_1000
value: 0.16999999999999998
- type: precision_at_3
value: 35.778
- type: precision_at_5
value: 23.422
- type: recall_at_1
value: 35.922
- type: recall_at_10
value: 63.43
- type: recall_at_100
value: 75.868
- type: recall_at_1000
value: 84.88900000000001
- type: recall_at_3
value: 53.666000000000004
- type: recall_at_5
value: 58.555
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 79.4408
- type: ap
value: 73.52820871620366
- type: f1
value: 79.36240238685001
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.826999999999998
- type: map_at_10
value: 34.04
- type: map_at_100
value: 35.226
- type: map_at_1000
value: 35.275
- type: map_at_3
value: 30.165999999999997
- type: map_at_5
value: 32.318000000000005
- type: mrr_at_1
value: 22.464000000000002
- type: mrr_at_10
value: 34.631
- type: mrr_at_100
value: 35.752
- type: mrr_at_1000
value: 35.795
- type: mrr_at_3
value: 30.798
- type: mrr_at_5
value: 32.946999999999996
- type: ndcg_at_1
value: 22.464000000000002
- type: ndcg_at_10
value: 40.919
- type: ndcg_at_100
value: 46.632
- type: ndcg_at_1000
value: 47.833
- type: ndcg_at_3
value: 32.992
- type: ndcg_at_5
value: 36.834
- type: precision_at_1
value: 22.464000000000002
- type: precision_at_10
value: 6.494
- type: precision_at_100
value: 0.9369999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.021
- type: precision_at_5
value: 10.347000000000001
- type: recall_at_1
value: 21.826999999999998
- type: recall_at_10
value: 62.132
- type: recall_at_100
value: 88.55199999999999
- type: recall_at_1000
value: 97.707
- type: recall_at_3
value: 40.541
- type: recall_at_5
value: 49.739
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.68399452804377
- type: f1
value: 95.25490609832268
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 83.15321477428182
- type: f1
value: 60.35476439087966
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.92669804976462
- type: f1
value: 69.22815107207565
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.4855413584398
- type: f1
value: 72.92107516103387
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.412679360205544
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.09211869875204
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.540919056982545
- type: mrr
value: 31.529904607063536
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.745
- type: map_at_10
value: 12.013
- type: map_at_100
value: 15.040000000000001
- type: map_at_1000
value: 16.427
- type: map_at_3
value: 8.841000000000001
- type: map_at_5
value: 10.289
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 53.483999999999995
- type: mrr_at_100
value: 54.20700000000001
- type: mrr_at_1000
value: 54.252
- type: mrr_at_3
value: 51.29
- type: mrr_at_5
value: 52.73
- type: ndcg_at_1
value: 43.808
- type: ndcg_at_10
value: 32.445
- type: ndcg_at_100
value: 30.031000000000002
- type: ndcg_at_1000
value: 39.007
- type: ndcg_at_3
value: 37.204
- type: ndcg_at_5
value: 35.07
- type: precision_at_1
value: 45.201
- type: precision_at_10
value: 23.684
- type: precision_at_100
value: 7.600999999999999
- type: precision_at_1000
value: 2.043
- type: precision_at_3
value: 33.953
- type: precision_at_5
value: 29.412
- type: recall_at_1
value: 5.745
- type: recall_at_10
value: 16.168
- type: recall_at_100
value: 30.875999999999998
- type: recall_at_1000
value: 62.686
- type: recall_at_3
value: 9.75
- type: recall_at_5
value: 12.413
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 37.828
- type: map_at_10
value: 53.239000000000004
- type: map_at_100
value: 54.035999999999994
- type: map_at_1000
value: 54.067
- type: map_at_3
value: 49.289
- type: map_at_5
value: 51.784
- type: mrr_at_1
value: 42.497
- type: mrr_at_10
value: 55.916999999999994
- type: mrr_at_100
value: 56.495
- type: mrr_at_1000
value: 56.516999999999996
- type: mrr_at_3
value: 52.800000000000004
- type: mrr_at_5
value: 54.722
- type: ndcg_at_1
value: 42.468
- type: ndcg_at_10
value: 60.437
- type: ndcg_at_100
value: 63.731
- type: ndcg_at_1000
value: 64.41799999999999
- type: ndcg_at_3
value: 53.230999999999995
- type: ndcg_at_5
value: 57.26
- type: precision_at_1
value: 42.468
- type: precision_at_10
value: 9.47
- type: precision_at_100
value: 1.1360000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.724999999999998
- type: precision_at_5
value: 16.593
- type: recall_at_1
value: 37.828
- type: recall_at_10
value: 79.538
- type: recall_at_100
value: 93.646
- type: recall_at_1000
value: 98.72999999999999
- type: recall_at_3
value: 61.134
- type: recall_at_5
value: 70.377
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.548
- type: map_at_10
value: 84.466
- type: map_at_100
value: 85.10600000000001
- type: map_at_1000
value: 85.123
- type: map_at_3
value: 81.57600000000001
- type: map_at_5
value: 83.399
- type: mrr_at_1
value: 81.24
- type: mrr_at_10
value: 87.457
- type: mrr_at_100
value: 87.574
- type: mrr_at_1000
value: 87.575
- type: mrr_at_3
value: 86.507
- type: mrr_at_5
value: 87.205
- type: ndcg_at_1
value: 81.25
- type: ndcg_at_10
value: 88.203
- type: ndcg_at_100
value: 89.457
- type: ndcg_at_1000
value: 89.563
- type: ndcg_at_3
value: 85.465
- type: ndcg_at_5
value: 87.007
- type: precision_at_1
value: 81.25
- type: precision_at_10
value: 13.373
- type: precision_at_100
value: 1.5270000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.417
- type: precision_at_5
value: 24.556
- type: recall_at_1
value: 70.548
- type: recall_at_10
value: 95.208
- type: recall_at_100
value: 99.514
- type: recall_at_1000
value: 99.988
- type: recall_at_3
value: 87.214
- type: recall_at_5
value: 91.696
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 53.04822095496839
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 60.30778476474675
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.692
- type: map_at_10
value: 11.766
- type: map_at_100
value: 13.904
- type: map_at_1000
value: 14.216999999999999
- type: map_at_3
value: 8.245
- type: map_at_5
value: 9.92
- type: mrr_at_1
value: 23.0
- type: mrr_at_10
value: 33.78
- type: mrr_at_100
value: 34.922
- type: mrr_at_1000
value: 34.973
- type: mrr_at_3
value: 30.2
- type: mrr_at_5
value: 32.565
- type: ndcg_at_1
value: 23.0
- type: ndcg_at_10
value: 19.863
- type: ndcg_at_100
value: 28.141
- type: ndcg_at_1000
value: 33.549
- type: ndcg_at_3
value: 18.434
- type: ndcg_at_5
value: 16.384
- type: precision_at_1
value: 23.0
- type: precision_at_10
value: 10.39
- type: precision_at_100
value: 2.235
- type: precision_at_1000
value: 0.35300000000000004
- type: precision_at_3
value: 17.133000000000003
- type: precision_at_5
value: 14.44
- type: recall_at_1
value: 4.692
- type: recall_at_10
value: 21.025
- type: recall_at_100
value: 45.324999999999996
- type: recall_at_1000
value: 71.675
- type: recall_at_3
value: 10.440000000000001
- type: recall_at_5
value: 14.64
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.96178184892842
- type: cos_sim_spearman
value: 79.6487740813199
- type: euclidean_pearson
value: 82.06661161625023
- type: euclidean_spearman
value: 79.64876769031183
- type: manhattan_pearson
value: 82.07061164575131
- type: manhattan_spearman
value: 79.65197039464537
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.15305604100027
- type: cos_sim_spearman
value: 74.27447427941591
- type: euclidean_pearson
value: 80.52737337565307
- type: euclidean_spearman
value: 74.27416077132192
- type: manhattan_pearson
value: 80.53728571140387
- type: manhattan_spearman
value: 74.28853605753457
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 83.44386080639279
- type: cos_sim_spearman
value: 84.17947648159536
- type: euclidean_pearson
value: 83.34145388129387
- type: euclidean_spearman
value: 84.17947648159536
- type: manhattan_pearson
value: 83.30699061927966
- type: manhattan_spearman
value: 84.18125737380451
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 81.57392220985612
- type: cos_sim_spearman
value: 78.80745014464101
- type: euclidean_pearson
value: 80.01660371487199
- type: euclidean_spearman
value: 78.80741240102256
- type: manhattan_pearson
value: 79.96810779507953
- type: manhattan_spearman
value: 78.75600400119448
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.85421063026625
- type: cos_sim_spearman
value: 87.55320285299192
- type: euclidean_pearson
value: 86.69750143323517
- type: euclidean_spearman
value: 87.55320284326378
- type: manhattan_pearson
value: 86.63379169960379
- type: manhattan_spearman
value: 87.4815029877984
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 84.31314130411842
- type: cos_sim_spearman
value: 85.3489588181433
- type: euclidean_pearson
value: 84.13240933463535
- type: euclidean_spearman
value: 85.34902871403281
- type: manhattan_pearson
value: 84.01183086503559
- type: manhattan_spearman
value: 85.19316703166102
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.09979781689536
- type: cos_sim_spearman
value: 88.87813323759015
- type: euclidean_pearson
value: 88.65413031123792
- type: euclidean_spearman
value: 88.87813323759015
- type: manhattan_pearson
value: 88.61818758256024
- type: manhattan_spearman
value: 88.81044100494604
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 62.30693258111531
- type: cos_sim_spearman
value: 62.195516523251946
- type: euclidean_pearson
value: 62.951283701049476
- type: euclidean_spearman
value: 62.195516523251946
- type: manhattan_pearson
value: 63.068322281439535
- type: manhattan_spearman
value: 62.10621171028406
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.27092833763909
- type: cos_sim_spearman
value: 84.84429717949759
- type: euclidean_pearson
value: 84.8516966060792
- type: euclidean_spearman
value: 84.84429717949759
- type: manhattan_pearson
value: 84.82203139242881
- type: manhattan_spearman
value: 84.8358503952945
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 83.10290863981409
- type: mrr
value: 95.31168450286097
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 52.161
- type: map_at_10
value: 62.138000000000005
- type: map_at_100
value: 62.769
- type: map_at_1000
value: 62.812
- type: map_at_3
value: 59.111000000000004
- type: map_at_5
value: 60.995999999999995
- type: mrr_at_1
value: 55.333
- type: mrr_at_10
value: 63.504000000000005
- type: mrr_at_100
value: 64.036
- type: mrr_at_1000
value: 64.08
- type: mrr_at_3
value: 61.278
- type: mrr_at_5
value: 62.778
- type: ndcg_at_1
value: 55.333
- type: ndcg_at_10
value: 66.678
- type: ndcg_at_100
value: 69.415
- type: ndcg_at_1000
value: 70.453
- type: ndcg_at_3
value: 61.755
- type: ndcg_at_5
value: 64.546
- type: precision_at_1
value: 55.333
- type: precision_at_10
value: 9.033
- type: precision_at_100
value: 1.043
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 24.221999999999998
- type: precision_at_5
value: 16.333000000000002
- type: recall_at_1
value: 52.161
- type: recall_at_10
value: 79.156
- type: recall_at_100
value: 91.333
- type: recall_at_1000
value: 99.333
- type: recall_at_3
value: 66.43299999999999
- type: recall_at_5
value: 73.272
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81287128712871
- type: cos_sim_ap
value: 95.30034785910676
- type: cos_sim_f1
value: 90.28629856850716
- type: cos_sim_precision
value: 92.36401673640168
- type: cos_sim_recall
value: 88.3
- type: dot_accuracy
value: 99.81287128712871
- type: dot_ap
value: 95.30034785910676
- type: dot_f1
value: 90.28629856850716
- type: dot_precision
value: 92.36401673640168
- type: dot_recall
value: 88.3
- type: euclidean_accuracy
value: 99.81287128712871
- type: euclidean_ap
value: 95.30034785910676
- type: euclidean_f1
value: 90.28629856850716
- type: euclidean_precision
value: 92.36401673640168
- type: euclidean_recall
value: 88.3
- type: manhattan_accuracy
value: 99.80990099009901
- type: manhattan_ap
value: 95.26880751950654
- type: manhattan_f1
value: 90.22177419354838
- type: manhattan_precision
value: 90.95528455284553
- type: manhattan_recall
value: 89.5
- type: max_accuracy
value: 99.81287128712871
- type: max_ap
value: 95.30034785910676
- type: max_f1
value: 90.28629856850716
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 58.518662504351184
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.96168178378587
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.04862593471896
- type: mrr
value: 52.97238402936932
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.092545236479946
- type: cos_sim_spearman
value: 31.599851000175498
- type: dot_pearson
value: 30.092542723901676
- type: dot_spearman
value: 31.599851000175498
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.189
- type: map_at_10
value: 1.662
- type: map_at_100
value: 9.384
- type: map_at_1000
value: 22.669
- type: map_at_3
value: 0.5559999999999999
- type: map_at_5
value: 0.9039999999999999
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 81.01899999999999
- type: mrr_at_100
value: 81.01899999999999
- type: mrr_at_1000
value: 81.01899999999999
- type: mrr_at_3
value: 79.333
- type: mrr_at_5
value: 80.733
- type: ndcg_at_1
value: 63.0
- type: ndcg_at_10
value: 65.913
- type: ndcg_at_100
value: 51.895
- type: ndcg_at_1000
value: 46.967
- type: ndcg_at_3
value: 65.49199999999999
- type: ndcg_at_5
value: 66.69699999999999
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 71.6
- type: precision_at_100
value: 53.66
- type: precision_at_1000
value: 21.124000000000002
- type: precision_at_3
value: 72.667
- type: precision_at_5
value: 74.0
- type: recall_at_1
value: 0.189
- type: recall_at_10
value: 1.913
- type: recall_at_100
value: 12.601999999999999
- type: recall_at_1000
value: 44.296
- type: recall_at_3
value: 0.605
- type: recall_at_5
value: 1.018
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.701
- type: map_at_10
value: 10.445
- type: map_at_100
value: 17.324
- type: map_at_1000
value: 19.161
- type: map_at_3
value: 5.497
- type: map_at_5
value: 7.278
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 45.534
- type: mrr_at_100
value: 45.792
- type: mrr_at_1000
value: 45.806999999999995
- type: mrr_at_3
value: 37.755
- type: mrr_at_5
value: 43.469
- type: ndcg_at_1
value: 26.531
- type: ndcg_at_10
value: 26.235000000000003
- type: ndcg_at_100
value: 39.17
- type: ndcg_at_1000
value: 51.038
- type: ndcg_at_3
value: 23.625
- type: ndcg_at_5
value: 24.338
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 24.285999999999998
- type: precision_at_100
value: 8.224
- type: precision_at_1000
value: 1.6179999999999999
- type: precision_at_3
value: 24.490000000000002
- type: precision_at_5
value: 24.898
- type: recall_at_1
value: 2.701
- type: recall_at_10
value: 17.997
- type: recall_at_100
value: 51.766999999999996
- type: recall_at_1000
value: 87.863
- type: recall_at_3
value: 6.295000000000001
- type: recall_at_5
value: 9.993
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 73.3474
- type: ap
value: 15.393431414459924
- type: f1
value: 56.466681887882416
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.062818336163
- type: f1
value: 62.11230840463252
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 42.464892820845115
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.15962329379508
- type: cos_sim_ap
value: 74.73674057919256
- type: cos_sim_f1
value: 68.81245642574947
- type: cos_sim_precision
value: 61.48255813953488
- type: cos_sim_recall
value: 78.12664907651715
- type: dot_accuracy
value: 86.15962329379508
- type: dot_ap
value: 74.7367634988281
- type: dot_f1
value: 68.81245642574947
- type: dot_precision
value: 61.48255813953488
- type: dot_recall
value: 78.12664907651715
- type: euclidean_accuracy
value: 86.15962329379508
- type: euclidean_ap
value: 74.7367761466634
- type: euclidean_f1
value: 68.81245642574947
- type: euclidean_precision
value: 61.48255813953488
- type: euclidean_recall
value: 78.12664907651715
- type: manhattan_accuracy
value: 86.21326816474935
- type: manhattan_ap
value: 74.64416473733951
- type: manhattan_f1
value: 68.80924855491331
- type: manhattan_precision
value: 61.23456790123457
- type: manhattan_recall
value: 78.52242744063325
- type: max_accuracy
value: 86.21326816474935
- type: max_ap
value: 74.7367761466634
- type: max_f1
value: 68.81245642574947
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.97620988085536
- type: cos_sim_ap
value: 86.08680845745758
- type: cos_sim_f1
value: 78.02793637114438
- type: cos_sim_precision
value: 73.11082699683736
- type: cos_sim_recall
value: 83.65414228518632
- type: dot_accuracy
value: 88.97620988085536
- type: dot_ap
value: 86.08681149437946
- type: dot_f1
value: 78.02793637114438
- type: dot_precision
value: 73.11082699683736
- type: dot_recall
value: 83.65414228518632
- type: euclidean_accuracy
value: 88.97620988085536
- type: euclidean_ap
value: 86.08681215460771
- type: euclidean_f1
value: 78.02793637114438
- type: euclidean_precision
value: 73.11082699683736
- type: euclidean_recall
value: 83.65414228518632
- type: manhattan_accuracy
value: 88.88888888888889
- type: manhattan_ap
value: 86.02916327562438
- type: manhattan_f1
value: 78.02063045516843
- type: manhattan_precision
value: 73.38851947346994
- type: manhattan_recall
value: 83.2768709578072
- type: max_accuracy
value: 88.97620988085536
- type: max_ap
value: 86.08681215460771
- type: max_f1
value: 78.02793637114438
---
# jina-embeddings-v2-base-en-GGUF
**Model creator**: [jinaai](https://huggingface.co/jinaai)<br/>
**Original model**: [jina-embeddings-v2-base-en](https://huggingface.co/jinaai/jina-embeddings-v2-base-en)<br/>
**GGUF quantization**: based on llama.cpp release [61408e7f](https://github.com/ggerganov/llama.cpp/commit/61408e7fad082dc44a11c8a9f1398da4837aad44)
---
<!-- TODO: add evaluation results here -->
<br><br>
<p align="center">
<img src="https://aeiljuispo.cloudimg.io/v7/https://cdn-uploads.huggingface.co/production/uploads/603763514de52ff951d89793/AFoybzd5lpBQXEBrQHuTt.png?w=200&h=200&f=face" alt="Finetuner logo: Finetuner helps you to create experiments in order to improve embeddings on search tasks. It accompanies you to deliver the last mile of performance-tuning for neural search applications." width="150px">
</p>
<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a>.</b>
</p>
## Quick Start
The easiest way to start using `jina-embeddings-v2-base-en` is to use Jina AI's [Embedding API](https://jina.ai/embeddings/).
## Intended Usage & Model Info
`jina-embeddings-v2-base-en` is an English, monolingual **embedding model** supporting **8192 sequence length**.
It is based on a BERT architecture (JinaBERT) that supports the symmetric bidirectional variant of [ALiBi](https://arxiv.org/abs/2108.12409) to allow longer sequence lengths.
The backbone `jina-bert-v2-base-en` is pretrained on the C4 dataset.
The model is further trained on Jina AI's collection of more than 400 million sentence pairs and hard negatives.
These pairs were obtained from various domains and were carefully selected through a thorough cleaning process.
The embedding model was trained with a 512-token sequence length, but extrapolates to an 8k sequence length (or even longer) thanks to ALiBi.
This makes the model useful for a range of use cases where long documents must be processed, including long document retrieval, semantic textual similarity, text reranking, recommendation, RAG, and LLM-based generative search.
With a standard size of 137 million parameters, the model enables fast inference while delivering better performance than our small model. It is recommended to use a single GPU for inference.
Additionally, we provide the following embedding models:
- [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters.
- [`jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en): 137 million parameters **(you are here)**.
- [`jina-embeddings-v2-base-zh`](https://huggingface.co/jinaai/jina-embeddings-v2-base-zh): Chinese-English Bilingual embeddings.
- [`jina-embeddings-v2-base-de`](https://huggingface.co/jinaai/jina-embeddings-v2-base-de): German-English Bilingual embeddings.
- [`jina-embeddings-v2-base-es`](https://huggingface.co/jinaai/jina-embeddings-v2-base-es): Spanish-English Bilingual embeddings.
## Data & Parameters
Jina Embeddings V2 [technical report](https://arxiv.org/abs/2310.19923)
## Usage
**<details><summary>Please apply mean pooling when integrating the model.</summary>**
<p>
### Why mean pooling?
Mean pooling takes all token embeddings from the model output and averages them at the sentence/paragraph level.
It has proven to be the most effective way to produce high-quality sentence embeddings.
We offer an `encode` function to deal with this.
However, if you would like to do it without using the default `encode` function:
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
def mean_pooling(model_output, attention_mask):
    # Average all token embeddings, masking out padding tokens via the attention mask
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

sentences = ['How is the weather today?', 'What is the current weather like today?']

tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v2-small-en')
model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-small-en', trust_remote_code=True)

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    model_output = model(**encoded_input)

# Pool token embeddings into sentence embeddings and L2-normalize them
embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
```
</p>
</details>
You can use Jina Embedding models directly from the `transformers` package.
```python
!pip install transformers
from transformers import AutoModel
from numpy.linalg import norm
cos_sim = lambda a,b: (a @ b.T) / (norm(a)*norm(b))
model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-base-en', trust_remote_code=True) # trust_remote_code is needed to use the encode method
embeddings = model.encode(['How is the weather today?', 'What is the current weather like today?'])
print(cos_sim(embeddings[0], embeddings[1]))
```
If you only want to handle shorter sequences, such as 2k tokens, pass the `max_length` parameter to the `encode` function:
```python
embeddings = model.encode(
['Very long ... document'],
max_length=2048
)
```
As of its latest release (v2.3.0), `sentence-transformers` also supports Jina embeddings (please make sure that you are logged into Hugging Face as well):
```python
!pip install -U sentence-transformers
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim
model = SentenceTransformer(
"jinaai/jina-embeddings-v2-base-en", # switch to en/zh for English or Chinese
trust_remote_code=True
)
# control your input sequence length up to 8192
model.max_seq_length = 1024
embeddings = model.encode([
'How is the weather today?',
'What is the current weather like today?'
])
print(cos_sim(embeddings[0], embeddings[1]))
```
## Alternatives to Using the Transformers (or SentenceTransformers) Package
1. _Managed SaaS_: Get started with a free key on Jina AI's [Embedding API](https://jina.ai/embeddings/).
2. _Private and high-performance deployment_: Get started by picking from our suite of models and deploy them on [AWS Sagemaker](https://aws.amazon.com/marketplace/seller-profile?id=seller-stch2ludm6vgy).
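If you go with the managed Embedding API (option 1 above), the request is a plain HTTPS call. The sketch below assumes an OpenAI-style request/response shape and a `JINA_API_KEY` environment variable; please verify the endpoint and payload against the [Embedding API](https://jina.ai/embeddings/) documentation before relying on it.
```python
import os
import requests

# Assumed endpoint and payload shape -- check the Embedding API docs for the current contract
API_URL = "https://api.jina.ai/v1/embeddings"
API_KEY = os.environ["JINA_API_KEY"]  # personal key from https://jina.ai/embeddings/

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json={
        "model": "jina-embeddings-v2-base-en",
        "input": ["How is the weather today?", "What is the current weather like today?"],
    },
)
response.raise_for_status()
vectors = [item["embedding"] for item in response.json()["data"]]
print(len(vectors), len(vectors[0]))
```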
## Use Jina Embeddings for RAG
According to the latest blog post from [LlamaIndex](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83),
> In summary, to achieve the peak performance in both hit rate and MRR, the combination of OpenAI or JinaAI-Base embeddings with the CohereRerank/bge-reranker-large reranker stands out.
<img src="https://miro.medium.com/v2/resize:fit:4800/format:webp/1*ZP2RVejCZovF3FDCg-Bx3A.png" width="780px">
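To make the retrieval step concrete, here is a minimal sketch of how the `encode` method shown above can rank document chunks for a query; the toy corpus and cosine-similarity ranking are illustrative assumptions, and the chunking and prompt assembly around it are application-specific.
```python
import numpy as np
from transformers import AutoModel

model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-base-en', trust_remote_code=True)

# Toy corpus standing in for your chunked documents
docs = [
    "ALiBi lets the model extrapolate beyond its training sequence length.",
    "Mean pooling averages token embeddings into one sentence vector.",
    "The capital of France is Paris.",
]
doc_vecs = np.asarray(model.encode(docs, max_length=2048))
doc_vecs = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)

query = "How does the model handle long inputs?"
q_vec = np.asarray(model.encode([query]))[0]
q_vec = q_vec / np.linalg.norm(q_vec)

# Cosine-similarity ranking; the top-scoring chunks become the context passed to the LLM
scores = doc_vecs @ q_vec
for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {docs[idx]}")
```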
## Plans
1. Bilingual embedding models supporting more European & Asian languages, including Spanish, French, Italian and Japanese.
2. Multimodal embedding models enabling multimodal RAG applications.
3. High-performance rerankers.
## Troubleshooting
**Loading of Model Code failed**
If you forgot to pass the `trust_remote_code=True` flag when calling `AutoModel.from_pretrained` or initializing the model via the `SentenceTransformer` class, you will receive an error that the model weights could not be initialized.
This is caused by `transformers` falling back to creating a default BERT model instead of a Jina embedding model:
```bash
Some weights of the model checkpoint at jinaai/jina-embeddings-v2-base-en were not used when initializing BertModel: ['encoder.layer.2.mlp.layernorm.weight', 'encoder.layer.3.mlp.layernorm.weight', 'encoder.layer.10.mlp.wo.bias', 'encoder.layer.5.mlp.wo.bias', 'encoder.layer.2.mlp.layernorm.bias', 'encoder.layer.1.mlp.gated_layers.weight', 'encoder.layer.5.mlp.gated_layers.weight', 'encoder.layer.8.mlp.layernorm.bias', ...
```
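The fix is to pass the flag explicitly, as in the usage examples above:
```python
from transformers import AutoModel
from sentence_transformers import SentenceTransformer

# Both loaders need trust_remote_code=True so the custom JinaBERT code is used
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v2-base-en", trust_remote_code=True)
st_model = SentenceTransformer("jinaai/jina-embeddings-v2-base-en", trust_remote_code=True)
```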
**User is not logged into Hugging Face**
The model is only available under [gated access](https://huggingface.co/docs/hub/models-gated).
This means you need to be logged into Hugging Face to load it.
If you receive the following error, you need to provide an access token, either by using the `huggingface-cli` or by passing the token directly (see the sketch below):
```bash
OSError: jinaai/jina-embeddings-v2-base-en is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
```
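One way to provide the token from Python is sketched below; the `token` keyword is assumed here (older `transformers` versions use `use_auth_token` instead), and `HF_TOKEN` is just an example variable name.
```python
import os
from transformers import AutoModel

# Assumes you exported HF_TOKEN, or ran `huggingface-cli login` beforehand
model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en",
    trust_remote_code=True,
    token=os.environ.get("HF_TOKEN"),
)
```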
## Contact
Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
## Citation
If you find Jina Embeddings useful in your research, please cite the following paper:
```
@misc{günther2023jina,
title={Jina Embeddings 2: 8192-Token General-Purpose Text Embeddings for Long Documents},
author={Michael Günther and Jackmin Ong and Isabelle Mohr and Alaeddine Abdessalem and Tanguy Abel and Mohammad Kalim Akram and Susana Guzman and Georgios Mastrapas and Saba Sturua and Bo Wang and Maximilian Werk and Nan Wang and Han Xiao},
year={2023},
eprint={2310.19923},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
nomic-ai/nomic-embed-text-v1-unsupervised | nomic-ai | sentence-similarity | [
"sentence-transformers",
"pytorch",
"onnx",
"nomic_bert",
"feature-extraction",
"sentence-similarity",
"mteb",
"transformers",
"transformers.js",
"custom_code",
"en",
"arxiv:2402.01613",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"region:us"
] | 2024-01-15T21:33:42 | 2024-08-02T02:24:38 | 1,087 | 14 | ---
language:
- en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- feature-extraction
- sentence-similarity
- mteb
- transformers
- transformers.js
inference: false
model-index:
- name: epoch_0_model
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.98507462686568
- type: ap
value: 39.47222193126652
- type: f1
value: 70.5923611893019
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 87.540175
- type: ap
value: 83.16128207188409
- type: f1
value: 87.5231988227265
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.80799999999999
- type: f1
value: 46.2632547445265
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.583
- type: map_at_10
value: 46.17
- type: map_at_100
value: 47.115
- type: map_at_1000
value: 47.121
- type: map_at_3
value: 41.489
- type: map_at_5
value: 44.046
- type: mrr_at_1
value: 30.939
- type: mrr_at_10
value: 46.289
- type: mrr_at_100
value: 47.241
- type: mrr_at_1000
value: 47.247
- type: mrr_at_3
value: 41.596
- type: mrr_at_5
value: 44.149
- type: ndcg_at_1
value: 30.583
- type: ndcg_at_10
value: 54.812000000000005
- type: ndcg_at_100
value: 58.605
- type: ndcg_at_1000
value: 58.753
- type: ndcg_at_3
value: 45.095
- type: ndcg_at_5
value: 49.744
- type: precision_at_1
value: 30.583
- type: precision_at_10
value: 8.243
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.516
- type: precision_at_5
value: 13.385
- type: recall_at_1
value: 30.583
- type: recall_at_10
value: 82.432
- type: recall_at_100
value: 98.43499999999999
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 55.547999999999995
- type: recall_at_5
value: 66.927
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.17830107652425
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 35.90561364087807
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 59.57222651819297
- type: mrr
value: 73.19241085169062
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 89.55181686367382
- type: cos_sim_spearman
value: 87.18933606575987
- type: euclidean_pearson
value: 87.78077503434338
- type: euclidean_spearman
value: 87.18933606575987
- type: manhattan_pearson
value: 87.75124980168601
- type: manhattan_spearman
value: 86.79113422137638
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 81.09415584415585
- type: f1
value: 80.60088693212091
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 36.57061229905462
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 32.05342946608653
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 34.376
- type: map_at_10
value: 45.214
- type: map_at_100
value: 46.635
- type: map_at_1000
value: 46.755
- type: map_at_3
value: 42.198
- type: map_at_5
value: 43.723
- type: mrr_at_1
value: 41.774
- type: mrr_at_10
value: 51.07000000000001
- type: mrr_at_100
value: 51.785000000000004
- type: mrr_at_1000
value: 51.824999999999996
- type: mrr_at_3
value: 48.808
- type: mrr_at_5
value: 50.11
- type: ndcg_at_1
value: 41.774
- type: ndcg_at_10
value: 51.105999999999995
- type: ndcg_at_100
value: 56.358
- type: ndcg_at_1000
value: 58.205
- type: ndcg_at_3
value: 46.965
- type: ndcg_at_5
value: 48.599
- type: precision_at_1
value: 41.774
- type: precision_at_10
value: 9.514
- type: precision_at_100
value: 1.508
- type: precision_at_1000
value: 0.196
- type: precision_at_3
value: 22.175
- type: precision_at_5
value: 15.508
- type: recall_at_1
value: 34.376
- type: recall_at_10
value: 61.748000000000005
- type: recall_at_100
value: 84.025
- type: recall_at_1000
value: 95.5
- type: recall_at_3
value: 49.378
- type: recall_at_5
value: 54.276
- type: map_at_1
value: 32.394
- type: map_at_10
value: 42.707
- type: map_at_100
value: 43.893
- type: map_at_1000
value: 44.019000000000005
- type: map_at_3
value: 39.51
- type: map_at_5
value: 41.381
- type: mrr_at_1
value: 41.019
- type: mrr_at_10
value: 49.042
- type: mrr_at_100
value: 49.669000000000004
- type: mrr_at_1000
value: 49.712
- type: mrr_at_3
value: 46.921
- type: mrr_at_5
value: 48.192
- type: ndcg_at_1
value: 41.019
- type: ndcg_at_10
value: 48.46
- type: ndcg_at_100
value: 52.537
- type: ndcg_at_1000
value: 54.491
- type: ndcg_at_3
value: 44.232
- type: ndcg_at_5
value: 46.305
- type: precision_at_1
value: 41.019
- type: precision_at_10
value: 9.134
- type: precision_at_100
value: 1.422
- type: precision_at_1000
value: 0.188
- type: precision_at_3
value: 21.38
- type: precision_at_5
value: 15.096000000000002
- type: recall_at_1
value: 32.394
- type: recall_at_10
value: 58.11500000000001
- type: recall_at_100
value: 75.509
- type: recall_at_1000
value: 87.812
- type: recall_at_3
value: 45.476
- type: recall_at_5
value: 51.549
- type: map_at_1
value: 43.47
- type: map_at_10
value: 55.871
- type: map_at_100
value: 56.745000000000005
- type: map_at_1000
value: 56.794
- type: map_at_3
value: 52.439
- type: map_at_5
value: 54.412000000000006
- type: mrr_at_1
value: 49.592000000000006
- type: mrr_at_10
value: 59.34199999999999
- type: mrr_at_100
value: 59.857000000000006
- type: mrr_at_1000
value: 59.88
- type: mrr_at_3
value: 56.897
- type: mrr_at_5
value: 58.339
- type: ndcg_at_1
value: 49.592000000000006
- type: ndcg_at_10
value: 61.67
- type: ndcg_at_100
value: 65.11099999999999
- type: ndcg_at_1000
value: 66.065
- type: ndcg_at_3
value: 56.071000000000005
- type: ndcg_at_5
value: 58.84700000000001
- type: precision_at_1
value: 49.592000000000006
- type: precision_at_10
value: 9.774
- type: precision_at_100
value: 1.2449999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 24.66
- type: precision_at_5
value: 16.878
- type: recall_at_1
value: 43.47
- type: recall_at_10
value: 75.387
- type: recall_at_100
value: 90.253
- type: recall_at_1000
value: 97.00800000000001
- type: recall_at_3
value: 60.616
- type: recall_at_5
value: 67.31899999999999
- type: map_at_1
value: 26.633000000000003
- type: map_at_10
value: 35.497
- type: map_at_100
value: 36.504
- type: map_at_1000
value: 36.574
- type: map_at_3
value: 33.115
- type: map_at_5
value: 34.536
- type: mrr_at_1
value: 28.927000000000003
- type: mrr_at_10
value: 37.778
- type: mrr_at_100
value: 38.634
- type: mrr_at_1000
value: 38.690000000000005
- type: mrr_at_3
value: 35.518
- type: mrr_at_5
value: 36.908
- type: ndcg_at_1
value: 28.927000000000003
- type: ndcg_at_10
value: 40.327
- type: ndcg_at_100
value: 45.321
- type: ndcg_at_1000
value: 47.214
- type: ndcg_at_3
value: 35.762
- type: ndcg_at_5
value: 38.153999999999996
- type: precision_at_1
value: 28.927000000000003
- type: precision_at_10
value: 6.045
- type: precision_at_100
value: 0.901
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 15.140999999999998
- type: precision_at_5
value: 10.485999999999999
- type: recall_at_1
value: 26.633000000000003
- type: recall_at_10
value: 52.99
- type: recall_at_100
value: 76.086
- type: recall_at_1000
value: 90.46300000000001
- type: recall_at_3
value: 40.738
- type: recall_at_5
value: 46.449
- type: map_at_1
value: 17.521
- type: map_at_10
value: 25.130000000000003
- type: map_at_100
value: 26.176
- type: map_at_1000
value: 26.289
- type: map_at_3
value: 22.829
- type: map_at_5
value: 24.082
- type: mrr_at_1
value: 21.766
- type: mrr_at_10
value: 29.801
- type: mrr_at_100
value: 30.682
- type: mrr_at_1000
value: 30.75
- type: mrr_at_3
value: 27.633000000000003
- type: mrr_at_5
value: 28.858
- type: ndcg_at_1
value: 21.766
- type: ndcg_at_10
value: 30.026000000000003
- type: ndcg_at_100
value: 35.429
- type: ndcg_at_1000
value: 38.236
- type: ndcg_at_3
value: 25.968000000000004
- type: ndcg_at_5
value: 27.785
- type: precision_at_1
value: 21.766
- type: precision_at_10
value: 5.498
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 12.687000000000001
- type: precision_at_5
value: 9.005
- type: recall_at_1
value: 17.521
- type: recall_at_10
value: 40.454
- type: recall_at_100
value: 64.828
- type: recall_at_1000
value: 84.83800000000001
- type: recall_at_3
value: 28.758
- type: recall_at_5
value: 33.617000000000004
- type: map_at_1
value: 30.564999999999998
- type: map_at_10
value: 40.664
- type: map_at_100
value: 41.995
- type: map_at_1000
value: 42.104
- type: map_at_3
value: 37.578
- type: map_at_5
value: 39.247
- type: mrr_at_1
value: 37.44
- type: mrr_at_10
value: 46.533
- type: mrr_at_100
value: 47.363
- type: mrr_at_1000
value: 47.405
- type: mrr_at_3
value: 44.224999999999994
- type: mrr_at_5
value: 45.549
- type: ndcg_at_1
value: 37.44
- type: ndcg_at_10
value: 46.574
- type: ndcg_at_100
value: 52.024
- type: ndcg_at_1000
value: 53.93900000000001
- type: ndcg_at_3
value: 41.722
- type: ndcg_at_5
value: 43.973
- type: precision_at_1
value: 37.44
- type: precision_at_10
value: 8.344999999999999
- type: precision_at_100
value: 1.278
- type: precision_at_1000
value: 0.16
- type: precision_at_3
value: 19.442
- type: precision_at_5
value: 13.802
- type: recall_at_1
value: 30.564999999999998
- type: recall_at_10
value: 58.207
- type: recall_at_100
value: 81.137
- type: recall_at_1000
value: 93.506
- type: recall_at_3
value: 44.606
- type: recall_at_5
value: 50.373000000000005
- type: map_at_1
value: 27.892
- type: map_at_10
value: 37.251
- type: map_at_100
value: 38.606
- type: map_at_1000
value: 38.716
- type: map_at_3
value: 34.312
- type: map_at_5
value: 35.791000000000004
- type: mrr_at_1
value: 34.247
- type: mrr_at_10
value: 42.696
- type: mrr_at_100
value: 43.659
- type: mrr_at_1000
value: 43.711
- type: mrr_at_3
value: 40.563
- type: mrr_at_5
value: 41.625
- type: ndcg_at_1
value: 34.247
- type: ndcg_at_10
value: 42.709
- type: ndcg_at_100
value: 48.422
- type: ndcg_at_1000
value: 50.544
- type: ndcg_at_3
value: 38.105
- type: ndcg_at_5
value: 39.846
- type: precision_at_1
value: 34.247
- type: precision_at_10
value: 7.66
- type: precision_at_100
value: 1.2109999999999999
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 17.884
- type: precision_at_5
value: 12.489
- type: recall_at_1
value: 27.892
- type: recall_at_10
value: 53.559
- type: recall_at_100
value: 78.018
- type: recall_at_1000
value: 92.07300000000001
- type: recall_at_3
value: 40.154
- type: recall_at_5
value: 45.078
- type: map_at_1
value: 27.29375
- type: map_at_10
value: 36.19533333333334
- type: map_at_100
value: 37.33183333333334
- type: map_at_1000
value: 37.44616666666667
- type: map_at_3
value: 33.49125
- type: map_at_5
value: 34.94166666666667
- type: mrr_at_1
value: 32.336666666666666
- type: mrr_at_10
value: 40.45983333333333
- type: mrr_at_100
value: 41.26533333333334
- type: mrr_at_1000
value: 41.321583333333336
- type: mrr_at_3
value: 38.23416666666667
- type: mrr_at_5
value: 39.48491666666666
- type: ndcg_at_1
value: 32.336666666666666
- type: ndcg_at_10
value: 41.39958333333333
- type: ndcg_at_100
value: 46.293
- type: ndcg_at_1000
value: 48.53425
- type: ndcg_at_3
value: 36.88833333333333
- type: ndcg_at_5
value: 38.90733333333333
- type: precision_at_1
value: 32.336666666666666
- type: precision_at_10
value: 7.175916666666667
- type: precision_at_100
value: 1.1311666666666669
- type: precision_at_1000
value: 0.15141666666666667
- type: precision_at_3
value: 16.841166666666666
- type: precision_at_5
value: 11.796583333333334
- type: recall_at_1
value: 27.29375
- type: recall_at_10
value: 52.514583333333334
- type: recall_at_100
value: 74.128
- type: recall_at_1000
value: 89.64125
- type: recall_at_3
value: 39.83258333333333
- type: recall_at_5
value: 45.126416666666664
- type: map_at_1
value: 24.62
- type: map_at_10
value: 31.517
- type: map_at_100
value: 32.322
- type: map_at_1000
value: 32.422000000000004
- type: map_at_3
value: 29.293999999999997
- type: map_at_5
value: 30.403999999999996
- type: mrr_at_1
value: 27.607
- type: mrr_at_10
value: 34.294999999999995
- type: mrr_at_100
value: 35.045
- type: mrr_at_1000
value: 35.114000000000004
- type: mrr_at_3
value: 32.311
- type: mrr_at_5
value: 33.369
- type: ndcg_at_1
value: 27.607
- type: ndcg_at_10
value: 35.853
- type: ndcg_at_100
value: 39.919
- type: ndcg_at_1000
value: 42.452
- type: ndcg_at_3
value: 31.702
- type: ndcg_at_5
value: 33.47
- type: precision_at_1
value: 27.607
- type: precision_at_10
value: 5.598
- type: precision_at_100
value: 0.83
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 13.700999999999999
- type: precision_at_5
value: 9.325
- type: recall_at_1
value: 24.62
- type: recall_at_10
value: 46.475
- type: recall_at_100
value: 64.891
- type: recall_at_1000
value: 83.524
- type: recall_at_3
value: 34.954
- type: recall_at_5
value: 39.471000000000004
- type: map_at_1
value: 16.858999999999998
- type: map_at_10
value: 23.746000000000002
- type: map_at_100
value: 24.731
- type: map_at_1000
value: 24.86
- type: map_at_3
value: 21.603
- type: map_at_5
value: 22.811999999999998
- type: mrr_at_1
value: 20.578
- type: mrr_at_10
value: 27.618
- type: mrr_at_100
value: 28.459
- type: mrr_at_1000
value: 28.543000000000003
- type: mrr_at_3
value: 25.533
- type: mrr_at_5
value: 26.730999999999998
- type: ndcg_at_1
value: 20.578
- type: ndcg_at_10
value: 28.147
- type: ndcg_at_100
value: 32.946999999999996
- type: ndcg_at_1000
value: 36.048
- type: ndcg_at_3
value: 24.32
- type: ndcg_at_5
value: 26.131999999999998
- type: precision_at_1
value: 20.578
- type: precision_at_10
value: 5.061999999999999
- type: precision_at_100
value: 0.8789999999999999
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 11.448
- type: precision_at_5
value: 8.251999999999999
- type: recall_at_1
value: 16.858999999999998
- type: recall_at_10
value: 37.565
- type: recall_at_100
value: 59.239
- type: recall_at_1000
value: 81.496
- type: recall_at_3
value: 26.865
- type: recall_at_5
value: 31.581
- type: map_at_1
value: 26.11
- type: map_at_10
value: 34.214
- type: map_at_100
value: 35.291
- type: map_at_1000
value: 35.400999999999996
- type: map_at_3
value: 31.541000000000004
- type: map_at_5
value: 33.21
- type: mrr_at_1
value: 30.97
- type: mrr_at_10
value: 38.522
- type: mrr_at_100
value: 39.37
- type: mrr_at_1000
value: 39.437
- type: mrr_at_3
value: 36.193999999999996
- type: mrr_at_5
value: 37.691
- type: ndcg_at_1
value: 30.97
- type: ndcg_at_10
value: 39.2
- type: ndcg_at_100
value: 44.267
- type: ndcg_at_1000
value: 46.760000000000005
- type: ndcg_at_3
value: 34.474
- type: ndcg_at_5
value: 37.016
- type: precision_at_1
value: 30.97
- type: precision_at_10
value: 6.521000000000001
- type: precision_at_100
value: 1.011
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 15.392
- type: precision_at_5
value: 11.026
- type: recall_at_1
value: 26.11
- type: recall_at_10
value: 50.14999999999999
- type: recall_at_100
value: 72.398
- type: recall_at_1000
value: 89.764
- type: recall_at_3
value: 37.352999999999994
- type: recall_at_5
value: 43.736000000000004
- type: map_at_1
value: 25.514
- type: map_at_10
value: 34.278999999999996
- type: map_at_100
value: 35.847
- type: map_at_1000
value: 36.086
- type: map_at_3
value: 31.563999999999997
- type: map_at_5
value: 32.903999999999996
- type: mrr_at_1
value: 30.830000000000002
- type: mrr_at_10
value: 38.719
- type: mrr_at_100
value: 39.678999999999995
- type: mrr_at_1000
value: 39.741
- type: mrr_at_3
value: 36.265
- type: mrr_at_5
value: 37.599
- type: ndcg_at_1
value: 30.830000000000002
- type: ndcg_at_10
value: 39.997
- type: ndcg_at_100
value: 45.537
- type: ndcg_at_1000
value: 48.296
- type: ndcg_at_3
value: 35.429
- type: ndcg_at_5
value: 37.3
- type: precision_at_1
value: 30.830000000000002
- type: precision_at_10
value: 7.747
- type: precision_at_100
value: 1.516
- type: precision_at_1000
value: 0.24
- type: precision_at_3
value: 16.601
- type: precision_at_5
value: 11.818
- type: recall_at_1
value: 25.514
- type: recall_at_10
value: 50.71600000000001
- type: recall_at_100
value: 75.40299999999999
- type: recall_at_1000
value: 93.10300000000001
- type: recall_at_3
value: 37.466
- type: recall_at_5
value: 42.677
- type: map_at_1
value: 21.571
- type: map_at_10
value: 28.254
- type: map_at_100
value: 29.237000000000002
- type: map_at_1000
value: 29.334
- type: map_at_3
value: 25.912000000000003
- type: map_at_5
value: 26.798
- type: mrr_at_1
value: 23.29
- type: mrr_at_10
value: 30.102
- type: mrr_at_100
value: 30.982
- type: mrr_at_1000
value: 31.051000000000002
- type: mrr_at_3
value: 27.942
- type: mrr_at_5
value: 28.848000000000003
- type: ndcg_at_1
value: 23.29
- type: ndcg_at_10
value: 32.726
- type: ndcg_at_100
value: 37.644
- type: ndcg_at_1000
value: 40.161
- type: ndcg_at_3
value: 27.91
- type: ndcg_at_5
value: 29.461
- type: precision_at_1
value: 23.29
- type: precision_at_10
value: 5.213
- type: precision_at_100
value: 0.828
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 11.583
- type: precision_at_5
value: 7.8740000000000006
- type: recall_at_1
value: 21.571
- type: recall_at_10
value: 44.809
- type: recall_at_100
value: 67.74900000000001
- type: recall_at_1000
value: 86.60799999999999
- type: recall_at_3
value: 31.627
- type: recall_at_5
value: 35.391
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.953
- type: map_at_10
value: 17.183
- type: map_at_100
value: 18.926000000000002
- type: map_at_1000
value: 19.105
- type: map_at_3
value: 14.308000000000002
- type: map_at_5
value: 15.738
- type: mrr_at_1
value: 22.02
- type: mrr_at_10
value: 33.181
- type: mrr_at_100
value: 34.357
- type: mrr_at_1000
value: 34.398
- type: mrr_at_3
value: 29.793999999999997
- type: mrr_at_5
value: 31.817
- type: ndcg_at_1
value: 22.02
- type: ndcg_at_10
value: 24.712
- type: ndcg_at_100
value: 32.025
- type: ndcg_at_1000
value: 35.437000000000005
- type: ndcg_at_3
value: 19.852
- type: ndcg_at_5
value: 21.565
- type: precision_at_1
value: 22.02
- type: precision_at_10
value: 7.779
- type: precision_at_100
value: 1.554
- type: precision_at_1000
value: 0.219
- type: precision_at_3
value: 14.832
- type: precision_at_5
value: 11.453000000000001
- type: recall_at_1
value: 9.953
- type: recall_at_10
value: 30.375000000000004
- type: recall_at_100
value: 55.737
- type: recall_at_1000
value: 75.071
- type: recall_at_3
value: 18.529999999999998
- type: recall_at_5
value: 23.313
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.651
- type: map_at_10
value: 19.674
- type: map_at_100
value: 27.855999999999998
- type: map_at_1000
value: 29.348000000000003
- type: map_at_3
value: 14.247000000000002
- type: map_at_5
value: 16.453
- type: mrr_at_1
value: 61.75000000000001
- type: mrr_at_10
value: 71.329
- type: mrr_at_100
value: 71.69200000000001
- type: mrr_at_1000
value: 71.699
- type: mrr_at_3
value: 69.042
- type: mrr_at_5
value: 70.679
- type: ndcg_at_1
value: 50.125
- type: ndcg_at_10
value: 40.199
- type: ndcg_at_100
value: 45.378
- type: ndcg_at_1000
value: 52.376999999999995
- type: ndcg_at_3
value: 44.342
- type: ndcg_at_5
value: 41.730000000000004
- type: precision_at_1
value: 61.75000000000001
- type: precision_at_10
value: 32.2
- type: precision_at_100
value: 10.298
- type: precision_at_1000
value: 1.984
- type: precision_at_3
value: 48.667
- type: precision_at_5
value: 40.5
- type: recall_at_1
value: 8.651
- type: recall_at_10
value: 25.607000000000003
- type: recall_at_100
value: 53.062
- type: recall_at_1000
value: 74.717
- type: recall_at_3
value: 15.661
- type: recall_at_5
value: 19.409000000000002
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 47.64500000000001
- type: f1
value: 43.71011316507787
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 54.613
- type: map_at_10
value: 68.02
- type: map_at_100
value: 68.366
- type: map_at_1000
value: 68.379
- type: map_at_3
value: 65.753
- type: map_at_5
value: 67.242
- type: mrr_at_1
value: 59.001000000000005
- type: mrr_at_10
value: 72.318
- type: mrr_at_100
value: 72.558
- type: mrr_at_1000
value: 72.56099999999999
- type: mrr_at_3
value: 70.22699999999999
- type: mrr_at_5
value: 71.655
- type: ndcg_at_1
value: 59.001000000000005
- type: ndcg_at_10
value: 74.386
- type: ndcg_at_100
value: 75.763
- type: ndcg_at_1000
value: 76.03
- type: ndcg_at_3
value: 70.216
- type: ndcg_at_5
value: 72.697
- type: precision_at_1
value: 59.001000000000005
- type: precision_at_10
value: 9.844
- type: precision_at_100
value: 1.068
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 28.523
- type: precision_at_5
value: 18.491
- type: recall_at_1
value: 54.613
- type: recall_at_10
value: 89.669
- type: recall_at_100
value: 95.387
- type: recall_at_1000
value: 97.129
- type: recall_at_3
value: 78.54100000000001
- type: recall_at_5
value: 84.637
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.348
- type: map_at_10
value: 32.464999999999996
- type: map_at_100
value: 34.235
- type: map_at_1000
value: 34.410000000000004
- type: map_at_3
value: 28.109
- type: map_at_5
value: 30.634
- type: mrr_at_1
value: 38.889
- type: mrr_at_10
value: 47.131
- type: mrr_at_100
value: 48.107
- type: mrr_at_1000
value: 48.138
- type: mrr_at_3
value: 44.599
- type: mrr_at_5
value: 46.181
- type: ndcg_at_1
value: 38.889
- type: ndcg_at_10
value: 39.86
- type: ndcg_at_100
value: 46.619
- type: ndcg_at_1000
value: 49.525999999999996
- type: ndcg_at_3
value: 35.768
- type: ndcg_at_5
value: 37.4
- type: precision_at_1
value: 38.889
- type: precision_at_10
value: 11.003
- type: precision_at_100
value: 1.796
- type: precision_at_1000
value: 0.233
- type: precision_at_3
value: 23.714
- type: precision_at_5
value: 17.901
- type: recall_at_1
value: 20.348
- type: recall_at_10
value: 46.781
- type: recall_at_100
value: 71.937
- type: recall_at_1000
value: 89.18599999999999
- type: recall_at_3
value: 32.16
- type: recall_at_5
value: 38.81
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 37.198
- type: map_at_10
value: 54.065
- type: map_at_100
value: 54.984
- type: map_at_1000
value: 55.05
- type: map_at_3
value: 50.758
- type: map_at_5
value: 52.758
- type: mrr_at_1
value: 74.396
- type: mrr_at_10
value: 81.352
- type: mrr_at_100
value: 81.562
- type: mrr_at_1000
value: 81.57
- type: mrr_at_3
value: 80.30199999999999
- type: mrr_at_5
value: 80.963
- type: ndcg_at_1
value: 74.396
- type: ndcg_at_10
value: 63.70099999999999
- type: ndcg_at_100
value: 66.874
- type: ndcg_at_1000
value: 68.171
- type: ndcg_at_3
value: 58.916999999999994
- type: ndcg_at_5
value: 61.495999999999995
- type: precision_at_1
value: 74.396
- type: precision_at_10
value: 13.228000000000002
- type: precision_at_100
value: 1.569
- type: precision_at_1000
value: 0.174
- type: precision_at_3
value: 37.007
- type: precision_at_5
value: 24.248
- type: recall_at_1
value: 37.198
- type: recall_at_10
value: 66.13799999999999
- type: recall_at_100
value: 78.45400000000001
- type: recall_at_1000
value: 87.04899999999999
- type: recall_at_3
value: 55.510000000000005
- type: recall_at_5
value: 60.621
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 86.32240000000002
- type: ap
value: 81.37708984744188
- type: f1
value: 86.29645005523952
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 16.402
- type: map_at_10
value: 28.097
- type: map_at_100
value: 29.421999999999997
- type: map_at_1000
value: 29.476999999999997
- type: map_at_3
value: 24.015
- type: map_at_5
value: 26.316
- type: mrr_at_1
value: 16.905
- type: mrr_at_10
value: 28.573999999999998
- type: mrr_at_100
value: 29.862
- type: mrr_at_1000
value: 29.912
- type: mrr_at_3
value: 24.589
- type: mrr_at_5
value: 26.851000000000003
- type: ndcg_at_1
value: 16.905
- type: ndcg_at_10
value: 34.99
- type: ndcg_at_100
value: 41.419
- type: ndcg_at_1000
value: 42.815999999999995
- type: ndcg_at_3
value: 26.695
- type: ndcg_at_5
value: 30.789
- type: precision_at_1
value: 16.905
- type: precision_at_10
value: 5.891
- type: precision_at_100
value: 0.91
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 11.724
- type: precision_at_5
value: 9.097
- type: recall_at_1
value: 16.402
- type: recall_at_10
value: 56.462999999999994
- type: recall_at_100
value: 86.246
- type: recall_at_1000
value: 96.926
- type: recall_at_3
value: 33.897
- type: recall_at_5
value: 43.718
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 92.35978112175103
- type: f1
value: 92.04704651024416
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 65.20063839489283
- type: f1
value: 45.34047546059121
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.74714189643578
- type: f1
value: 65.36156843270334
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.03160726294554
- type: f1
value: 73.42899064973165
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.347360980344476
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 29.56022733162805
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.60132765358296
- type: mrr
value: 31.710892632824468
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.827999999999999
- type: map_at_10
value: 13.547
- type: map_at_100
value: 16.869
- type: map_at_1000
value: 18.242
- type: map_at_3
value: 9.917
- type: map_at_5
value: 11.648
- type: mrr_at_1
value: 46.44
- type: mrr_at_10
value: 55.062
- type: mrr_at_100
value: 55.513999999999996
- type: mrr_at_1000
value: 55.564
- type: mrr_at_3
value: 52.735
- type: mrr_at_5
value: 54.391
- type: ndcg_at_1
value: 44.582
- type: ndcg_at_10
value: 35.684
- type: ndcg_at_100
value: 31.913999999999998
- type: ndcg_at_1000
value: 40.701
- type: ndcg_at_3
value: 40.819
- type: ndcg_at_5
value: 39.117000000000004
- type: precision_at_1
value: 46.129999999999995
- type: precision_at_10
value: 26.687
- type: precision_at_100
value: 8.062
- type: precision_at_1000
value: 2.073
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 34.241
- type: recall_at_1
value: 5.827999999999999
- type: recall_at_10
value: 17.391000000000002
- type: recall_at_100
value: 31.228
- type: recall_at_1000
value: 63.943000000000005
- type: recall_at_3
value: 10.81
- type: recall_at_5
value: 13.618
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.02
- type: map_at_10
value: 40.054
- type: map_at_100
value: 41.318
- type: map_at_1000
value: 41.343999999999994
- type: map_at_3
value: 35.221999999999994
- type: map_at_5
value: 38.057
- type: mrr_at_1
value: 27.230999999999998
- type: mrr_at_10
value: 42.315999999999995
- type: mrr_at_100
value: 43.254
- type: mrr_at_1000
value: 43.272
- type: mrr_at_3
value: 38.176
- type: mrr_at_5
value: 40.64
- type: ndcg_at_1
value: 27.230999999999998
- type: ndcg_at_10
value: 48.551
- type: ndcg_at_100
value: 53.737
- type: ndcg_at_1000
value: 54.313
- type: ndcg_at_3
value: 39.367999999999995
- type: ndcg_at_5
value: 44.128
- type: precision_at_1
value: 27.230999999999998
- type: precision_at_10
value: 8.578
- type: precision_at_100
value: 1.145
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 18.704
- type: precision_at_5
value: 13.927999999999999
- type: recall_at_1
value: 24.02
- type: recall_at_10
value: 72.258
- type: recall_at_100
value: 94.489
- type: recall_at_1000
value: 98.721
- type: recall_at_3
value: 48.373
- type: recall_at_5
value: 59.388
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.476
- type: map_at_10
value: 84.41300000000001
- type: map_at_100
value: 85.036
- type: map_at_1000
value: 85.055
- type: map_at_3
value: 81.45599999999999
- type: map_at_5
value: 83.351
- type: mrr_at_1
value: 81.07
- type: mrr_at_10
value: 87.408
- type: mrr_at_100
value: 87.509
- type: mrr_at_1000
value: 87.51
- type: mrr_at_3
value: 86.432
- type: mrr_at_5
value: 87.128
- type: ndcg_at_1
value: 81.13
- type: ndcg_at_10
value: 88.18599999999999
- type: ndcg_at_100
value: 89.401
- type: ndcg_at_1000
value: 89.515
- type: ndcg_at_3
value: 85.332
- type: ndcg_at_5
value: 86.97
- type: precision_at_1
value: 81.13
- type: precision_at_10
value: 13.361
- type: precision_at_100
value: 1.5230000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 37.31
- type: precision_at_5
value: 24.548000000000002
- type: recall_at_1
value: 70.476
- type: recall_at_10
value: 95.3
- type: recall_at_100
value: 99.46000000000001
- type: recall_at_1000
value: 99.96000000000001
- type: recall_at_3
value: 87.057
- type: recall_at_5
value: 91.739
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 55.36775089400664
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 60.05041008018361
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.743
- type: map_at_10
value: 12.171
- type: map_at_100
value: 14.174999999999999
- type: map_at_1000
value: 14.446
- type: map_at_3
value: 8.698
- type: map_at_5
value: 10.444
- type: mrr_at_1
value: 23.400000000000002
- type: mrr_at_10
value: 34.284
- type: mrr_at_100
value: 35.400999999999996
- type: mrr_at_1000
value: 35.451
- type: mrr_at_3
value: 31.167
- type: mrr_at_5
value: 32.946999999999996
- type: ndcg_at_1
value: 23.400000000000002
- type: ndcg_at_10
value: 20.169999999999998
- type: ndcg_at_100
value: 27.967
- type: ndcg_at_1000
value: 32.982
- type: ndcg_at_3
value: 19.308
- type: ndcg_at_5
value: 16.837
- type: precision_at_1
value: 23.400000000000002
- type: precision_at_10
value: 10.41
- type: precision_at_100
value: 2.162
- type: precision_at_1000
value: 0.338
- type: precision_at_3
value: 18.067
- type: precision_at_5
value: 14.78
- type: recall_at_1
value: 4.743
- type: recall_at_10
value: 21.098
- type: recall_at_100
value: 43.85
- type: recall_at_1000
value: 68.60000000000001
- type: recall_at_3
value: 10.993
- type: recall_at_5
value: 14.998000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 81.129376905658
- type: cos_sim_spearman
value: 74.18938626206575
- type: euclidean_pearson
value: 77.95192851803141
- type: euclidean_spearman
value: 74.18938626206575
- type: manhattan_pearson
value: 77.97718819383338
- type: manhattan_spearman
value: 74.20580317409417
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 78.36913772828827
- type: cos_sim_spearman
value: 73.22311186990363
- type: euclidean_pearson
value: 74.45263405031004
- type: euclidean_spearman
value: 73.22311186990363
- type: manhattan_pearson
value: 74.56201270071791
- type: manhattan_spearman
value: 73.26490493774821
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.79920796384403
- type: cos_sim_spearman
value: 84.77145185366201
- type: euclidean_pearson
value: 83.90638366191354
- type: euclidean_spearman
value: 84.77145185366201
- type: manhattan_pearson
value: 83.83788216629048
- type: manhattan_spearman
value: 84.70515987131665
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.18883765092875
- type: cos_sim_spearman
value: 79.9948128016449
- type: euclidean_pearson
value: 81.57436738666773
- type: euclidean_spearman
value: 79.9948128016449
- type: manhattan_pearson
value: 81.55274202648187
- type: manhattan_spearman
value: 79.99854975019382
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.89669110871021
- type: cos_sim_spearman
value: 87.26758456901442
- type: euclidean_pearson
value: 86.62614163641416
- type: euclidean_spearman
value: 87.26758456901442
- type: manhattan_pearson
value: 86.58584490012353
- type: manhattan_spearman
value: 87.20340001562076
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 81.983023415916
- type: cos_sim_spearman
value: 82.31169002657151
- type: euclidean_pearson
value: 81.52305092886222
- type: euclidean_spearman
value: 82.31169002657151
- type: manhattan_pearson
value: 81.63024996600281
- type: manhattan_spearman
value: 82.44579116264026
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.27779520541694
- type: cos_sim_spearman
value: 89.54137104681308
- type: euclidean_pearson
value: 88.99136079955996
- type: euclidean_spearman
value: 89.54137104681308
- type: manhattan_pearson
value: 88.95980417618277
- type: manhattan_spearman
value: 89.55178819334718
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 66.50806758829178
- type: cos_sim_spearman
value: 65.92675365587571
- type: euclidean_pearson
value: 67.09216876696559
- type: euclidean_spearman
value: 65.92675365587571
- type: manhattan_pearson
value: 67.37398716891478
- type: manhattan_spearman
value: 66.34811143508206
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.557575753862
- type: cos_sim_spearman
value: 83.95859527071087
- type: euclidean_pearson
value: 83.77287626715369
- type: euclidean_spearman
value: 83.95859527071087
- type: manhattan_pearson
value: 83.7898033034244
- type: manhattan_spearman
value: 83.94860981294184
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.90679624144718
- type: mrr
value: 94.33150183150182
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 56.81699999999999
- type: map_at_10
value: 67.301
- type: map_at_100
value: 67.73599999999999
- type: map_at_1000
value: 67.757
- type: map_at_3
value: 64.865
- type: map_at_5
value: 66.193
- type: mrr_at_1
value: 59.667
- type: mrr_at_10
value: 68.324
- type: mrr_at_100
value: 68.66
- type: mrr_at_1000
value: 68.676
- type: mrr_at_3
value: 66.556
- type: mrr_at_5
value: 67.472
- type: ndcg_at_1
value: 59.667
- type: ndcg_at_10
value: 71.982
- type: ndcg_at_100
value: 74.149
- type: ndcg_at_1000
value: 74.60799999999999
- type: ndcg_at_3
value: 67.796
- type: ndcg_at_5
value: 69.64099999999999
- type: precision_at_1
value: 59.667
- type: precision_at_10
value: 9.633
- type: precision_at_100
value: 1.08
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 26.889000000000003
- type: precision_at_5
value: 17.467
- type: recall_at_1
value: 56.81699999999999
- type: recall_at_10
value: 85.18900000000001
- type: recall_at_100
value: 95.6
- type: recall_at_1000
value: 99.0
- type: recall_at_3
value: 73.617
- type: recall_at_5
value: 78.444
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83465346534653
- type: cos_sim_ap
value: 95.93387984443646
- type: cos_sim_f1
value: 91.49261334691798
- type: cos_sim_precision
value: 93.25025960539979
- type: cos_sim_recall
value: 89.8
- type: dot_accuracy
value: 99.83465346534653
- type: dot_ap
value: 95.93389375761485
- type: dot_f1
value: 91.49261334691798
- type: dot_precision
value: 93.25025960539979
- type: dot_recall
value: 89.8
- type: euclidean_accuracy
value: 99.83465346534653
- type: euclidean_ap
value: 95.93389375761487
- type: euclidean_f1
value: 91.49261334691798
- type: euclidean_precision
value: 93.25025960539979
- type: euclidean_recall
value: 89.8
- type: manhattan_accuracy
value: 99.83564356435643
- type: manhattan_ap
value: 95.89877504534601
- type: manhattan_f1
value: 91.53061224489795
- type: manhattan_precision
value: 93.4375
- type: manhattan_recall
value: 89.7
- type: max_accuracy
value: 99.83564356435643
- type: max_ap
value: 95.93389375761487
- type: max_f1
value: 91.53061224489795
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 62.2780055191805
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.94461701798904
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.865789666749535
- type: mrr
value: 50.61783804430863
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.97703436199298
- type: cos_sim_spearman
value: 30.71880290978946
- type: dot_pearson
value: 29.977036284086818
- type: dot_spearman
value: 30.71880290978946
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22799999999999998
- type: map_at_10
value: 1.559
- type: map_at_100
value: 8.866
- type: map_at_1000
value: 23.071
- type: map_at_3
value: 0.592
- type: map_at_5
value: 0.906
- type: mrr_at_1
value: 84.0
- type: mrr_at_10
value: 88.567
- type: mrr_at_100
value: 88.748
- type: mrr_at_1000
value: 88.748
- type: mrr_at_3
value: 87.667
- type: mrr_at_5
value: 88.067
- type: ndcg_at_1
value: 73.0
- type: ndcg_at_10
value: 62.202999999999996
- type: ndcg_at_100
value: 49.66
- type: ndcg_at_1000
value: 48.760999999999996
- type: ndcg_at_3
value: 67.52
- type: ndcg_at_5
value: 64.80799999999999
- type: precision_at_1
value: 84.0
- type: precision_at_10
value: 65.4
- type: precision_at_100
value: 51.72
- type: precision_at_1000
value: 22.014
- type: precision_at_3
value: 74.0
- type: precision_at_5
value: 69.19999999999999
- type: recall_at_1
value: 0.22799999999999998
- type: recall_at_10
value: 1.7680000000000002
- type: recall_at_100
value: 12.581999999999999
- type: recall_at_1000
value: 46.883
- type: recall_at_3
value: 0.618
- type: recall_at_5
value: 0.9690000000000001
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.295
- type: map_at_10
value: 7.481
- type: map_at_100
value: 13.120999999999999
- type: map_at_1000
value: 14.863999999999999
- type: map_at_3
value: 3.266
- type: map_at_5
value: 4.662
- type: mrr_at_1
value: 14.285999999999998
- type: mrr_at_10
value: 31.995
- type: mrr_at_100
value: 33.415
- type: mrr_at_1000
value: 33.432
- type: mrr_at_3
value: 27.551
- type: mrr_at_5
value: 30.306
- type: ndcg_at_1
value: 11.224
- type: ndcg_at_10
value: 19.166
- type: ndcg_at_100
value: 31.86
- type: ndcg_at_1000
value: 44.668
- type: ndcg_at_3
value: 17.371
- type: ndcg_at_5
value: 18.567
- type: precision_at_1
value: 14.285999999999998
- type: precision_at_10
value: 18.98
- type: precision_at_100
value: 7.041
- type: precision_at_1000
value: 1.555
- type: precision_at_3
value: 19.728
- type: precision_at_5
value: 20.816000000000003
- type: recall_at_1
value: 1.295
- type: recall_at_10
value: 14.482000000000001
- type: recall_at_100
value: 45.149
- type: recall_at_1000
value: 84.317
- type: recall_at_3
value: 4.484
- type: recall_at_5
value: 7.7170000000000005
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 72.96340000000001
- type: ap
value: 15.62835559397026
- type: f1
value: 56.42561616707867
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 55.280135823429546
- type: f1
value: 55.61428067547153
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 45.426677723253555
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 84.57411933003517
- type: cos_sim_ap
value: 69.68254951354992
- type: cos_sim_f1
value: 65.05232416646386
- type: cos_sim_precision
value: 60.36585365853659
- type: cos_sim_recall
value: 70.52770448548813
- type: dot_accuracy
value: 84.57411933003517
- type: dot_ap
value: 69.68256519978905
- type: dot_f1
value: 65.05232416646386
- type: dot_precision
value: 60.36585365853659
- type: dot_recall
value: 70.52770448548813
- type: euclidean_accuracy
value: 84.57411933003517
- type: euclidean_ap
value: 69.6825655240522
- type: euclidean_f1
value: 65.05232416646386
- type: euclidean_precision
value: 60.36585365853659
- type: euclidean_recall
value: 70.52770448548813
- type: manhattan_accuracy
value: 84.5502771651666
- type: manhattan_ap
value: 69.61700491283233
- type: manhattan_f1
value: 64.83962148211872
- type: manhattan_precision
value: 60.68553025074765
- type: manhattan_recall
value: 69.6042216358839
- type: max_accuracy
value: 84.57411933003517
- type: max_ap
value: 69.6825655240522
- type: max_f1
value: 65.05232416646386
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.80350836341057
- type: cos_sim_ap
value: 85.41051415803449
- type: cos_sim_f1
value: 77.99305633329602
- type: cos_sim_precision
value: 75.70113776360607
- type: cos_sim_recall
value: 80.42808746535263
- type: dot_accuracy
value: 88.80350836341057
- type: dot_ap
value: 85.41051488820463
- type: dot_f1
value: 77.99305633329602
- type: dot_precision
value: 75.70113776360607
- type: dot_recall
value: 80.42808746535263
- type: euclidean_accuracy
value: 88.80350836341057
- type: euclidean_ap
value: 85.41051374760137
- type: euclidean_f1
value: 77.99305633329602
- type: euclidean_precision
value: 75.70113776360607
- type: euclidean_recall
value: 80.42808746535263
- type: manhattan_accuracy
value: 88.74529436876625
- type: manhattan_ap
value: 85.38380242074525
- type: manhattan_f1
value: 78.02957839746892
- type: manhattan_precision
value: 74.71466816964914
- type: manhattan_recall
value: 81.65229442562365
- type: max_accuracy
value: 88.80350836341057
- type: max_ap
value: 85.41051488820463
- type: max_f1
value: 78.02957839746892
---
# nomic-embed-text-v1-unsupervised: A Reproducible Long Context (8192) Text Embedder
`nomic-embed-text-v1-unsupervised` is an 8192-context-length text encoder. This is the checkpoint after the contrastive pretraining stage of the multi-stage contrastive training of the
[final model](https://huggingface.co/nomic-ai/nomic-embed-text-v1). The purpose of releasing this checkpoint is to open-source training artifacts from our Nomic Embed Text tech report, available [here](https://arxiv.org/pdf/2402.01613).
If you want to use a model to extract embeddings, we suggest using [nomic-embed-text-v1](https://huggingface.co/nomic-ai/nomic-embed-text-v1).
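For reference, a minimal way to extract embeddings with that model could look like the sketch below. This is a rough illustration rather than an official snippet from this repository: nomic-embed models ship custom modeling code, so `trust_remote_code=True` is needed, and inputs are expected to carry a task prefix such as `search_document:` or `search_query:` (see the nomic-embed-text-v1 card for details).
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# Task prefixes tell the model whether a text is a document to index or a query.
embeddings = model.encode([
    "search_document: Nomic Embed is a long-context (8192 token) text encoder.",
    "search_query: long context embedding model",
])
print(embeddings.shape)
```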
# Join the Nomic Community
- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
GPT4All-Community/Phi-3.1-mini-128k-instruct-GGUF | GPT4All-Community | text-generation | [
"transformers",
"gguf",
"text-generation-inference",
"GGUF",
"GPT4All-community",
"GPT4All",
"nlp",
"code",
"text-generation",
"en",
"license:mit",
"region:us",
"conversational"
] | 2024-07-31T00:38:31 | 2024-08-13T08:40:44 | 1,032 | 2 | ---
base_model: Microsoft/Phi-3-Mini-128K-Instruct
language:
- en
library_name: transformers
license: mit
license_link: https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/LICENSE
model_name: Phi-3-Mini-128K-Instruct
pipeline_tag: text-generation
tags:
- text-generation-inference
- transformers
- GGUF
- GPT4All-community
- GPT4All
- nlp
- code
inference: false
model_creator: Microsoft
model_type: phi3
quantized_by: ThiloteE
---
> [!NOTE]
> This is a model that is assumed to perform well but may require more testing and user feedback. Be aware that only models featured within the GUI of GPT4All are curated and officially supported by Nomic. Use at your own risk.
# About
<!-- ### quantize_version: 3 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
- Static quants of https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/ at commit [d548c23](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/commit/d548c233192db00165d842bf8edff054bb3212f8)
- Quantized by [ThiloteE](https://huggingface.co/ThiloteE) with llama.cpp commit [c3776ca](https://github.com/ggerganov/llama.cpp/commit/c3776cacabce2ee35f172fb72be7a519752125fa)
# Notes
These quants were created with a customized configuration that has been proven not to cause visible end-of-string (eos) tokens during inference with [GPT4All](https://www.nomic.ai/gpt4all).
The config.json, generation_config.json and tokenizer_config.json differ from the configuration found in the original model's repository at the time these quants were created.
# Prompt Template (for GPT4All)
Example System Prompt:
```Markdown
<|system|>
You are a helpful assistant.<|end|>
```
Chat Template:
```Markdown
<|user|>
%1<|end|>
<|assistant|>
%2<|end|>
```
Do not omit the newlines at the end! Have a look at the raw README.md file, as it differs from the rendered output in the model card.
# Context Length
`131072`
Use a lower value during inference if you do not have enough RAM or VRAM.
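For example, with the GPT4All Python bindings the context window can be set when loading the model. The sketch below is an illustration only: it assumes a recent `gpt4all` release that exposes an `n_ctx` argument, and that the Q4_0 file from the table below has already been downloaded (the filename is taken from the quant links and may differ on your machine).
```python
from gpt4all import GPT4All

# Load the Q4_0 quant with a reduced context window to save RAM/VRAM.
model = GPT4All("Phi-3-Mini-128K-Instruct-Q4_0.gguf", n_ctx=8192)

with model.chat_session():
    print(model.generate("Why does a smaller context window use less memory?", max_tokens=128))
```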
# Provided Quants
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/GPT4All-Community/Phi-3-Mini-128K-Instruct-GGUF/resolve/main/Phi-3-Mini-128K-Instruct-Q4_0.gguf) | Q4_0 | 4.9 | fast, recommended |
| [GGUF](https://huggingface.co/GPT4All-Community/Phi-3-Mini-128K-Instruct-GGUF/resolve/main/Phi-3-Mini-128K-Instruct-F16.gguf) | f16 | 7.7 | 16 bpw, overkill |
# About GGUF
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/DiscoLM_German_7b_v1-GGUF) for
more details, including on how to concatenate multi-part files.
Here is a handy graph by ikawrakow comparing some quant types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
# Thanks
I thank Mradermacher and TheBloke for the inspiration for this model card and for their contributions to open source, and 3Simplex for lots of help along the way.
Shoutout to the GPT4All and llama.cpp communities :-)
------
<!-- footer end -->
<!-- original-model-card start -->
------
------
# Original Model card:
## Model Summary
The Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained using the Phi-3 datasets.
This dataset includes both synthetic data and filtered publicly available website data, with an emphasis on high-quality and reasoning-dense properties.
The model belongs to the Phi-3 family with the Mini version in two variants [4K](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) and [128K](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) which is the context length (in tokens) that it can support.
After initial training, the model underwent a post-training process that involved supervised fine-tuning and direct preference optimization to enhance its ability to follow instructions and adhere to safety measures.
When evaluated against benchmarks that test common sense, language understanding, mathematics, coding, long-term context, and logical reasoning, the Phi-3 Mini-128K-Instruct demonstrated robust and state-of-the-art performance among models with fewer than 13 billion parameters.
Resources and Technical Documentation:
🏡 [Phi-3 Portal](https://azure.microsoft.com/en-us/products/phi-3) <br>
📰 [Phi-3 Microsoft Blog](https://aka.ms/Phi-3Build2024) <br>
📖 [Phi-3 Technical Report](https://aka.ms/phi3-tech-report) <br>
🛠️ [Phi-3 on Azure AI Studio](https://aka.ms/phi3-azure-ai) <br>
👩🍳 [Phi-3 Cookbook](https://github.com/microsoft/Phi-3CookBook) <br>
🖥️ [Try It](https://aka.ms/try-phi3)
| | Short Context | Long Context |
| :- | :- | :- |
| Mini | 4K [[HF]](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx) ; [[GGUF]](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf) | 128K [[HF]](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct-onnx)|
| Small | 8K [[HF]](https://huggingface.co/microsoft/Phi-3-small-8k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-small-8k-instruct-onnx-cuda) | 128K [[HF]](https://huggingface.co/microsoft/Phi-3-small-128k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-small-128k-instruct-onnx-cuda)|
| Medium | 4K [[HF]](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct-onnx-cuda) | 128K [[HF]](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct-onnx-cuda)|
| Vision | | 128K [[HF]](https://huggingface.co/microsoft/Phi-3-vision-128k-instruct) ; [[ONNX]](https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-cuda)|
## Intended Uses
**Primary use cases**
The model is intended for commercial and research use in English. The model provides uses for applications which require:
1) Memory/compute constrained environments
2) Latency bound scenarios
3) Strong reasoning (especially code, math and logic)
Our model is designed to accelerate research on language and multimodal models, for use as a building block for generative AI powered features.
**Use case considerations**
Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high-risk scenarios. Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case.
Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.
## Release Notes
This is an update over the original instruction-tuned Phi-3-mini release based on valuable customer feedback.
The model used additional post-training data, leading to substantial gains in long-context understanding, instruction following, and structured output.
We also improved multi-turn conversation quality, added explicit support for the <|system|> tag, and significantly improved reasoning capability.
We believe most use cases will benefit from this release, but we encourage users to test in their particular AI applications.
We appreciate the enthusiastic adoption of the Phi-3 model family, and continue to welcome all feedback from the community.
The tables below highlight the new release's improvements in instruction following, structured output, reasoning, and long-context understanding on our public and internal benchmark datasets.
| Benchmarks | Original | June 2024 Update |
| :- | :- | :- |
| Instruction Extra Hard | 5.7 | 5.9 |
| Instruction Hard | 5.0 | 5.2 |
| JSON Structure Output | 1.9 | 60.1 |
| XML Structure Output | 47.8 | 52.9 |
| GPQA | 25.9 | 29.7 |
| MMLU | 68.1 | 69.7 |
| **Average** | **25.7** | **37.3** |
RULER: a retrieval-based benchmark for long context understanding
| Model | 4K | 8K | 16K | 32K | 64K | 128K | Average |
| :-------------------| :------| :------| :------| :------| :------| :------| :---------|
| Original | 86.7 | 78.1 | 75.6 | 70.3 | 58.9 | 43.3 | **68.8** |
| June 2024 Update | 92.4 | 91.1 | 90.8 | 87.9 | 79.8 | 65.6 | **84.6** |
RepoQA: a benchmark for long context code understanding
| Model | Python | C++ | Rust | Java | TypeScript | Average |
| :-------------------| :--------| :-----| :------| :------| :------------| :---------|
| Original | 27 | 29 | 40 | 33 | 33 | **32.4** |
| June 2024 Update | 85 | 63 | 72 | 93 | 72 | **77** |
Notes: If users would like to check out the previous version, use the git commit id **bb5bf1e4001277a606e11debca0ef80323e5f824**. For model conversion, e.g. to GGUF and other formats, we invite the community to experiment with various approaches and share their valuable feedback. Let's innovate together!
## How to Use
Phi-3 Mini-128K-Instruct has been integrated in the development version (4.41.3) of `transformers`. Until the official version is released through `pip`, ensure that you are doing one of the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
* Update your local `transformers` to the development version: `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`. The previous command is an alternative to cloning and installing from the source.
The current `transformers` version can be verified with: `pip list | grep transformers`.
Examples of required packages:
```
flash_attn==2.5.8
torch==2.3.1
accelerate==0.31.0
transformers==4.41.2
```
Phi-3 Mini-128K-Instruct is also available in [Azure AI Studio](https://aka.ms/try-phi3)
### Tokenizer
Phi-3 Mini-128K-Instruct supports a vocabulary size of up to `32064` tokens. The [tokenizer files](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/blob/main/added_tokens.json) already provide placeholder tokens that can be used for downstream fine-tuning, but they can also be extended up to the model's vocabulary size.
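As an illustration, extending the tokenizer follows the generic `transformers` pattern below. The added token is a made-up placeholder rather than something defined by this model, and unused placeholder slots already exist in the vocabulary, so only resize beyond them if you really need to.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-128k-instruct",
    trust_remote_code=True,
)

# Hypothetical domain-specific token; resize the embeddings only if new rows are needed.
num_added = tokenizer.add_tokens(["<|my_custom_token|>"])
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))
```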
### Chat Format
Given the nature of the training data, the Phi-3 Mini-128K-Instruct model is best suited for prompts using the chat format as follows.
You can provide the prompt as a question with a generic template as follows:
```markdown
<|system|>
You are a helpful assistant.<|end|>
<|user|>
Question?<|end|>
<|assistant|>
```
For example:
```markdown
<|system|>
You are a helpful assistant.<|end|>
<|user|>
How to explain Internet for a medieval knight?<|end|>
<|assistant|>
```
where the model generates the text after `<|assistant|>`. For a few-shot prompt, it can be formatted as follows:
```markdown
<|system|>
You are a helpful travel assistant.<|end|>
<|user|>
I am going to Paris, what should I see?<|end|>
<|assistant|>
Paris, the capital of France, is known for its stunning architecture, art museums, historical landmarks, and romantic atmosphere. Here are some of the top attractions to see in Paris:\n\n1. The Eiffel Tower: The iconic Eiffel Tower is one of the most recognizable landmarks in the world and offers breathtaking views of the city.\n2. The Louvre Museum: The Louvre is one of the world's largest and most famous museums, housing an impressive collection of art and artifacts, including the Mona Lisa.\n3. Notre-Dame Cathedral: This beautiful cathedral is one of the most famous landmarks in Paris and is known for its Gothic architecture and stunning stained glass windows.\n\nThese are just a few of the many attractions that Paris has to offer. With so much to see and do, it's no wonder that Paris is one of the most popular tourist destinations in the world."<|end|>
<|user|>
What is so great about #1?<|end|>
<|assistant|>
```
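If you prefer not to assemble this format by hand, the tokenizer's chat template should produce the same layout. The snippet below is a small sketch and assumes the chat template shipped with the repository matches the format above.
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How to explain Internet for a medieval knight?"},
]
# add_generation_prompt=True appends the <|assistant|> tag so the model continues from there.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```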
### Sample inference code
These code snippets show how to quickly get started with running the model on a GPU:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
torch.random.manual_seed(0)
model = AutoModelForCausalLM.from_pretrained(
"microsoft/Phi-3-mini-128k-instruct",
device_map="cuda",
torch_dtype="auto",
trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
messages = [
{"role": "system", "content": "You are a helpful AI assistant."},
{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
{"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
{"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
]
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
)
generation_args = {
"max_new_tokens": 500,
"return_full_text": False,
"temperature": 0.0,
"do_sample": False,
}
output = pipe(messages, **generation_args)
print(output[0]['generated_text'])
```
Notes: If you want to use flash attention, call _AutoModelForCausalLM.from_pretrained()_ with _attn_implementation="flash_attention_2"_
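For example, a minimal sketch (it assumes a FlashAttention-2-capable GPU and the `flash-attn` package installed; on V100 or older GPUs use `attn_implementation="eager"` instead, as noted in the Hardware section below):
```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-128k-instruct",
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
    attn_implementation="flash_attention_2",  # or "eager" on older GPUs
)
```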
## Responsible AI Considerations
Like other language models, the Phi series models can potentially behave in ways that are unfair, unreliable, or offensive. Some of the limiting behaviors to be aware of include:
+ Quality of Service: the Phi models are trained primarily on English text. Languages other than English will experience worse performance. English language varieties with less representation in the training data might experience worse performance than standard American English.
+ Representation of Harms & Perpetuation of Stereotypes: These models can over- or under-represent groups of people, erase representation of some groups, or reinforce demeaning or negative stereotypes. Despite safety post-training, these limitations may still be present due to differing levels of representation of different groups or prevalence of examples of negative stereotypes in training data that reflect real-world patterns and societal biases.
+ Inappropriate or Offensive Content: these models may produce other types of inappropriate or offensive content, which may make it inappropriate to deploy for sensitive contexts without additional mitigations that are specific to the use case.
+ Information Reliability: Language models can generate nonsensical content or fabricate content that might sound reasonable but is inaccurate or outdated.
+ Limited Scope for Code: The majority of Phi-3 training data is based in Python and uses common packages such as "typing, math, random, collections, datetime, itertools". If the model generates Python scripts that utilize other packages or scripts in other languages, we strongly recommend users manually verify all API uses.
Developers should apply responsible AI best practices and are responsible for ensuring that a specific use case complies with relevant laws and regulations (e.g. privacy, trade, etc.). Important areas for consideration include:
+ Allocation: Models may not be suitable for scenarios that could have consequential impact on legal status or the allocation of resources or life opportunities (ex: housing, employment, credit, etc.) without further assessments and additional debiasing techniques.
+ High-Risk Scenarios: Developers should assess suitability of using models in high-risk scenarios where unfair, unreliable or offensive outputs might be extremely costly or lead to harm. This includes providing advice in sensitive or expert domains where accuracy and reliability are critical (ex: legal or health advice). Additional safeguards should be implemented at the application level according to the deployment context.
+ Misinformation: Models may produce inaccurate information. Developers should follow transparency best practices and inform end-users they are interacting with an AI system. At the application level, developers can build feedback mechanisms and pipelines to ground responses in use-case specific, contextual information, a technique known as Retrieval Augmented Generation (RAG).
+ Generation of Harmful Content: Developers should assess outputs for their context and use available safety classifiers or custom solutions appropriate for their use case.
+ Misuse: Other forms of misuse such as fraud, spam, or malware production may be possible, and developers should ensure that their applications do not violate applicable laws and regulations.
## Training
### Model
* Architecture: Phi-3 Mini-128K-Instruct has 3.8B parameters and is a dense decoder-only Transformer model. The model is fine-tuned with supervised fine-tuning (SFT) and Direct Preference Optimization (DPO) to ensure alignment with human preferences and safety guidelines.
* Inputs: Text. It is best suited for prompts using chat format.
* Context length: 128K tokens
* GPUs: 512 H100-80G
* Training time: 10 days
* Training data: 4.9T tokens
* Outputs: Generated text in response to the input
* Dates: Our models were trained between May and June 2024
* Status: This is a static model trained on an offline dataset with cutoff date October 2023. Future versions of the tuned models may be released as we improve models.
* Release dates: June, 2024.
### Datasets
Our training data includes a wide variety of sources, totaling 4.9 trillion tokens, and is a combination of
1) Publicly available documents filtered rigorously for quality, selected high-quality educational data, and code;
2) Newly created synthetic, “textbook-like” data for the purpose of teaching math, coding, common sense reasoning, general knowledge of the world (science, daily activities, theory of mind, etc.);
3) High quality chat format supervised data covering various topics to reflect human preferences on different aspects such as instruct-following, truthfulness, honesty and helpfulness.
We focus on the quality of data that could potentially improve the reasoning ability of the model, and we filter the publicly available documents to contain the correct level of knowledge. As an example, the result of a Premier League game on a particular day might be good training data for frontier models, but we need to remove such information to leave more model capacity for reasoning in small models. More details about the data can be found in the [Phi-3 Technical Report](https://aka.ms/phi3-tech-report).
### Fine-tuning
A basic example of multi-GPUs supervised fine-tuning (SFT) with TRL and Accelerate modules is provided [here](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/sample_finetune.py).
## Benchmarks
We report the results under completion format for Phi-3-Mini-128K-Instruct on standard open-source benchmarks measuring the model's reasoning ability (both common sense reasoning and logical reasoning). We compare to Mistral-7b-v0.1, Mixtral-8x7b, Gemma 7B, Llama-3-8B-Instruct, and GPT-3.5.
All the reported numbers are produced with the exact same pipeline to ensure that the numbers are comparable. These numbers might differ from other published numbers due to slightly different choices in the evaluation.
As is now standard, we use few-shot prompts to evaluate the models, at temperature 0.
The prompts and number of shots are part of a Microsoft internal tool to evaluate language models, and in particular we did no optimization to the pipeline for Phi-3.
More specifically, we do not change prompts, pick different few-shot examples, change prompt format, or do any other form of optimization for the model.
The number of k–shot examples is listed per-benchmark.
| Category | Benchmark | Phi-3-Mini-128K-Ins | Gemma-7B | Mistral-7B | Mixtral-8x7B | Llama-3-8B-Ins | GPT3.5-Turbo-1106 |
| :----------| :-----------| :---------------------| :----------| :------------| :--------------| :----------------| :-------------------|
| Popular aggregated benchmark | AGI Eval <br>5-shot| 39.5 | 42.1 | 35.1 | 45.2 | 42 | 48.4 |
| | MMLU <br>5-shot | 69.7 | 63.6 | 61.7 | 70.5 | 66.5 | 71.4 |
| | BigBench Hard <br>3-shot | 72.1 | 59.6 | 57.3 | 69.7 | 51.5 | 68.3 |
| Language Understanding | ANLI <br>7-shot | 52.3 | 48.7 | 47.1 | 55.2 | 57.3 | 58.1 |
| | HellaSwag <br>5-shot | 70.5 | 49.8 | 58.5 | 70.4 | 71.1 | 78.8 |
| Reasoning | ARC Challenge <br>10-shot | 85.5 | 78.3 | 78.6 | 87.3 | 82.8 | 87.4 |
| | BoolQ <br>0-shot | 77.1 | 66 | 72.2 | 76.6 | 80.9 | 79.1 |
| | MedQA <br>2-shot | 56.4 | 49.6 | 50 | 62.2 | 60.5 | 63.4 |
| | OpenBookQA <br>10-shot | 78.8 | 78.6 | 79.8 | 85.8 | 82.6 | 86 |
| | PIQA <br>5-shot | 80.1 | 78.1 | 77.7 | 86 | 75.7 | 86.6 |
| | GPQA <br>0-shot | 29.7 | 2.9 | 15 | 6.9 | 32.4 | 29.9 |
| | Social IQA <br>5-shot | 74.7 | 65.5 | 74.6 | 75.9 | 73.9 | 68.3 |
| | TruthfulQA (MC2) <br>10-shot | 64.8 | 52.1 | 53 | 60.1 | 63.2 | 67.7 |
| | WinoGrande <br>5-shot | 71.0 | 55.6 | 54.2 | 62 | 65 | 68.8 |
| Factual Knowledge | TriviaQA <br>5-shot | 57.8 | 72.3 | 75.2 | 82.2 | 67.7 | 85.8 |
| Math | GSM8K CoT <br>8-shot | 85.3 | 59.8 | 46.4 | 64.7 | 77.4 | 78.1 |
| Code Generation | HumanEval <br>0-shot | 60.4 | 34.1 | 28.0 | 37.8 | 60.4 | 62.2 |
| | MBPP <br>3-shot | 70.0 | 51.5 | 50.8 | 60.2 | 67.7 | 77.8 |
| **Average** | | **66.4** | **56.0** | **56.4** | **64.4** | **65.5** | **70.3** |
**Long Context**: Phi-3 Mini-128K-Instruct supports 128K context length, therefore the model is capable of several long context tasks including long document/meeting summarization, long document QA.
| Benchmark | Phi-3 Mini-128K-Instruct | Mistral-7B | Mixtral 8x7B | LLaMA-3-8B-Instruct |
| :---------------| :--------------------------|:------------|:--------------|:---------------------|
| GovReport | 25.3 | 4.9 | 20.3 | 10.3 |
| QMSum | 21.9 | 15.5 | 20.6 | 2.9 |
| Qasper | 41.6 | 23.5 | 26.6 | 8.1 |
| SQuALITY | 24.1 | 14.7 | 16.2 | 25 |
| SummScreenFD | 16.8 | 9.3 | 11.3 | 5.1 |
| **Average** | **25.9** | **13.6** | **19.0** | **10.3** |
We take a closer look at different categories across 100 public benchmark datasets at the table below:
| Category | Phi-3-Mini-128K-Instruct | Gemma-7B | Mistral-7B | Mixtral 8x7B | Llama-3-8B-Instruct | GPT-3.5-Turbo |
|:----------|:--------------------------|:----------|:------------|:--------------|:---------------------|:---------------|
| Popular aggregated benchmark | 60.6 | 59.4 | 56.5 | 66.2 | 59.9 | 67.0 |
| Reasoning | 69.4 | 60.3 | 62.8 | 68.1 | 69.6 | 71.7 |
| Language understanding | 57.5 | 57.6 | 52.5 | 66.1 | 63.2 | 67.7 |
| Code generation | 61.0 | 45.6 | 42.9 | 52.7 | 56.4 | 70.4 |
| Math | 51.6 | 35.8 | 25.4 | 40.3 | 41.1 | 52.8 |
| Factual knowledge | 35.8 | 46.7 | 49.8 | 58.6 | 43.1 | 63.4 |
| Multilingual | 56.4 | 66.5 | 57.4 | 66.7 | 66.6 | 71.0 |
| Robustness | 61.1 | 38.4 | 40.6 | 51.0 | 64.5 | 69.3 |
Overall, the model, with only 3.8B parameters, achieves a similar level of language understanding and reasoning ability as much larger models. However, it is still fundamentally limited by its size for certain tasks. The model simply does not have the capacity to store much world knowledge, which can be seen, for example, in its low performance on TriviaQA. However, we believe this weakness can be resolved by augmenting Phi-3-Mini with a search engine.
## Cross Platform Support
[ONNX runtime](https://onnxruntime.ai/blogs/accelerating-phi-3) now supports Phi-3 mini models across platforms and hardware.
Optimized Phi-3 models are also published here in ONNX format, to run with ONNX Runtime on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets. DirectML GPU acceleration is supported for Windows desktop GPUs (AMD, Intel, and NVIDIA).
Along with DirectML, ONNX Runtime provides cross-platform support for Phi-3 mini across a range of devices (CPU, GPU, and mobile).
Here are some of the optimized configurations we have added:
1. ONNX models for int4 DML: Quantized to int4 via AWQ
2. ONNX model for fp16 CUDA
3. ONNX model for int4 CUDA: Quantized to int4 via RTN
4. ONNX model for int4 CPU and Mobile: Quantized to int4 via RTN
## Software
* [PyTorch](https://github.com/pytorch/pytorch)
* [Transformers](https://github.com/huggingface/transformers)
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)
## Hardware
Note that by default, the Phi-3 Mini-128K-Instruct model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:
* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100
If you want to run the model on:
* NVIDIA V100 or earlier generation GPUs: call AutoModelForCausalLM.from_pretrained() with attn_implementation="eager"
* Optimized inference on GPU, CPU, and Mobile: use the **ONNX** models [128K](https://aka.ms/phi3-mini-128k-instruct-onnx)
## License
The model is licensed under the [MIT license](https://huggingface.co/microsoft/Phi-3-mini-128k/resolve/main/LICENSE).
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft’s Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party’s policies.
<!-- original-model-card end -->
<!-- end -->
| [
"SUMMARIZATION"
] | [
"MEDQA"
] |
Ateeqq/Text-Rewriter-Paraphraser | Ateeqq | text2text-generation | [
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"license:openrail",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-06-02T22:14:08 | 2024-12-27T07:53:45 | 1,015 | 21 | ---
license: openrail
inference:
parameters:
num_beams: 3
num_beam_groups: 3
num_return_sequences: 1
repetition_penalty: 3
diversity_penalty: 3.01
no_repeat_ngram_size: 2
temperature: 0.8
max_length: 64
widget:
- text: 'paraphraser: Learn to build generative AI applications with an expert AWS
instructor with the 2-day Developing Generative AI Applications on AWS course.'
example_title: AWS course
- text: 'paraphraser: In healthcare, Generative AI can help generate synthetic medical
data to train machine learning models, develop new drug candidates, and design
clinical trials.'
example_title: Generative AI
- text: 'paraphraser: By leveraging prior model training through transfer learning,
fine-tuning can reduce the amount of expensive computing power and labeled data
needed to obtain large models tailored to niche use cases and business needs.'
example_title: Fine Tuning
---
# Text Rewriter Paraphraser
This repository contains a fine-tuned text-rewriting model based on T5-Base (223M parameters).
## Key Features:
* **Fine-tuned on t5-base:** Leverages the power of a pre-trained text-to-text transfer model for effective paraphrasing.
* **Large Dataset (430k examples):** Trained on a comprehensive dataset combining three open-source sources and cleaned using various techniques for optimal performance.
* **High Quality Paraphrases:** Generates paraphrases that significantly alter sentence structure while maintaining accuracy and factual correctness.
* **Non-AI Detectable:** Aims to produce paraphrases that appear natural and indistinguishable from human-written text.
**Model Performance:**
* Train Loss: 1.0645
* Validation Loss: 0.8761
## Getting Started:
The T5 model expects a task-related prefix; since this is a paraphrasing task, we add the prefix "paraphraser: ".
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
device = "cuda"
tokenizer = AutoTokenizer.from_pretrained("Ateeqq/Text-Rewriter-Paraphraser", token='your_token')
model = AutoModelForSeq2SeqLM.from_pretrained("Ateeqq/Text-Rewriter-Paraphraser", token='your_token').to(device)
def generate_paraphrases(text):
    # Prepend the task prefix the fine-tuned T5 model expects.
    input_ids = tokenizer(f'paraphraser: {text}', return_tensors="pt", padding="longest", truncation=True, max_length=64).input_ids.to(device)
    # Diverse beam search (4 beams split into 4 groups with a diversity penalty)
    # so the returned sequences differ from one another.
    outputs = model.generate(
        input_ids,
        num_beams=4,
        num_beam_groups=4,
        num_return_sequences=4,
        repetition_penalty=10.0,
        diversity_penalty=3.0,
        no_repeat_ngram_size=2,
        temperature=0.8,
        max_length=64
    )
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

text = 'By leveraging prior model training through transfer learning, fine-tuning can reduce the amount of expensive computing power and labeled data needed to obtain large models tailored to niche use cases and business needs.'
generate_paraphrases(text)
```
### Output:
```
['The fine-tuning can reduce the amount of expensive computing power and labeled data required to obtain large models adapted for niche use cases and business needs by using prior model training through transfer learning.',
'fine-tuning, by utilizing prior model training through transfer learning, can reduce the amount of expensive computing power and labeled data required to obtain large models tailored for niche use cases and business needs.',
'Fine-tunering by using prior model training through transfer learning can reduce the amount of expensive computing power and labeled data required to obtain large models adapted for niche use cases and business needs.',
'Using transfer learning to use prior model training, fine-tuning can reduce the amount of expensive computing power and labeled data required for large models that are suitable in niche usage cases or businesses.']
```
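For quick experiments, the same checkpoint can also be driven through the Hugging Face `pipeline` API. The snippet below is a minimal sketch rather than part of the original instructions: it keeps the required `paraphraser:` prefix and roughly mirrors the generation settings from the inference widget above, and the input sentence is only an illustrative example.
```python
from transformers import pipeline

# Minimal sketch: text2text-generation pipeline with the same task prefix.
paraphraser = pipeline("text2text-generation", model="Ateeqq/Text-Rewriter-Paraphraser")

result = paraphraser(
    "paraphraser: Generative AI can help generate synthetic medical data to train machine learning models.",
    num_beams=3,
    num_return_sequences=1,
    repetition_penalty=3.0,
    no_repeat_ngram_size=2,
    max_length=64,
)
print(result[0]["generated_text"])
```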
**Disclaimer:**
* Limited Use: The license grants a non-exclusive, non-transferable right to use this model, similar to Llama-3. This means you can't freely redistribute it or sell the model itself.
* Commercial Use Allowed: You can use the model for commercial purposes, provided you comply with the terms of the license agreement.
* Attribution Required: You must follow the agreement's terms regarding attribution. Use the paraphrased text responsibly and ethically, with proper attribution of the original source.
**Further Development:**
Suggestions for ongoing development and areas for future improvement are welcome in the Discussions tab. | [
"PARAPHRASING"
] | [
"MEDICAL DATA"
] |
Labib11/MUG-B-1.6 | Labib11 | sentence-similarity | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"mteb",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-05-08T15:46:01 | 2024-05-21T12:54:58 | 1,011 | 2 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
model-index:
- name: MUG-B-1.6
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en-ext)
type: mteb/amazon_counterfactual
config: en-ext
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 74.04047976011994
- type: ap
value: 23.622442298323236
- type: f1
value: 61.681362134359354
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 72.38805970149255
- type: ap
value: 35.14527522183942
- type: f1
value: 66.40004634079556
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (de)
type: mteb/amazon_counterfactual
config: de
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 54.3254817987152
- type: ap
value: 71.95259605308317
- type: f1
value: 52.50731386267296
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (ja)
type: mteb/amazon_counterfactual
config: ja
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 56.33832976445397
- type: ap
value: 12.671021199223937
- type: f1
value: 46.127586182990605
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 93.70805000000001
- type: ap
value: 90.58639913354553
- type: f1
value: 93.69822635061847
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 50.85000000000001
- type: f1
value: 49.80013009020246
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (de)
type: mteb/amazon_reviews_multi
config: de
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 27.203999999999994
- type: f1
value: 26.60134413072989
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (es)
type: mteb/amazon_reviews_multi
config: es
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 34.878
- type: f1
value: 33.072592092252314
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (fr)
type: mteb/amazon_reviews_multi
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 31.557999999999993
- type: f1
value: 30.866094552542624
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (ja)
type: mteb/amazon_reviews_multi
config: ja
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 22.706
- type: f1
value: 22.23195837325246
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 22.349999999999998
- type: f1
value: 21.80183891680617
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 41.892
- type: map_at_10
value: 57.989999999999995
- type: map_at_100
value: 58.45
- type: map_at_1000
value: 58.453
- type: map_at_20
value: 58.392999999999994
- type: map_at_3
value: 53.746
- type: map_at_5
value: 56.566
- type: mrr_at_1
value: 43.314
- type: mrr_at_10
value: 58.535000000000004
- type: mrr_at_100
value: 58.975
- type: mrr_at_1000
value: 58.977999999999994
- type: mrr_at_20
value: 58.916999999999994
- type: mrr_at_3
value: 54.303000000000004
- type: mrr_at_5
value: 57.055
- type: ndcg_at_1
value: 41.892
- type: ndcg_at_10
value: 66.176
- type: ndcg_at_100
value: 67.958
- type: ndcg_at_1000
value: 68.00699999999999
- type: ndcg_at_20
value: 67.565
- type: ndcg_at_3
value: 57.691
- type: ndcg_at_5
value: 62.766
- type: precision_at_1
value: 41.892
- type: precision_at_10
value: 9.189
- type: precision_at_100
value: 0.993
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.861
- type: precision_at_3
value: 23.044
- type: precision_at_5
value: 16.287
- type: recall_at_1
value: 41.892
- type: recall_at_10
value: 91.892
- type: recall_at_100
value: 99.289
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 97.226
- type: recall_at_3
value: 69.132
- type: recall_at_5
value: 81.437
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 49.03486273664411
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 43.04797567338598
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 64.29499572176032
- type: mrr
value: 77.28861627753592
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 89.53248242133246
- type: cos_sim_spearman
value: 88.38032705871927
- type: euclidean_pearson
value: 87.77994445569084
- type: euclidean_spearman
value: 88.38032705871927
- type: manhattan_pearson
value: 87.52369210088627
- type: manhattan_spearman
value: 88.27972235673434
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.4090909090909
- type: f1
value: 84.87743757972068
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 39.73840151083438
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.565075977998966
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 33.082
- type: map_at_10
value: 44.787
- type: map_at_100
value: 46.322
- type: map_at_1000
value: 46.446
- type: map_at_20
value: 45.572
- type: map_at_3
value: 40.913
- type: map_at_5
value: 42.922
- type: mrr_at_1
value: 40.629
- type: mrr_at_10
value: 51.119
- type: mrr_at_100
value: 51.783
- type: mrr_at_1000
value: 51.82
- type: mrr_at_20
value: 51.49700000000001
- type: mrr_at_3
value: 48.355
- type: mrr_at_5
value: 49.979
- type: ndcg_at_1
value: 40.629
- type: ndcg_at_10
value: 51.647
- type: ndcg_at_100
value: 56.923
- type: ndcg_at_1000
value: 58.682
- type: ndcg_at_20
value: 53.457
- type: ndcg_at_3
value: 46.065
- type: ndcg_at_5
value: 48.352000000000004
- type: precision_at_1
value: 40.629
- type: precision_at_10
value: 10.072000000000001
- type: precision_at_100
value: 1.5939999999999999
- type: precision_at_1000
value: 0.20600000000000002
- type: precision_at_20
value: 5.908
- type: precision_at_3
value: 22.222
- type: precision_at_5
value: 15.937000000000001
- type: recall_at_1
value: 33.082
- type: recall_at_10
value: 64.55300000000001
- type: recall_at_100
value: 86.86399999999999
- type: recall_at_1000
value: 97.667
- type: recall_at_20
value: 70.988
- type: recall_at_3
value: 48.067
- type: recall_at_5
value: 54.763
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 32.272
- type: map_at_10
value: 42.620000000000005
- type: map_at_100
value: 43.936
- type: map_at_1000
value: 44.066
- type: map_at_20
value: 43.349
- type: map_at_3
value: 39.458
- type: map_at_5
value: 41.351
- type: mrr_at_1
value: 40.127
- type: mrr_at_10
value: 48.437000000000005
- type: mrr_at_100
value: 49.096000000000004
- type: mrr_at_1000
value: 49.14
- type: mrr_at_20
value: 48.847
- type: mrr_at_3
value: 46.21
- type: mrr_at_5
value: 47.561
- type: ndcg_at_1
value: 40.127
- type: ndcg_at_10
value: 48.209999999999994
- type: ndcg_at_100
value: 52.632
- type: ndcg_at_1000
value: 54.59
- type: ndcg_at_20
value: 50.012
- type: ndcg_at_3
value: 43.996
- type: ndcg_at_5
value: 46.122
- type: precision_at_1
value: 40.127
- type: precision_at_10
value: 9.051
- type: precision_at_100
value: 1.465
- type: precision_at_1000
value: 0.193
- type: precision_at_20
value: 5.35
- type: precision_at_3
value: 21.104
- type: precision_at_5
value: 15.146
- type: recall_at_1
value: 32.272
- type: recall_at_10
value: 57.870999999999995
- type: recall_at_100
value: 76.211
- type: recall_at_1000
value: 88.389
- type: recall_at_20
value: 64.354
- type: recall_at_3
value: 45.426
- type: recall_at_5
value: 51.23799999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 40.261
- type: map_at_10
value: 53.400000000000006
- type: map_at_100
value: 54.42399999999999
- type: map_at_1000
value: 54.473000000000006
- type: map_at_20
value: 54.052
- type: map_at_3
value: 49.763000000000005
- type: map_at_5
value: 51.878
- type: mrr_at_1
value: 46.019
- type: mrr_at_10
value: 56.653
- type: mrr_at_100
value: 57.28
- type: mrr_at_1000
value: 57.303000000000004
- type: mrr_at_20
value: 57.057
- type: mrr_at_3
value: 53.971000000000004
- type: mrr_at_5
value: 55.632000000000005
- type: ndcg_at_1
value: 46.019
- type: ndcg_at_10
value: 59.597
- type: ndcg_at_100
value: 63.452
- type: ndcg_at_1000
value: 64.434
- type: ndcg_at_20
value: 61.404
- type: ndcg_at_3
value: 53.620999999999995
- type: ndcg_at_5
value: 56.688
- type: precision_at_1
value: 46.019
- type: precision_at_10
value: 9.748999999999999
- type: precision_at_100
value: 1.261
- type: precision_at_1000
value: 0.13799999999999998
- type: precision_at_20
value: 5.436
- type: precision_at_3
value: 24.075
- type: precision_at_5
value: 16.715
- type: recall_at_1
value: 40.261
- type: recall_at_10
value: 74.522
- type: recall_at_100
value: 91.014
- type: recall_at_1000
value: 98.017
- type: recall_at_20
value: 81.186
- type: recall_at_3
value: 58.72500000000001
- type: recall_at_5
value: 66.23599999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 27.666
- type: map_at_10
value: 36.744
- type: map_at_100
value: 37.794
- type: map_at_1000
value: 37.865
- type: map_at_20
value: 37.336999999999996
- type: map_at_3
value: 33.833999999999996
- type: map_at_5
value: 35.61
- type: mrr_at_1
value: 29.944
- type: mrr_at_10
value: 38.838
- type: mrr_at_100
value: 39.765
- type: mrr_at_1000
value: 39.818999999999996
- type: mrr_at_20
value: 39.373000000000005
- type: mrr_at_3
value: 36.234
- type: mrr_at_5
value: 37.844
- type: ndcg_at_1
value: 29.944
- type: ndcg_at_10
value: 41.986000000000004
- type: ndcg_at_100
value: 47.05
- type: ndcg_at_1000
value: 48.897
- type: ndcg_at_20
value: 43.989
- type: ndcg_at_3
value: 36.452
- type: ndcg_at_5
value: 39.395
- type: precision_at_1
value: 29.944
- type: precision_at_10
value: 6.4750000000000005
- type: precision_at_100
value: 0.946
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_20
value: 3.6839999999999997
- type: precision_at_3
value: 15.443000000000001
- type: precision_at_5
value: 10.96
- type: recall_at_1
value: 27.666
- type: recall_at_10
value: 56.172999999999995
- type: recall_at_100
value: 79.142
- type: recall_at_1000
value: 93.013
- type: recall_at_20
value: 63.695
- type: recall_at_3
value: 41.285
- type: recall_at_5
value: 48.36
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 17.939
- type: map_at_10
value: 27.301
- type: map_at_100
value: 28.485
- type: map_at_1000
value: 28.616000000000003
- type: map_at_20
value: 27.843
- type: map_at_3
value: 24.342
- type: map_at_5
value: 26.259
- type: mrr_at_1
value: 22.761
- type: mrr_at_10
value: 32.391
- type: mrr_at_100
value: 33.297
- type: mrr_at_1000
value: 33.361000000000004
- type: mrr_at_20
value: 32.845
- type: mrr_at_3
value: 29.498
- type: mrr_at_5
value: 31.375999999999998
- type: ndcg_at_1
value: 22.761
- type: ndcg_at_10
value: 33.036
- type: ndcg_at_100
value: 38.743
- type: ndcg_at_1000
value: 41.568
- type: ndcg_at_20
value: 34.838
- type: ndcg_at_3
value: 27.803
- type: ndcg_at_5
value: 30.781
- type: precision_at_1
value: 22.761
- type: precision_at_10
value: 6.132
- type: precision_at_100
value: 1.031
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_20
value: 3.582
- type: precision_at_3
value: 13.474
- type: precision_at_5
value: 10.123999999999999
- type: recall_at_1
value: 17.939
- type: recall_at_10
value: 45.515
- type: recall_at_100
value: 70.56700000000001
- type: recall_at_1000
value: 90.306
- type: recall_at_20
value: 51.946999999999996
- type: recall_at_3
value: 31.459
- type: recall_at_5
value: 39.007
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 31.156
- type: map_at_10
value: 42.317
- type: map_at_100
value: 43.742
- type: map_at_1000
value: 43.852000000000004
- type: map_at_20
value: 43.147999999999996
- type: map_at_3
value: 38.981
- type: map_at_5
value: 40.827000000000005
- type: mrr_at_1
value: 38.401999999999994
- type: mrr_at_10
value: 48.141
- type: mrr_at_100
value: 48.991
- type: mrr_at_1000
value: 49.03
- type: mrr_at_20
value: 48.665000000000006
- type: mrr_at_3
value: 45.684999999999995
- type: mrr_at_5
value: 47.042
- type: ndcg_at_1
value: 38.401999999999994
- type: ndcg_at_10
value: 48.541000000000004
- type: ndcg_at_100
value: 54.063
- type: ndcg_at_1000
value: 56.005
- type: ndcg_at_20
value: 50.895999999999994
- type: ndcg_at_3
value: 43.352000000000004
- type: ndcg_at_5
value: 45.769
- type: precision_at_1
value: 38.401999999999994
- type: precision_at_10
value: 8.738999999999999
- type: precision_at_100
value: 1.335
- type: precision_at_1000
value: 0.16999999999999998
- type: precision_at_20
value: 5.164
- type: precision_at_3
value: 20.468
- type: precision_at_5
value: 14.437
- type: recall_at_1
value: 31.156
- type: recall_at_10
value: 61.172000000000004
- type: recall_at_100
value: 83.772
- type: recall_at_1000
value: 96.192
- type: recall_at_20
value: 69.223
- type: recall_at_3
value: 46.628
- type: recall_at_5
value: 53.032000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 26.741999999999997
- type: map_at_10
value: 36.937
- type: map_at_100
value: 38.452
- type: map_at_1000
value: 38.557
- type: map_at_20
value: 37.858999999999995
- type: map_at_3
value: 33.579
- type: map_at_5
value: 35.415
- type: mrr_at_1
value: 32.991
- type: mrr_at_10
value: 42.297000000000004
- type: mrr_at_100
value: 43.282
- type: mrr_at_1000
value: 43.332
- type: mrr_at_20
value: 42.95
- type: mrr_at_3
value: 39.707
- type: mrr_at_5
value: 41.162
- type: ndcg_at_1
value: 32.991
- type: ndcg_at_10
value: 43.004999999999995
- type: ndcg_at_100
value: 49.053000000000004
- type: ndcg_at_1000
value: 51.166999999999994
- type: ndcg_at_20
value: 45.785
- type: ndcg_at_3
value: 37.589
- type: ndcg_at_5
value: 40.007999999999996
- type: precision_at_1
value: 32.991
- type: precision_at_10
value: 8.025
- type: precision_at_100
value: 1.268
- type: precision_at_1000
value: 0.163
- type: precision_at_20
value: 4.846
- type: precision_at_3
value: 17.922
- type: precision_at_5
value: 13.059000000000001
- type: recall_at_1
value: 26.741999999999997
- type: recall_at_10
value: 55.635999999999996
- type: recall_at_100
value: 80.798
- type: recall_at_1000
value: 94.918
- type: recall_at_20
value: 65.577
- type: recall_at_3
value: 40.658
- type: recall_at_5
value: 46.812
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 27.274583333333336
- type: map_at_10
value: 37.04091666666666
- type: map_at_100
value: 38.27966666666667
- type: map_at_1000
value: 38.39383333333334
- type: map_at_20
value: 37.721500000000006
- type: map_at_3
value: 33.937999999999995
- type: map_at_5
value: 35.67974999999999
- type: mrr_at_1
value: 32.40525
- type: mrr_at_10
value: 41.43925000000001
- type: mrr_at_100
value: 42.271
- type: mrr_at_1000
value: 42.32416666666667
- type: mrr_at_20
value: 41.92733333333334
- type: mrr_at_3
value: 38.84941666666666
- type: mrr_at_5
value: 40.379583333333336
- type: ndcg_at_1
value: 32.40525
- type: ndcg_at_10
value: 42.73808333333334
- type: ndcg_at_100
value: 47.88941666666667
- type: ndcg_at_1000
value: 50.05008333333334
- type: ndcg_at_20
value: 44.74183333333334
- type: ndcg_at_3
value: 37.51908333333334
- type: ndcg_at_5
value: 40.01883333333333
- type: precision_at_1
value: 32.40525
- type: precision_at_10
value: 7.5361666666666665
- type: precision_at_100
value: 1.1934166666666666
- type: precision_at_1000
value: 0.1575
- type: precision_at_20
value: 4.429166666666667
- type: precision_at_3
value: 17.24941666666667
- type: precision_at_5
value: 12.362333333333336
- type: recall_at_1
value: 27.274583333333336
- type: recall_at_10
value: 55.21358333333334
- type: recall_at_100
value: 77.60366666666667
- type: recall_at_1000
value: 92.43691666666666
- type: recall_at_20
value: 62.474583333333335
- type: recall_at_3
value: 40.79375
- type: recall_at_5
value: 47.15158333333334
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 27.389999999999997
- type: map_at_10
value: 34.107
- type: map_at_100
value: 35.022999999999996
- type: map_at_1000
value: 35.13
- type: map_at_20
value: 34.605999999999995
- type: map_at_3
value: 32.021
- type: map_at_5
value: 32.948
- type: mrr_at_1
value: 30.982
- type: mrr_at_10
value: 37.345
- type: mrr_at_100
value: 38.096999999999994
- type: mrr_at_1000
value: 38.179
- type: mrr_at_20
value: 37.769000000000005
- type: mrr_at_3
value: 35.481
- type: mrr_at_5
value: 36.293
- type: ndcg_at_1
value: 30.982
- type: ndcg_at_10
value: 38.223
- type: ndcg_at_100
value: 42.686
- type: ndcg_at_1000
value: 45.352
- type: ndcg_at_20
value: 39.889
- type: ndcg_at_3
value: 34.259
- type: ndcg_at_5
value: 35.664
- type: precision_at_1
value: 30.982
- type: precision_at_10
value: 5.7669999999999995
- type: precision_at_100
value: 0.877
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_20
value: 3.3360000000000003
- type: precision_at_3
value: 14.264
- type: precision_at_5
value: 9.54
- type: recall_at_1
value: 27.389999999999997
- type: recall_at_10
value: 48.009
- type: recall_at_100
value: 68.244
- type: recall_at_1000
value: 87.943
- type: recall_at_20
value: 54.064
- type: recall_at_3
value: 36.813
- type: recall_at_5
value: 40.321
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 18.249000000000002
- type: map_at_10
value: 25.907000000000004
- type: map_at_100
value: 27.105
- type: map_at_1000
value: 27.233
- type: map_at_20
value: 26.541999999999998
- type: map_at_3
value: 23.376
- type: map_at_5
value: 24.673000000000002
- type: mrr_at_1
value: 21.989
- type: mrr_at_10
value: 29.846
- type: mrr_at_100
value: 30.808999999999997
- type: mrr_at_1000
value: 30.885
- type: mrr_at_20
value: 30.384
- type: mrr_at_3
value: 27.46
- type: mrr_at_5
value: 28.758
- type: ndcg_at_1
value: 21.989
- type: ndcg_at_10
value: 30.874000000000002
- type: ndcg_at_100
value: 36.504999999999995
- type: ndcg_at_1000
value: 39.314
- type: ndcg_at_20
value: 32.952999999999996
- type: ndcg_at_3
value: 26.249
- type: ndcg_at_5
value: 28.229
- type: precision_at_1
value: 21.989
- type: precision_at_10
value: 5.705
- type: precision_at_100
value: 0.9990000000000001
- type: precision_at_1000
value: 0.14100000000000001
- type: precision_at_20
value: 3.4459999999999997
- type: precision_at_3
value: 12.377
- type: precision_at_5
value: 8.961
- type: recall_at_1
value: 18.249000000000002
- type: recall_at_10
value: 41.824
- type: recall_at_100
value: 67.071
- type: recall_at_1000
value: 86.863
- type: recall_at_20
value: 49.573
- type: recall_at_3
value: 28.92
- type: recall_at_5
value: 34.003
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 26.602999999999998
- type: map_at_10
value: 36.818
- type: map_at_100
value: 37.894
- type: map_at_1000
value: 37.991
- type: map_at_20
value: 37.389
- type: map_at_3
value: 33.615
- type: map_at_5
value: 35.432
- type: mrr_at_1
value: 31.53
- type: mrr_at_10
value: 41.144
- type: mrr_at_100
value: 41.937999999999995
- type: mrr_at_1000
value: 41.993
- type: mrr_at_20
value: 41.585
- type: mrr_at_3
value: 38.385999999999996
- type: mrr_at_5
value: 39.995000000000005
- type: ndcg_at_1
value: 31.53
- type: ndcg_at_10
value: 42.792
- type: ndcg_at_100
value: 47.749
- type: ndcg_at_1000
value: 49.946
- type: ndcg_at_20
value: 44.59
- type: ndcg_at_3
value: 37.025000000000006
- type: ndcg_at_5
value: 39.811
- type: precision_at_1
value: 31.53
- type: precision_at_10
value: 7.2669999999999995
- type: precision_at_100
value: 1.109
- type: precision_at_1000
value: 0.14100000000000001
- type: precision_at_20
value: 4.184
- type: precision_at_3
value: 16.791
- type: precision_at_5
value: 12.09
- type: recall_at_1
value: 26.602999999999998
- type: recall_at_10
value: 56.730999999999995
- type: recall_at_100
value: 78.119
- type: recall_at_1000
value: 93.458
- type: recall_at_20
value: 63.00599999999999
- type: recall_at_3
value: 41.306
- type: recall_at_5
value: 48.004999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 23.988
- type: map_at_10
value: 33.650999999999996
- type: map_at_100
value: 35.263
- type: map_at_1000
value: 35.481
- type: map_at_20
value: 34.463
- type: map_at_3
value: 30.330000000000002
- type: map_at_5
value: 32.056000000000004
- type: mrr_at_1
value: 29.644
- type: mrr_at_10
value: 38.987
- type: mrr_at_100
value: 39.973
- type: mrr_at_1000
value: 40.013
- type: mrr_at_20
value: 39.553
- type: mrr_at_3
value: 36.001
- type: mrr_at_5
value: 37.869
- type: ndcg_at_1
value: 29.644
- type: ndcg_at_10
value: 40.156
- type: ndcg_at_100
value: 46.244
- type: ndcg_at_1000
value: 48.483
- type: ndcg_at_20
value: 42.311
- type: ndcg_at_3
value: 34.492
- type: ndcg_at_5
value: 37.118
- type: precision_at_1
value: 29.644
- type: precision_at_10
value: 7.925
- type: precision_at_100
value: 1.5890000000000002
- type: precision_at_1000
value: 0.245
- type: precision_at_20
value: 4.97
- type: precision_at_3
value: 16.469
- type: precision_at_5
value: 12.174
- type: recall_at_1
value: 23.988
- type: recall_at_10
value: 52.844
- type: recall_at_100
value: 80.143
- type: recall_at_1000
value: 93.884
- type: recall_at_20
value: 61.050000000000004
- type: recall_at_3
value: 36.720000000000006
- type: recall_at_5
value: 43.614999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 21.947
- type: map_at_10
value: 29.902
- type: map_at_100
value: 30.916
- type: map_at_1000
value: 31.016
- type: map_at_20
value: 30.497999999999998
- type: map_at_3
value: 27.044
- type: map_at_5
value: 28.786
- type: mrr_at_1
value: 23.845
- type: mrr_at_10
value: 32.073
- type: mrr_at_100
value: 32.940999999999995
- type: mrr_at_1000
value: 33.015
- type: mrr_at_20
value: 32.603
- type: mrr_at_3
value: 29.205
- type: mrr_at_5
value: 31.044
- type: ndcg_at_1
value: 23.845
- type: ndcg_at_10
value: 34.79
- type: ndcg_at_100
value: 39.573
- type: ndcg_at_1000
value: 42.163000000000004
- type: ndcg_at_20
value: 36.778
- type: ndcg_at_3
value: 29.326
- type: ndcg_at_5
value: 32.289
- type: precision_at_1
value: 23.845
- type: precision_at_10
value: 5.527
- type: precision_at_100
value: 0.847
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_20
value: 3.2439999999999998
- type: precision_at_3
value: 12.384
- type: precision_at_5
value: 9.205
- type: recall_at_1
value: 21.947
- type: recall_at_10
value: 47.713
- type: recall_at_100
value: 69.299
- type: recall_at_1000
value: 88.593
- type: recall_at_20
value: 55.032000000000004
- type: recall_at_3
value: 33.518
- type: recall_at_5
value: 40.427
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 13.655999999999999
- type: map_at_10
value: 23.954
- type: map_at_100
value: 26.07
- type: map_at_1000
value: 26.266000000000002
- type: map_at_20
value: 25.113000000000003
- type: map_at_3
value: 19.85
- type: map_at_5
value: 21.792
- type: mrr_at_1
value: 31.075000000000003
- type: mrr_at_10
value: 43.480000000000004
- type: mrr_at_100
value: 44.39
- type: mrr_at_1000
value: 44.42
- type: mrr_at_20
value: 44.06
- type: mrr_at_3
value: 40.38
- type: mrr_at_5
value: 42.138999999999996
- type: ndcg_at_1
value: 31.075000000000003
- type: ndcg_at_10
value: 33.129999999999995
- type: ndcg_at_100
value: 40.794000000000004
- type: ndcg_at_1000
value: 44.062
- type: ndcg_at_20
value: 36.223
- type: ndcg_at_3
value: 27.224999999999998
- type: ndcg_at_5
value: 28.969
- type: precision_at_1
value: 31.075000000000003
- type: precision_at_10
value: 10.476
- type: precision_at_100
value: 1.864
- type: precision_at_1000
value: 0.247
- type: precision_at_20
value: 6.593
- type: precision_at_3
value: 20.456
- type: precision_at_5
value: 15.440000000000001
- type: recall_at_1
value: 13.655999999999999
- type: recall_at_10
value: 39.678000000000004
- type: recall_at_100
value: 65.523
- type: recall_at_1000
value: 83.59100000000001
- type: recall_at_20
value: 48.27
- type: recall_at_3
value: 24.863
- type: recall_at_5
value: 30.453999999999997
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.139
- type: map_at_10
value: 20.366999999999997
- type: map_at_100
value: 29.755
- type: map_at_1000
value: 31.563999999999997
- type: map_at_20
value: 24.021
- type: map_at_3
value: 14.395
- type: map_at_5
value: 16.853
- type: mrr_at_1
value: 69.0
- type: mrr_at_10
value: 76.778
- type: mrr_at_100
value: 77.116
- type: mrr_at_1000
value: 77.12299999999999
- type: mrr_at_20
value: 77.046
- type: mrr_at_3
value: 75.208
- type: mrr_at_5
value: 76.146
- type: ndcg_at_1
value: 57.125
- type: ndcg_at_10
value: 42.84
- type: ndcg_at_100
value: 48.686
- type: ndcg_at_1000
value: 56.294
- type: ndcg_at_20
value: 42.717
- type: ndcg_at_3
value: 46.842
- type: ndcg_at_5
value: 44.248
- type: precision_at_1
value: 69.0
- type: precision_at_10
value: 34.625
- type: precision_at_100
value: 11.468
- type: precision_at_1000
value: 2.17
- type: precision_at_20
value: 26.562
- type: precision_at_3
value: 50.917
- type: precision_at_5
value: 43.35
- type: recall_at_1
value: 9.139
- type: recall_at_10
value: 26.247999999999998
- type: recall_at_100
value: 56.647000000000006
- type: recall_at_1000
value: 80.784
- type: recall_at_20
value: 35.010999999999996
- type: recall_at_3
value: 15.57
- type: recall_at_5
value: 19.198
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 55.93
- type: f1
value: 49.35314406745291
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 73.198
- type: map_at_10
value: 81.736
- type: map_at_100
value: 82.02000000000001
- type: map_at_1000
value: 82.03399999999999
- type: map_at_20
value: 81.937
- type: map_at_3
value: 80.692
- type: map_at_5
value: 81.369
- type: mrr_at_1
value: 78.803
- type: mrr_at_10
value: 86.144
- type: mrr_at_100
value: 86.263
- type: mrr_at_1000
value: 86.26599999999999
- type: mrr_at_20
value: 86.235
- type: mrr_at_3
value: 85.464
- type: mrr_at_5
value: 85.95
- type: ndcg_at_1
value: 78.803
- type: ndcg_at_10
value: 85.442
- type: ndcg_at_100
value: 86.422
- type: ndcg_at_1000
value: 86.68900000000001
- type: ndcg_at_20
value: 85.996
- type: ndcg_at_3
value: 83.839
- type: ndcg_at_5
value: 84.768
- type: precision_at_1
value: 78.803
- type: precision_at_10
value: 10.261000000000001
- type: precision_at_100
value: 1.0959999999999999
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_20
value: 5.286
- type: precision_at_3
value: 32.083
- type: precision_at_5
value: 19.898
- type: recall_at_1
value: 73.198
- type: recall_at_10
value: 92.42099999999999
- type: recall_at_100
value: 96.28
- type: recall_at_1000
value: 97.995
- type: recall_at_20
value: 94.36
- type: recall_at_3
value: 88.042
- type: recall_at_5
value: 90.429
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 21.583
- type: map_at_10
value: 36.503
- type: map_at_100
value: 38.529
- type: map_at_1000
value: 38.701
- type: map_at_20
value: 37.69
- type: map_at_3
value: 31.807000000000002
- type: map_at_5
value: 34.424
- type: mrr_at_1
value: 43.827
- type: mrr_at_10
value: 53.528
- type: mrr_at_100
value: 54.291
- type: mrr_at_1000
value: 54.32599999999999
- type: mrr_at_20
value: 54.064
- type: mrr_at_3
value: 51.25999999999999
- type: mrr_at_5
value: 52.641000000000005
- type: ndcg_at_1
value: 43.827
- type: ndcg_at_10
value: 44.931
- type: ndcg_at_100
value: 51.778999999999996
- type: ndcg_at_1000
value: 54.532000000000004
- type: ndcg_at_20
value: 47.899
- type: ndcg_at_3
value: 41.062
- type: ndcg_at_5
value: 42.33
- type: precision_at_1
value: 43.827
- type: precision_at_10
value: 12.608
- type: precision_at_100
value: 1.974
- type: precision_at_1000
value: 0.247
- type: precision_at_20
value: 7.585
- type: precision_at_3
value: 27.778000000000002
- type: precision_at_5
value: 20.308999999999997
- type: recall_at_1
value: 21.583
- type: recall_at_10
value: 52.332
- type: recall_at_100
value: 77.256
- type: recall_at_1000
value: 93.613
- type: recall_at_20
value: 61.413
- type: recall_at_3
value: 37.477
- type: recall_at_5
value: 44.184
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 39.845000000000006
- type: map_at_10
value: 64.331
- type: map_at_100
value: 65.202
- type: map_at_1000
value: 65.261
- type: map_at_20
value: 64.833
- type: map_at_3
value: 60.663
- type: map_at_5
value: 62.94
- type: mrr_at_1
value: 79.689
- type: mrr_at_10
value: 85.299
- type: mrr_at_100
value: 85.461
- type: mrr_at_1000
value: 85.466
- type: mrr_at_20
value: 85.39099999999999
- type: mrr_at_3
value: 84.396
- type: mrr_at_5
value: 84.974
- type: ndcg_at_1
value: 79.689
- type: ndcg_at_10
value: 72.49
- type: ndcg_at_100
value: 75.485
- type: ndcg_at_1000
value: 76.563
- type: ndcg_at_20
value: 73.707
- type: ndcg_at_3
value: 67.381
- type: ndcg_at_5
value: 70.207
- type: precision_at_1
value: 79.689
- type: precision_at_10
value: 15.267
- type: precision_at_100
value: 1.7610000000000001
- type: precision_at_1000
value: 0.19
- type: precision_at_20
value: 8.024000000000001
- type: precision_at_3
value: 43.363
- type: precision_at_5
value: 28.248
- type: recall_at_1
value: 39.845000000000006
- type: recall_at_10
value: 76.334
- type: recall_at_100
value: 88.042
- type: recall_at_1000
value: 95.09100000000001
- type: recall_at_20
value: 80.243
- type: recall_at_3
value: 65.044
- type: recall_at_5
value: 70.621
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 93.57079999999999
- type: ap
value: 90.50045924786099
- type: f1
value: 93.56673497845476
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 22.212
- type: map_at_10
value: 34.528
- type: map_at_100
value: 35.69
- type: map_at_1000
value: 35.74
- type: map_at_20
value: 35.251
- type: map_at_3
value: 30.628
- type: map_at_5
value: 32.903999999999996
- type: mrr_at_1
value: 22.794
- type: mrr_at_10
value: 35.160000000000004
- type: mrr_at_100
value: 36.251
- type: mrr_at_1000
value: 36.295
- type: mrr_at_20
value: 35.845
- type: mrr_at_3
value: 31.328
- type: mrr_at_5
value: 33.574
- type: ndcg_at_1
value: 22.779
- type: ndcg_at_10
value: 41.461
- type: ndcg_at_100
value: 47.049
- type: ndcg_at_1000
value: 48.254000000000005
- type: ndcg_at_20
value: 44.031
- type: ndcg_at_3
value: 33.561
- type: ndcg_at_5
value: 37.62
- type: precision_at_1
value: 22.779
- type: precision_at_10
value: 6.552
- type: precision_at_100
value: 0.936
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 3.8120000000000003
- type: precision_at_3
value: 14.274000000000001
- type: precision_at_5
value: 10.622
- type: recall_at_1
value: 22.212
- type: recall_at_10
value: 62.732
- type: recall_at_100
value: 88.567
- type: recall_at_1000
value: 97.727
- type: recall_at_20
value: 72.733
- type: recall_at_3
value: 41.367
- type: recall_at_5
value: 51.105999999999995
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.24988600091199
- type: f1
value: 94.06064583085202
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (de)
type: mteb/mtop_domain
config: de
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 74.86052409129333
- type: f1
value: 72.24661442078647
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (es)
type: mteb/mtop_domain
config: es
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 77.09139426284189
- type: f1
value: 76.3725044443502
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (fr)
type: mteb/mtop_domain
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 79.79956154087064
- type: f1
value: 78.41859658401724
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (hi)
type: mteb/mtop_domain
config: hi
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 32.785944783076374
- type: f1
value: 31.182237278594922
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (th)
type: mteb/mtop_domain
config: th
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 16.654611211573236
- type: f1
value: 12.088413093236642
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 67.51481988144094
- type: f1
value: 49.561420234732125
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (de)
type: mteb/mtop_intent
config: de
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 42.36122851507467
- type: f1
value: 25.445030887504398
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (es)
type: mteb/mtop_intent
config: es
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 44.73315543695797
- type: f1
value: 28.42075153540265
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (fr)
type: mteb/mtop_intent
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 38.96022549326651
- type: f1
value: 25.926979537146106
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (hi)
type: mteb/mtop_intent
config: hi
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 13.578343492291141
- type: f1
value: 8.929295550931657
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (th)
type: mteb/mtop_intent
config: th
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 5.396021699819168
- type: f1
value: 1.8587148785378742
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (af)
type: mteb/amazon_massive_intent
config: af
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 37.22259583053128
- type: f1
value: 34.63013680947778
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (am)
type: mteb/amazon_massive_intent
config: am
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 3.194351042367182
- type: f1
value: 1.2612010214639442
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ar)
type: mteb/amazon_massive_intent
config: ar
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 14.26361802286483
- type: f1
value: 13.70260406613821
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (az)
type: mteb/amazon_massive_intent
config: az
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 37.21923335574983
- type: f1
value: 36.33553913878251
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (bn)
type: mteb/amazon_massive_intent
config: bn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 10.756556825823807
- type: f1
value: 9.676431920229374
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (cy)
type: mteb/amazon_massive_intent
config: cy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 32.49831876260928
- type: f1
value: 30.818895782691868
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (da)
type: mteb/amazon_massive_intent
config: da
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 40.995292535305985
- type: f1
value: 37.68768183180129
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (de)
type: mteb/amazon_massive_intent
config: de
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 42.780766644250164
- type: f1
value: 37.82194830667135
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (el)
type: mteb/amazon_massive_intent
config: el
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 33.490248823133825
- type: f1
value: 29.71809045584527
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.8836583725622
- type: f1
value: 72.16381047416814
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (es)
type: mteb/amazon_massive_intent
config: es
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 44.45191661062542
- type: f1
value: 43.46583297093683
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fa)
type: mteb/amazon_massive_intent
config: fa
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 26.738399462004036
- type: f1
value: 24.11896530001951
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fi)
type: mteb/amazon_massive_intent
config: fi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 38.09683927370545
- type: f1
value: 35.34443269387154
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fr)
type: mteb/amazon_massive_intent
config: fr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 46.89307330195024
- type: f1
value: 43.47164092514292
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (he)
type: mteb/amazon_massive_intent
config: he
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 25.198386012104912
- type: f1
value: 22.446286736401916
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hi)
type: mteb/amazon_massive_intent
config: hi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 13.940820443846672
- type: f1
value: 13.257747189396213
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hu)
type: mteb/amazon_massive_intent
config: hu
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 34.710827168796236
- type: f1
value: 32.036974696095996
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (hy)
type: mteb/amazon_massive_intent
config: hy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 6.711499663752522
- type: f1
value: 5.439441019096591
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (id)
type: mteb/amazon_massive_intent
config: id
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 38.56758574310693
- type: f1
value: 36.83183505458304
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (is)
type: mteb/amazon_massive_intent
config: is
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 32.22595830531271
- type: f1
value: 30.10972675771159
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (it)
type: mteb/amazon_massive_intent
config: it
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 45.79690652320107
- type: f1
value: 44.37143784350453
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ja)
type: mteb/amazon_massive_intent
config: ja
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 29.189643577673163
- type: f1
value: 25.43718135312703
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (jv)
type: mteb/amazon_massive_intent
config: jv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 34.21990585070612
- type: f1
value: 32.333592263041396
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ka)
type: mteb/amazon_massive_intent
config: ka
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 8.890383322125084
- type: f1
value: 7.294310113130201
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (km)
type: mteb/amazon_massive_intent
config: km
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 4.616677874915938
- type: f1
value: 1.5028537477535886
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (kn)
type: mteb/amazon_massive_intent
config: kn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 3.170813718897109
- type: f1
value: 1.5771411815826382
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ko)
type: mteb/amazon_massive_intent
config: ko
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 15.026899798251513
- type: f1
value: 14.077395255366183
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (lv)
type: mteb/amazon_massive_intent
config: lv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 36.0995292535306
- type: f1
value: 35.0877269083235
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ml)
type: mteb/amazon_massive_intent
config: ml
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 2.9959650302622727
- type: f1
value: 0.8064424547273695
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (mn)
type: mteb/amazon_massive_intent
config: mn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 23.301950235373234
- type: f1
value: 22.477376205075853
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ms)
type: mteb/amazon_massive_intent
config: ms
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 36.13315400134499
- type: f1
value: 32.99623898888715
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (my)
type: mteb/amazon_massive_intent
config: my
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 3.813046402151983
- type: f1
value: 1.1769597223141248
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (nb)
type: mteb/amazon_massive_intent
config: nb
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 39.66711499663752
- type: f1
value: 35.921474753569214
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (nl)
type: mteb/amazon_massive_intent
config: nl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 41.079354404841965
- type: f1
value: 37.57739961852201
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pl)
type: mteb/amazon_massive_intent
config: pl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 38.211163416274374
- type: f1
value: 34.89419275422068
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pt)
type: mteb/amazon_massive_intent
config: pt
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 45.19838601210491
- type: f1
value: 42.71660225307043
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ro)
type: mteb/amazon_massive_intent
config: ro
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 39.48554135843981
- type: f1
value: 37.47402102847154
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ru)
type: mteb/amazon_massive_intent
config: ru
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 31.819098856758576
- type: f1
value: 30.120158288509725
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sl)
type: mteb/amazon_massive_intent
config: sl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 35.44720914593141
- type: f1
value: 33.74530063536304
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sq)
type: mteb/amazon_massive_intent
config: sq
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 36.89307330195024
- type: f1
value: 34.46971619696105
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sv)
type: mteb/amazon_massive_intent
config: sv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 38.83322125084062
- type: f1
value: 36.050770344888264
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (sw)
type: mteb/amazon_massive_intent
config: sw
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 37.535305985205106
- type: f1
value: 35.21395700670493
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ta)
type: mteb/amazon_massive_intent
config: ta
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 7.905178211163418
- type: f1
value: 6.163513326325246
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (te)
type: mteb/amazon_massive_intent
config: te
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 2.8480161398789514
- type: f1
value: 1.0163931337986962
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (th)
type: mteb/amazon_massive_intent
config: th
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 10.501008742434433
- type: f1
value: 6.858549418430471
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (tl)
type: mteb/amazon_massive_intent
config: tl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 39.46536650975118
- type: f1
value: 34.96292597328575
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (tr)
type: mteb/amazon_massive_intent
config: tr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 37.50168123739071
- type: f1
value: 35.031097269820464
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ur)
type: mteb/amazon_massive_intent
config: ur
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 16.109616677874918
- type: f1
value: 15.884609726192519
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (vi)
type: mteb/amazon_massive_intent
config: vi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 36.11297915265636
- type: f1
value: 34.59918716321474
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 18.850033624747812
- type: f1
value: 15.09584388649328
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-TW)
type: mteb/amazon_massive_intent
config: zh-TW
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 17.219233355749832
- type: f1
value: 14.538046039008337
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (af)
type: mteb/amazon_massive_scenario
config: af
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 47.79757901815736
- type: f1
value: 45.078250421193324
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (am)
type: mteb/amazon_massive_scenario
config: am
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 7.078009414929388
- type: f1
value: 4.0122456300041645
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ar)
type: mteb/amazon_massive_scenario
config: ar
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 22.831203765971754
- type: f1
value: 20.131610050816555
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (az)
type: mteb/amazon_massive_scenario
config: az
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 44.952925353059854
- type: f1
value: 42.6865575762921
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (bn)
type: mteb/amazon_massive_scenario
config: bn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 16.593813046402154
- type: f1
value: 14.087144503044291
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (cy)
type: mteb/amazon_massive_scenario
config: cy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 37.91862811028917
- type: f1
value: 34.968402727911915
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (da)
type: mteb/amazon_massive_scenario
config: da
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 51.923335574983184
- type: f1
value: 49.357147840776335
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (de)
type: mteb/amazon_massive_scenario
config: de
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 58.73570948217889
- type: f1
value: 54.92084137819753
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (el)
type: mteb/amazon_massive_scenario
config: el
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 42.995965030262276
- type: f1
value: 38.47512542753069
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.42098184263618
- type: f1
value: 77.03413816048877
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (es)
type: mteb/amazon_massive_scenario
config: es
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 54.46536650975118
- type: f1
value: 53.08520810835907
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fa)
type: mteb/amazon_massive_scenario
config: fa
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 30.578345662407525
- type: f1
value: 28.822998245702635
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fi)
type: mteb/amazon_massive_scenario
config: fi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 43.567585743106925
- type: f1
value: 39.79216651714347
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fr)
type: mteb/amazon_massive_scenario
config: fr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 56.98722259583053
- type: f1
value: 55.31168113501439
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (he)
type: mteb/amazon_massive_scenario
config: he
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 28.076664425016812
- type: f1
value: 24.927348965627573
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hi)
type: mteb/amazon_massive_scenario
config: hi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 18.096839273705445
- type: f1
value: 17.386603595777103
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hu)
type: mteb/amazon_massive_scenario
config: hu
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 41.73839946200403
- type: f1
value: 38.65545902563735
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (hy)
type: mteb/amazon_massive_scenario
config: hy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 11.536650975117688
- type: f1
value: 10.898336694524854
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (id)
type: mteb/amazon_massive_scenario
config: id
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 46.9502353732347
- type: f1
value: 44.332561323528644
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (is)
type: mteb/amazon_massive_scenario
config: is
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 42.777404169468724
- type: f1
value: 39.378117766055354
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (it)
type: mteb/amazon_massive_scenario
config: it
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 54.6469401479489
- type: f1
value: 52.512025274851794
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ja)
type: mteb/amazon_massive_scenario
config: ja
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 35.90114324142569
- type: f1
value: 34.90331274712605
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (jv)
type: mteb/amazon_massive_scenario
config: jv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 42.51176866173504
- type: f1
value: 39.417541845685676
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ka)
type: mteb/amazon_massive_scenario
config: ka
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 13.799596503026226
- type: f1
value: 11.587556164962251
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (km)
type: mteb/amazon_massive_scenario
config: km
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 9.44855413584398
- type: f1
value: 4.30711077076907
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (kn)
type: mteb/amazon_massive_scenario
config: kn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 8.157363819771351
- type: f1
value: 5.5588908736809515
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ko)
type: mteb/amazon_massive_scenario
config: ko
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 19.909213180901144
- type: f1
value: 18.964761241087984
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (lv)
type: mteb/amazon_massive_scenario
config: lv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 40.47747141896436
- type: f1
value: 38.17159556642586
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ml)
type: mteb/amazon_massive_scenario
config: ml
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 6.701412239408204
- type: f1
value: 3.621974155647488
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (mn)
type: mteb/amazon_massive_scenario
config: mn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 28.55413584398117
- type: f1
value: 26.582548923662753
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ms)
type: mteb/amazon_massive_scenario
config: ms
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 46.617350369872234
- type: f1
value: 41.35397419267425
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (my)
type: mteb/amazon_massive_scenario
config: my
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 9.976462676529927
- type: f1
value: 5.900764382768462
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (nb)
type: mteb/amazon_massive_scenario
config: nb
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 50.894418291862806
- type: f1
value: 47.70929403771086
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (nl)
type: mteb/amazon_massive_scenario
config: nl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 51.761936785474106
- type: f1
value: 48.42797973062516
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pl)
type: mteb/amazon_massive_scenario
config: pl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 46.21385339609952
- type: f1
value: 43.7081546200347
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pt)
type: mteb/amazon_massive_scenario
config: pt
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 55.59852051109617
- type: f1
value: 54.19610878409633
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ro)
type: mteb/amazon_massive_scenario
config: ro
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 50.54135843981169
- type: f1
value: 47.79393938467311
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ru)
type: mteb/amazon_massive_scenario
config: ru
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 37.73032952252858
- type: f1
value: 35.96450149708041
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sl)
type: mteb/amazon_massive_scenario
config: sl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 41.67114996637525
- type: f1
value: 40.28283538885605
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sq)
type: mteb/amazon_massive_scenario
config: sq
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 47.38063214525891
- type: f1
value: 44.93264016007152
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sv)
type: mteb/amazon_massive_scenario
config: sv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 49.28379287155347
- type: f1
value: 46.25486396570196
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (sw)
type: mteb/amazon_massive_scenario
config: sw
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 44.18291862811029
- type: f1
value: 41.17519157172804
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ta)
type: mteb/amazon_massive_scenario
config: ta
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 12.599193006052452
- type: f1
value: 11.129236666238377
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (te)
type: mteb/amazon_massive_scenario
config: te
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 7.017484868863484
- type: f1
value: 3.9665415549749077
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (th)
type: mteb/amazon_massive_scenario
config: th
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 19.788164088769335
- type: f1
value: 15.783384761347582
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (tl)
type: mteb/amazon_massive_scenario
config: tl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 50.35978480161398
- type: f1
value: 47.30586047800275
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (tr)
type: mteb/amazon_massive_scenario
config: tr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 45.484196368527236
- type: f1
value: 44.65101184252231
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ur)
type: mteb/amazon_massive_scenario
config: ur
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 23.681909885675857
- type: f1
value: 22.247817138937524
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (vi)
type: mteb/amazon_massive_scenario
config: vi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 41.63080026899798
- type: f1
value: 39.546896741744
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 30.141223940820446
- type: f1
value: 28.177838960078123
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-TW)
type: mteb/amazon_massive_scenario
config: zh-TW
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 27.515131136516473
- type: f1
value: 26.514325837594654
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.70592767911301
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.80943770643908
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.66434973425713
- type: mrr
value: 33.92240574935323
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 6.561999999999999
- type: map_at_10
value: 14.854000000000001
- type: map_at_100
value: 19.187
- type: map_at_1000
value: 20.812
- type: map_at_20
value: 16.744
- type: map_at_3
value: 10.804
- type: map_at_5
value: 12.555
- type: mrr_at_1
value: 48.916
- type: mrr_at_10
value: 57.644
- type: mrr_at_100
value: 58.17
- type: mrr_at_1000
value: 58.206
- type: mrr_at_20
value: 57.969
- type: mrr_at_3
value: 55.36600000000001
- type: mrr_at_5
value: 56.729
- type: ndcg_at_1
value: 46.594
- type: ndcg_at_10
value: 37.897999999999996
- type: ndcg_at_100
value: 35.711
- type: ndcg_at_1000
value: 44.65
- type: ndcg_at_20
value: 35.989
- type: ndcg_at_3
value: 42.869
- type: ndcg_at_5
value: 40.373
- type: precision_at_1
value: 48.297000000000004
- type: precision_at_10
value: 28.297
- type: precision_at_100
value: 9.099
- type: precision_at_1000
value: 2.229
- type: precision_at_20
value: 21.455
- type: precision_at_3
value: 40.248
- type: precision_at_5
value: 34.675
- type: recall_at_1
value: 6.561999999999999
- type: recall_at_10
value: 19.205
- type: recall_at_100
value: 36.742999999999995
- type: recall_at_1000
value: 69.119
- type: recall_at_20
value: 23.787
- type: recall_at_3
value: 11.918
- type: recall_at_5
value: 14.860000000000001
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 30.306
- type: map_at_10
value: 46.916999999999994
- type: map_at_100
value: 47.899
- type: map_at_1000
value: 47.925000000000004
- type: map_at_20
value: 47.583
- type: map_at_3
value: 42.235
- type: map_at_5
value: 45.118
- type: mrr_at_1
value: 34.327999999999996
- type: mrr_at_10
value: 49.248999999999995
- type: mrr_at_100
value: 49.96
- type: mrr_at_1000
value: 49.977
- type: mrr_at_20
value: 49.738
- type: mrr_at_3
value: 45.403999999999996
- type: mrr_at_5
value: 47.786
- type: ndcg_at_1
value: 34.327999999999996
- type: ndcg_at_10
value: 55.123999999999995
- type: ndcg_at_100
value: 59.136
- type: ndcg_at_1000
value: 59.71300000000001
- type: ndcg_at_20
value: 57.232000000000006
- type: ndcg_at_3
value: 46.48
- type: ndcg_at_5
value: 51.237
- type: precision_at_1
value: 34.327999999999996
- type: precision_at_10
value: 9.261
- type: precision_at_100
value: 1.1520000000000001
- type: precision_at_1000
value: 0.121
- type: precision_at_20
value: 5.148
- type: precision_at_3
value: 21.523999999999997
- type: precision_at_5
value: 15.659999999999998
- type: recall_at_1
value: 30.306
- type: recall_at_10
value: 77.65100000000001
- type: recall_at_100
value: 94.841
- type: recall_at_1000
value: 99.119
- type: recall_at_20
value: 85.37599999999999
- type: recall_at_3
value: 55.562
- type: recall_at_5
value: 66.5
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.516
- type: map_at_10
value: 85.48400000000001
- type: map_at_100
value: 86.11
- type: map_at_1000
value: 86.124
- type: map_at_20
value: 85.895
- type: map_at_3
value: 82.606
- type: map_at_5
value: 84.395
- type: mrr_at_1
value: 82.38
- type: mrr_at_10
value: 88.31099999999999
- type: mrr_at_100
value: 88.407
- type: mrr_at_1000
value: 88.407
- type: mrr_at_20
value: 88.385
- type: mrr_at_3
value: 87.42699999999999
- type: mrr_at_5
value: 88.034
- type: ndcg_at_1
value: 82.39999999999999
- type: ndcg_at_10
value: 89.07300000000001
- type: ndcg_at_100
value: 90.23400000000001
- type: ndcg_at_1000
value: 90.304
- type: ndcg_at_20
value: 89.714
- type: ndcg_at_3
value: 86.42699999999999
- type: ndcg_at_5
value: 87.856
- type: precision_at_1
value: 82.39999999999999
- type: precision_at_10
value: 13.499
- type: precision_at_100
value: 1.536
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.155
- type: precision_at_3
value: 37.846999999999994
- type: precision_at_5
value: 24.778
- type: recall_at_1
value: 71.516
- type: recall_at_10
value: 95.831
- type: recall_at_100
value: 99.714
- type: recall_at_1000
value: 99.979
- type: recall_at_20
value: 97.87599999999999
- type: recall_at_3
value: 88.08
- type: recall_at_5
value: 92.285
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 61.3760407207699
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 65.28621066626943
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 5.163
- type: map_at_10
value: 14.377
- type: map_at_100
value: 17.177
- type: map_at_1000
value: 17.588
- type: map_at_20
value: 15.827
- type: map_at_3
value: 9.879
- type: map_at_5
value: 12.133
- type: mrr_at_1
value: 25.5
- type: mrr_at_10
value: 38.435
- type: mrr_at_100
value: 39.573
- type: mrr_at_1000
value: 39.606
- type: mrr_at_20
value: 39.134
- type: mrr_at_3
value: 34.666999999999994
- type: mrr_at_5
value: 37.117
- type: ndcg_at_1
value: 25.5
- type: ndcg_at_10
value: 23.688000000000002
- type: ndcg_at_100
value: 33.849000000000004
- type: ndcg_at_1000
value: 39.879
- type: ndcg_at_20
value: 27.36
- type: ndcg_at_3
value: 22.009999999999998
- type: ndcg_at_5
value: 19.691
- type: precision_at_1
value: 25.5
- type: precision_at_10
value: 12.540000000000001
- type: precision_at_100
value: 2.721
- type: precision_at_1000
value: 0.415
- type: precision_at_20
value: 8.385
- type: precision_at_3
value: 21.099999999999998
- type: precision_at_5
value: 17.84
- type: recall_at_1
value: 5.163
- type: recall_at_10
value: 25.405
- type: recall_at_100
value: 55.213
- type: recall_at_1000
value: 84.243
- type: recall_at_20
value: 34.003
- type: recall_at_3
value: 12.837000000000002
- type: recall_at_5
value: 18.096999999999998
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 87.64406884822948
- type: cos_sim_spearman
value: 83.00239648251724
- type: euclidean_pearson
value: 85.03347205351844
- type: euclidean_spearman
value: 83.00240733538445
- type: manhattan_pearson
value: 85.0312758694447
- type: manhattan_spearman
value: 82.99430696077589
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.68832340658764
- type: cos_sim_spearman
value: 79.21679373212476
- type: euclidean_pearson
value: 85.17094885886415
- type: euclidean_spearman
value: 79.21421345946399
- type: manhattan_pearson
value: 85.17409319145995
- type: manhattan_spearman
value: 79.20992207976401
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 88.43733084958856
- type: cos_sim_spearman
value: 89.43082089321751
- type: euclidean_pearson
value: 88.63286785416938
- type: euclidean_spearman
value: 89.43082081372343
- type: manhattan_pearson
value: 88.62969346368385
- type: manhattan_spearman
value: 89.43131586189746
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 86.62185532014894
- type: cos_sim_spearman
value: 84.7923120886599
- type: euclidean_pearson
value: 85.99786490539253
- type: euclidean_spearman
value: 84.79231064318844
- type: manhattan_pearson
value: 85.97647892920392
- type: manhattan_spearman
value: 84.76865232132103
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 88.39303997282114
- type: cos_sim_spearman
value: 89.54273264876765
- type: euclidean_pearson
value: 88.8848627924181
- type: euclidean_spearman
value: 89.54275013645078
- type: manhattan_pearson
value: 88.86926987108802
- type: manhattan_spearman
value: 89.53259197721715
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.21814352466886
- type: cos_sim_spearman
value: 86.68505223422434
- type: euclidean_pearson
value: 86.07422446469991
- type: euclidean_spearman
value: 86.68505161067375
- type: manhattan_pearson
value: 86.05114200797293
- type: manhattan_spearman
value: 86.6587670422703
- task:
type: STS
dataset:
name: MTEB STS17 (ko-ko)
type: mteb/sts17-crosslingual-sts
config: ko-ko
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 39.17871768366095
- type: cos_sim_spearman
value: 39.78510424960567
- type: euclidean_pearson
value: 41.65680175653682
- type: euclidean_spearman
value: 39.78538944779548
- type: manhattan_pearson
value: 41.567603690394755
- type: manhattan_spearman
value: 39.71393388259443
- task:
type: STS
dataset:
name: MTEB STS17 (ar-ar)
type: mteb/sts17-crosslingual-sts
config: ar-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 49.26766904195114
- type: cos_sim_spearman
value: 46.79722787057151
- type: euclidean_pearson
value: 51.2329334717446
- type: euclidean_spearman
value: 46.7920623095072
- type: manhattan_pearson
value: 51.26488560860826
- type: manhattan_spearman
value: 47.00400318665492
- task:
type: STS
dataset:
name: MTEB STS17 (en-ar)
type: mteb/sts17-crosslingual-sts
config: en-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 1.6821294132202447
- type: cos_sim_spearman
value: -0.7813676799492025
- type: euclidean_pearson
value: 1.9197388753860283
- type: euclidean_spearman
value: -0.7813676799492025
- type: manhattan_pearson
value: 2.209862430499871
- type: manhattan_spearman
value: -0.863014010062456
- task:
type: STS
dataset:
name: MTEB STS17 (en-de)
type: mteb/sts17-crosslingual-sts
config: en-de
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 48.76382428941107
- type: cos_sim_spearman
value: 47.50280322999196
- type: euclidean_pearson
value: 48.73919143974209
- type: euclidean_spearman
value: 47.50280322999196
- type: manhattan_pearson
value: 48.76291223862666
- type: manhattan_spearman
value: 47.51318193687094
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.6579390263212
- type: cos_sim_spearman
value: 89.64423556388047
- type: euclidean_pearson
value: 90.1160733522703
- type: euclidean_spearman
value: 89.64423556388047
- type: manhattan_pearson
value: 90.1528407376387
- type: manhattan_spearman
value: 89.61290724496793
- task:
type: STS
dataset:
name: MTEB STS17 (en-tr)
type: mteb/sts17-crosslingual-sts
config: en-tr
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 6.717092266815236
- type: cos_sim_spearman
value: 4.180543503488665
- type: euclidean_pearson
value: 7.120267092048099
- type: euclidean_spearman
value: 4.180543503488665
- type: manhattan_pearson
value: 6.396237465828514
- type: manhattan_spearman
value: 3.61244941411957
- task:
type: STS
dataset:
name: MTEB STS17 (es-en)
type: mteb/sts17-crosslingual-sts
config: es-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 44.36476614938953
- type: cos_sim_spearman
value: 44.265723809500685
- type: euclidean_pearson
value: 44.61551298711104
- type: euclidean_spearman
value: 44.265723809500685
- type: manhattan_pearson
value: 44.54302374682193
- type: manhattan_spearman
value: 44.08642490624185
- task:
type: STS
dataset:
name: MTEB STS17 (es-es)
type: mteb/sts17-crosslingual-sts
config: es-es
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 79.64871991975828
- type: cos_sim_spearman
value: 79.21979030014373
- type: euclidean_pearson
value: 81.8672798988218
- type: euclidean_spearman
value: 79.21950130108661
- type: manhattan_pearson
value: 82.02131606326583
- type: manhattan_spearman
value: 79.44848373553044
- task:
type: STS
dataset:
name: MTEB STS17 (fr-en)
type: mteb/sts17-crosslingual-sts
config: fr-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 48.73898658957231
- type: cos_sim_spearman
value: 47.15192605817168
- type: euclidean_pearson
value: 49.11990573381456
- type: euclidean_spearman
value: 47.15192605817168
- type: manhattan_pearson
value: 48.5694400358235
- type: manhattan_spearman
value: 46.651326429708135
- task:
type: STS
dataset:
name: MTEB STS17 (it-en)
type: mteb/sts17-crosslingual-sts
config: it-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 44.42168074232218
- type: cos_sim_spearman
value: 42.64799010889372
- type: euclidean_pearson
value: 44.41376048324183
- type: euclidean_spearman
value: 42.64799010889372
- type: manhattan_pearson
value: 44.724522621427546
- type: manhattan_spearman
value: 42.60912761758016
- task:
type: STS
dataset:
name: MTEB STS17 (nl-en)
type: mteb/sts17-crosslingual-sts
config: nl-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 40.55050173163197
- type: cos_sim_spearman
value: 36.59720399843921
- type: euclidean_pearson
value: 41.49402389245919
- type: euclidean_spearman
value: 36.59720399843921
- type: manhattan_pearson
value: 41.877514420153666
- type: manhattan_spearman
value: 36.782790653297695
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 69.44405106094861
- type: cos_sim_spearman
value: 70.25621893108706
- type: euclidean_pearson
value: 71.15726637696066
- type: euclidean_spearman
value: 70.25621893108706
- type: manhattan_pearson
value: 71.28565265298322
- type: manhattan_spearman
value: 70.30317892414027
- task:
type: STS
dataset:
name: MTEB STS22 (de)
type: mteb/sts22-crosslingual-sts
config: de
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 34.56638014500804
- type: cos_sim_spearman
value: 39.48672765878819
- type: euclidean_pearson
value: 31.61811391543846
- type: euclidean_spearman
value: 39.48672765878819
- type: manhattan_pearson
value: 31.839117286689977
- type: manhattan_spearman
value: 39.71519891403971
- task:
type: STS
dataset:
name: MTEB STS22 (es)
type: mteb/sts22-crosslingual-sts
config: es
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 53.72389957326714
- type: cos_sim_spearman
value: 59.47018781803598
- type: euclidean_pearson
value: 57.02101112722141
- type: euclidean_spearman
value: 59.47018781803598
- type: manhattan_pearson
value: 57.16531255049132
- type: manhattan_spearman
value: 59.57320508684436
- task:
type: STS
dataset:
name: MTEB STS22 (pl)
type: mteb/sts22-crosslingual-sts
config: pl
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 24.14602533311477
- type: cos_sim_spearman
value: 35.38039329704056
- type: euclidean_pearson
value: 13.540543553763765
- type: euclidean_spearman
value: 35.38039329704056
- type: manhattan_pearson
value: 13.566377379303256
- type: manhattan_spearman
value: 35.88351047224126
- task:
type: STS
dataset:
name: MTEB STS22 (tr)
type: mteb/sts22-crosslingual-sts
config: tr
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 39.07697432450346
- type: cos_sim_spearman
value: 45.65479772235109
- type: euclidean_pearson
value: 41.68913259791294
- type: euclidean_spearman
value: 45.65479772235109
- type: manhattan_pearson
value: 41.58872552392231
- type: manhattan_spearman
value: 45.462070534023404
- task:
type: STS
dataset:
name: MTEB STS22 (ar)
type: mteb/sts22-crosslingual-sts
config: ar
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 23.917322166825183
- type: cos_sim_spearman
value: 25.06042767518008
- type: euclidean_pearson
value: 24.29850435278771
- type: euclidean_spearman
value: 25.06042767518008
- type: manhattan_pearson
value: 24.461400062927154
- type: manhattan_spearman
value: 25.285239684773046
- task:
type: STS
dataset:
name: MTEB STS22 (ru)
type: mteb/sts22-crosslingual-sts
config: ru
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 20.39987623162105
- type: cos_sim_spearman
value: 30.62427846964406
- type: euclidean_pearson
value: 20.817950942480323
- type: euclidean_spearman
value: 30.618700916425222
- type: manhattan_pearson
value: 20.756787430880788
- type: manhattan_spearman
value: 30.813116243628436
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 43.838363041373974
- type: cos_sim_spearman
value: 54.17598089882719
- type: euclidean_pearson
value: 47.51044033919419
- type: euclidean_spearman
value: 54.17598089882719
- type: manhattan_pearson
value: 47.54911083403354
- type: manhattan_spearman
value: 54.2562151204606
- task:
type: STS
dataset:
name: MTEB STS22 (fr)
type: mteb/sts22-crosslingual-sts
config: fr
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 77.69372699157654
- type: cos_sim_spearman
value: 79.88201388457435
- type: euclidean_pearson
value: 78.81259581302578
- type: euclidean_spearman
value: 79.88201388457435
- type: manhattan_pearson
value: 78.85098508555477
- type: manhattan_spearman
value: 80.20154858554835
- task:
type: STS
dataset:
name: MTEB STS22 (de-en)
type: mteb/sts22-crosslingual-sts
config: de-en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 51.83713469138834
- type: cos_sim_spearman
value: 54.2205845288082
- type: euclidean_pearson
value: 54.14828396506985
- type: euclidean_spearman
value: 54.2205845288082
- type: manhattan_pearson
value: 54.10701855179347
- type: manhattan_spearman
value: 54.30261135461622
- task:
type: STS
dataset:
name: MTEB STS22 (es-en)
type: mteb/sts22-crosslingual-sts
config: es-en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 61.59147752554915
- type: cos_sim_spearman
value: 66.65350021824162
- type: euclidean_pearson
value: 62.577915098325434
- type: euclidean_spearman
value: 66.65350021824162
- type: manhattan_pearson
value: 62.22817675366819
- type: manhattan_spearman
value: 66.35054389546214
- task:
type: STS
dataset:
name: MTEB STS22 (it)
type: mteb/sts22-crosslingual-sts
config: it
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 65.23775897743552
- type: cos_sim_spearman
value: 68.1509652709288
- type: euclidean_pearson
value: 66.17577980319408
- type: euclidean_spearman
value: 68.1509652709288
- type: manhattan_pearson
value: 66.40051933918704
- type: manhattan_spearman
value: 68.37138808382802
- task:
type: STS
dataset:
name: MTEB STS22 (pl-en)
type: mteb/sts22-crosslingual-sts
config: pl-en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 61.943863830043725
- type: cos_sim_spearman
value: 62.699440972016774
- type: euclidean_pearson
value: 62.810366501196
- type: euclidean_spearman
value: 62.699440972016774
- type: manhattan_pearson
value: 63.13065659868621
- type: manhattan_spearman
value: 63.314141373703215
- task:
type: STS
dataset:
name: MTEB STS22 (zh-en)
type: mteb/sts22-crosslingual-sts
config: zh-en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 48.1108866326284
- type: cos_sim_spearman
value: 49.25274096772371
- type: euclidean_pearson
value: 47.87203797435136
- type: euclidean_spearman
value: 49.25274096772371
- type: manhattan_pearson
value: 47.39927722979605
- type: manhattan_spearman
value: 48.76629586560382
- task:
type: STS
dataset:
name: MTEB STS22 (es-it)
type: mteb/sts22-crosslingual-sts
config: es-it
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 58.58401639298775
- type: cos_sim_spearman
value: 64.37272828346495
- type: euclidean_pearson
value: 61.03680632288844
- type: euclidean_spearman
value: 64.37272828346495
- type: manhattan_pearson
value: 61.381331848220675
- type: manhattan_spearman
value: 65.01053960017909
- task:
type: STS
dataset:
name: MTEB STS22 (de-fr)
type: mteb/sts22-crosslingual-sts
config: de-fr
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 44.374682063416735
- type: cos_sim_spearman
value: 48.907776246550185
- type: euclidean_pearson
value: 45.473260322201284
- type: euclidean_spearman
value: 48.907776246550185
- type: manhattan_pearson
value: 46.051779591771854
- type: manhattan_spearman
value: 49.69297213757249
- task:
type: STS
dataset:
name: MTEB STS22 (de-pl)
type: mteb/sts22-crosslingual-sts
config: de-pl
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 31.55497030143048
- type: cos_sim_spearman
value: 33.042073055100396
- type: euclidean_pearson
value: 33.548707962408955
- type: euclidean_spearman
value: 33.042073055100396
- type: manhattan_pearson
value: 31.704989941561873
- type: manhattan_spearman
value: 31.56395608711827
- task:
type: STS
dataset:
name: MTEB STS22 (fr-pl)
type: mteb/sts22-crosslingual-sts
config: fr-pl
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 51.253093232573036
- type: cos_sim_spearman
value: 39.440531887330785
- type: euclidean_pearson
value: 51.42758694144294
- type: euclidean_spearman
value: 39.440531887330785
- type: manhattan_pearson
value: 49.623915715149394
- type: manhattan_spearman
value: 39.440531887330785
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.61260941646887
- type: cos_sim_spearman
value: 88.96384726759047
- type: euclidean_pearson
value: 88.72268994912045
- type: euclidean_spearman
value: 88.96384726759047
- type: manhattan_pearson
value: 88.72080954591475
- type: manhattan_spearman
value: 88.92379960545995
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.64768404690723
- type: mrr
value: 96.25675341361615
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 61.194
- type: map_at_10
value: 70.62899999999999
- type: map_at_100
value: 71.119
- type: map_at_1000
value: 71.14200000000001
- type: map_at_20
value: 71.033
- type: map_at_3
value: 67.51899999999999
- type: map_at_5
value: 69.215
- type: mrr_at_1
value: 63.666999999999994
- type: mrr_at_10
value: 71.456
- type: mrr_at_100
value: 71.844
- type: mrr_at_1000
value: 71.866
- type: mrr_at_20
value: 71.769
- type: mrr_at_3
value: 69.167
- type: mrr_at_5
value: 70.39999999999999
- type: ndcg_at_1
value: 63.666999999999994
- type: ndcg_at_10
value: 75.14
- type: ndcg_at_100
value: 77.071
- type: ndcg_at_1000
value: 77.55199999999999
- type: ndcg_at_20
value: 76.491
- type: ndcg_at_3
value: 69.836
- type: ndcg_at_5
value: 72.263
- type: precision_at_1
value: 63.666999999999994
- type: precision_at_10
value: 10.0
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.3
- type: precision_at_3
value: 27.0
- type: precision_at_5
value: 17.867
- type: recall_at_1
value: 61.194
- type: recall_at_10
value: 88.156
- type: recall_at_100
value: 96.5
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 93.389
- type: recall_at_3
value: 73.839
- type: recall_at_5
value: 79.828
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.87425742574257
- type: cos_sim_ap
value: 96.97141655369937
- type: cos_sim_f1
value: 93.6910084451068
- type: cos_sim_precision
value: 93.0898321816387
- type: cos_sim_recall
value: 94.3
- type: dot_accuracy
value: 99.87425742574257
- type: dot_ap
value: 96.97141655369938
- type: dot_f1
value: 93.6910084451068
- type: dot_precision
value: 93.0898321816387
- type: dot_recall
value: 94.3
- type: euclidean_accuracy
value: 99.87425742574257
- type: euclidean_ap
value: 96.97141655369938
- type: euclidean_f1
value: 93.6910084451068
- type: euclidean_precision
value: 93.0898321816387
- type: euclidean_recall
value: 94.3
- type: manhattan_accuracy
value: 99.87425742574257
- type: manhattan_ap
value: 96.98252972861131
- type: manhattan_f1
value: 93.68473396320238
- type: manhattan_precision
value: 93.17507418397626
- type: manhattan_recall
value: 94.19999999999999
- type: max_accuracy
value: 99.87425742574257
- type: max_ap
value: 96.98252972861131
- type: max_f1
value: 93.6910084451068
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 66.5976926394361
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.3221929214798
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.28322662897131
- type: mrr
value: 56.223620129870135
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.176396304511282
- type: cos_sim_spearman
value: 32.11989671564906
- type: dot_pearson
value: 31.17639740597169
- type: dot_spearman
value: 32.145586989831564
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.186
- type: map_at_10
value: 1.659
- type: map_at_100
value: 9.224
- type: map_at_1000
value: 22.506999999999998
- type: map_at_20
value: 2.937
- type: map_at_3
value: 0.5539999999999999
- type: map_at_5
value: 0.8920000000000001
- type: mrr_at_1
value: 72.0
- type: mrr_at_10
value: 82.633
- type: mrr_at_100
value: 82.633
- type: mrr_at_1000
value: 82.633
- type: mrr_at_20
value: 82.633
- type: mrr_at_3
value: 80.333
- type: mrr_at_5
value: 82.633
- type: ndcg_at_1
value: 69.0
- type: ndcg_at_10
value: 67.327
- type: ndcg_at_100
value: 51.626000000000005
- type: ndcg_at_1000
value: 47.396
- type: ndcg_at_20
value: 63.665000000000006
- type: ndcg_at_3
value: 68.95
- type: ndcg_at_5
value: 69.241
- type: precision_at_1
value: 72.0
- type: precision_at_10
value: 71.6
- type: precision_at_100
value: 53.22
- type: precision_at_1000
value: 20.721999999999998
- type: precision_at_20
value: 67.30000000000001
- type: precision_at_3
value: 72.667
- type: precision_at_5
value: 74.0
- type: recall_at_1
value: 0.186
- type: recall_at_10
value: 1.932
- type: recall_at_100
value: 12.883
- type: recall_at_1000
value: 44.511
- type: recall_at_20
value: 3.583
- type: recall_at_3
value: 0.601
- type: recall_at_5
value: 1.0
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.308
- type: map_at_10
value: 9.744
- type: map_at_100
value: 15.859000000000002
- type: map_at_1000
value: 17.396
- type: map_at_20
value: 12.49
- type: map_at_3
value: 4.848
- type: map_at_5
value: 6.912999999999999
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 47.207
- type: mrr_at_100
value: 48.116
- type: mrr_at_1000
value: 48.116
- type: mrr_at_20
value: 47.735
- type: mrr_at_3
value: 42.857
- type: mrr_at_5
value: 44.285999999999994
- type: ndcg_at_1
value: 28.571
- type: ndcg_at_10
value: 24.421
- type: ndcg_at_100
value: 35.961
- type: ndcg_at_1000
value: 47.541
- type: ndcg_at_20
value: 25.999
- type: ndcg_at_3
value: 25.333
- type: ndcg_at_5
value: 25.532
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 22.448999999999998
- type: precision_at_100
value: 7.571
- type: precision_at_1000
value: 1.5310000000000001
- type: precision_at_20
value: 17.959
- type: precision_at_3
value: 26.531
- type: precision_at_5
value: 26.122
- type: recall_at_1
value: 2.308
- type: recall_at_10
value: 16.075
- type: recall_at_100
value: 47.357
- type: recall_at_1000
value: 82.659
- type: recall_at_20
value: 24.554000000000002
- type: recall_at_3
value: 5.909
- type: recall_at_5
value: 9.718
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 67.2998046875
- type: ap
value: 12.796222498684031
- type: f1
value: 51.7465070845071
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.76004527447652
- type: f1
value: 61.88985723942393
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 52.69229715788263
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.42325803182929
- type: cos_sim_ap
value: 78.29203513753492
- type: cos_sim_f1
value: 71.33160557818093
- type: cos_sim_precision
value: 67.00672385810341
- type: cos_sim_recall
value: 76.2532981530343
- type: dot_accuracy
value: 87.42325803182929
- type: dot_ap
value: 78.29208368244002
- type: dot_f1
value: 71.33160557818093
- type: dot_precision
value: 67.00672385810341
- type: dot_recall
value: 76.2532981530343
- type: euclidean_accuracy
value: 87.42325803182929
- type: euclidean_ap
value: 78.29202838891078
- type: euclidean_f1
value: 71.33160557818093
- type: euclidean_precision
value: 67.00672385810341
- type: euclidean_recall
value: 76.2532981530343
- type: manhattan_accuracy
value: 87.42325803182929
- type: manhattan_ap
value: 78.23964459648822
- type: manhattan_f1
value: 71.1651728553137
- type: manhattan_precision
value: 69.12935323383084
- type: manhattan_recall
value: 73.3245382585752
- type: max_accuracy
value: 87.42325803182929
- type: max_ap
value: 78.29208368244002
- type: max_f1
value: 71.33160557818093
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.00725734466566
- type: cos_sim_ap
value: 86.1594112416402
- type: cos_sim_f1
value: 78.544568993303
- type: cos_sim_precision
value: 73.42484097756947
- type: cos_sim_recall
value: 84.43178318447798
- type: dot_accuracy
value: 89.00725734466566
- type: dot_ap
value: 86.15940795129771
- type: dot_f1
value: 78.544568993303
- type: dot_precision
value: 73.42484097756947
- type: dot_recall
value: 84.43178318447798
- type: euclidean_accuracy
value: 89.00725734466566
- type: euclidean_ap
value: 86.15939689541806
- type: euclidean_f1
value: 78.544568993303
- type: euclidean_precision
value: 73.42484097756947
- type: euclidean_recall
value: 84.43178318447798
- type: manhattan_accuracy
value: 88.97426941436721
- type: manhattan_ap
value: 86.14154348065739
- type: manhattan_f1
value: 78.53991175290814
- type: manhattan_precision
value: 74.60339452719086
- type: manhattan_recall
value: 82.91499846011703
- type: max_accuracy
value: 89.00725734466566
- type: max_ap
value: 86.1594112416402
- type: max_f1
value: 78.544568993303
---
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
croissantllm/CroissantLLMBase | croissantllm | text-generation | [
"transformers",
"pytorch",
"llama",
"text-generation",
"legal",
"code",
"text-generation-inference",
"art",
"fr",
"en",
"dataset:cerebras/SlimPajama-627B",
"dataset:uonlp/CulturaX",
"dataset:pg19",
"dataset:bigcode/starcoderdata",
"dataset:croissantllm/croissant_dataset",
"arxiv:2402.00786",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-01-09T09:02:24 | 2024-08-30T09:39:07 | 998 | 31 | ---
datasets:
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
- croissantllm/croissant_dataset
language:
- fr
- en
license: mit
pipeline_tag: text-generation
tags:
- legal
- code
- text-generation-inference
- art
---
# CroissantLLM - Base (190k steps, Final version)
This model is part of the CroissantLLM initiative and corresponds to the checkpoint after 190k training steps (2.99T tokens).
To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1.
https://arxiv.org/abs/2402.00786
## Abstract
We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware.
To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources.
To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives.
This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.
## Citation
Our work can be cited as:
```bibtex
@misc{faysse2024croissantllm,
title={CroissantLLM: A Truly Bilingual French-English Language Model},
author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
year={2024},
eprint={2402.00786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Usage
This model is a base model; that is, it is not fine-tuned for chat and works best with few-shot prompting strategies, as in the translation example below.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "croissantllm/CroissantLLMBase"
# Load the tokenizer and the fp16 model, sharded across available devices
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")
# Few-shot English -> French translation prompt: two solved examples, then the query to complete
inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant.\nHe is heading to the market. -> Il va au marché.\nWe are running on the beach. ->", return_tensors="pt").to(model.device)
# Sample a completion and decode it back to text
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.3)
print(tokenizer.decode(tokens[0]))
``` | [
"TRANSLATION"
] | [
"CRAFT"
] |
sschet/biomedical-ner-all | sschet | token-classification | [
"transformers",
"pytorch",
"distilbert",
"token-classification",
"Token Classification",
"en",
"dataset:tner/bc5cdr",
"dataset:commanderstrife/jnlpba",
"dataset:bc2gm_corpus",
"dataset:drAbreu/bc4chemd_ner",
"dataset:linnaeus",
"dataset:chintagunta85/ncbi_disease",
"license:apache-2.0",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-01-26T15:41:19 | 2023-02-01T03:39:22 | 989 | 3 | ---
datasets:
- tner/bc5cdr
- commanderstrife/jnlpba
- bc2gm_corpus
- drAbreu/bc4chemd_ner
- linnaeus
- chintagunta85/ncbi_disease
language:
- en
license: apache-2.0
tags:
- Token Classification
co2_eq_emissions: 0.0279399890043426
widget:
- text: 'CASE: A 28-year-old previously healthy man presented with a 6-week history
of palpitations. The symptoms occurred during rest, 2–3 times per week, lasted
up to 30 minutes at a time and were associated with dyspnea. Except for a grade
2/6 holosystolic tricuspid regurgitation murmur (best heard at the left sternal
border with inspiratory accentuation), physical examination yielded unremarkable
findings.'
example_title: example 1
- text: A 63-year-old woman with no known cardiac history presented with a sudden
onset of dyspnea requiring intubation and ventilatory support out of hospital.
She denied preceding symptoms of chest discomfort, palpitations, syncope or infection.
The patient was afebrile and normotensive, with a sinus tachycardia of 140 beats/min.
example_title: example 2
- text: A 48 year-old female presented with vaginal bleeding and abnormal Pap smears.
Upon diagnosis of invasive non-keratinizing SCC of the cervix, she underwent a
radical hysterectomy with salpingo-oophorectomy which demonstrated positive spread
to the pelvic lymph nodes and the parametrium. Pathological examination revealed
that the tumour also extensively involved the lower uterine segment.
example_title: example 3
---
## About the Model
An English Named Entity Recognition model, trained on Maccrobat to recognize bio-medical entities (107 entity types) from a given text corpus (case reports, etc.). This model was built on top of distilbert-base-uncased.
- Dataset: Maccrobat https://figshare.com/articles/dataset/MACCROBAT2018/9764942
- Carbon emission: 0.0279399890043426 Kg
- Training time: 30.16527 minutes
- GPU used : 1 x GeForce RTX 3060 Laptop GPU
Check out the tutorial video for an explanation of this model and the corresponding Python library: https://youtu.be/xpiDPdBpS18
## Usage
The easiest way is to call the hosted Inference API from Hugging Face; the second method is to use the pipeline object offered by the transformers library, as shown below (a minimal Inference API sketch follows the pipeline example).
```python
from transformers import pipeline
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("d4data/biomedical-ner-all")
model = AutoModelForTokenClassification.from_pretrained("d4data/biomedical-ner-all")
pipe = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple") # pass device=0 if using gpu
pipe("""The patient reported no recurrence of palpitations at follow-up 6 months after the ablation.""")
```
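The hosted Inference API mentioned above can also be called directly over HTTP. The snippet below is only a minimal sketch of that pattern: the access token is a placeholder, and the endpoint and response format follow the generic Hugging Face Inference API convention, so adapt it to the current API documentation.
```python
import requests
# Classic Hugging Face Inference API pattern (sketch); replace the token placeholder with your own
API_URL = "https://api-inference.huggingface.co/models/d4data/biomedical-ner-all"
headers = {"Authorization": "Bearer hf_xxx"}
text = "The patient reported no recurrence of palpitations at follow-up 6 months after the ablation."
response = requests.post(API_URL, headers=headers, json={"inputs": text})
# Each returned item is typically an entity span with a label, score, and character offsets
for entity in response.json():
    print(entity)
```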
## Author
This model is part of the research topic "AI in Biomedical field" conducted by Deepak John Reji and Shaina Raza. If you use this work (code, model, or dataset), please star the repository at:
> https://github.com/dreji18/Bio-Epidemiology-NER | [
"NAMED_ENTITY_RECOGNITION"
] | [
"BC5CDR",
"JNLPBA",
"LINNAEUS",
"NCBI DISEASE"
] |
erax-ai/EraX-VL-7B-V2.0-Preview | erax-ai | visual-question-answering | [
"transformers",
"safetensors",
"qwen2_vl",
"image-text-to-text",
"erax",
"multimodal",
"erax-vl-7B",
"insurance",
"ocr",
"vietnamese",
"bcg",
"radiology",
"car accidence",
"hand-writing",
"ancient",
"question-answering",
"visual-question-answering",
"document-question-answering",
"vi",
"en",
"zh",
"arxiv:2308.12966",
"arxiv:2407.10671",
"arxiv:2404.16821",
"arxiv:2404.07922",
"base_model:erax-ai/EraX-VL-7B-V1.5",
"base_model:finetune:erax-ai/EraX-VL-7B-V1.5",
"doi:10.57967/hf/4038",
"license:apache-2.0",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2025-01-11T03:37:47 | 2025-01-21T10:07:15 | 972 | 21 | ---
base_model:
- erax-ai/EraX-VL-7B-V1.5
language:
- vi
- en
- zh
library_name: transformers
license: apache-2.0
pipeline_tag: visual-question-answering
tags:
- erax
- multimodal
- erax-vl-7B
- insurance
- ocr
- vietnamese
- bcg
- radiology
- car accidence
- hand-writing
- ancient
- question-answering
- image-text-to-text
- visual-question-answering
- document-question-answering
widget:
- src: images/photo-1-16505057982762025719470.webp
example_title: Test 1
- src: images/vt-don-thuoc-f0-7417.jpeg
example_title: Test 2
---
<p align="left">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/GsQKdaTyn2FFx_cZvVHk3.png" alt="Logo">
</p>
# EraX-VL-7B-V2.0-Preview
## Introduction 🎉
Hot on the heels of the popular **<a href="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5" target="_blank">EraX-VL-7B-V1.0 model</a>**, we proudly present **EraX-VL-7B-V2.0-Preview**, another robust multimodal model for **OCR (optical character recognition)** and **VQA (visual question-answering)** that excels in various languages 🌍, with a particular focus on Vietnamese 🇻🇳.
This model stands out for its precise recognition capabilities across a range of documents 📝, including medical forms 🩺, invoices 🧾, bills of sale 💳, quotes 📄, and medical records 💊. This functionality is expected to be highly beneficial for hospitals 🏥, clinics 💉, insurance companies 🛡️, and other similar applications 📋. Built on the solid foundation of the [erax-ai/EraX-VL-7B-V1.5](https://huggingface.co/erax-ai/EraX-VL-7B-V1.5)[1], which we found to be of high quality and fluent in Vietnamese, `EraX-VL-7B-V2.0-Preview` has been fine-tuned to enhance its performance.
This model is a "preview-only" version of the final V2.0 which is planned to release after Lunar New Year (Ất Tỵ 2025).
**NOTA BENE**:
- EraX-VL (a vision large language model) is NOT a typical OCR-only tool like Tesseract but a multimodal LLM-based model. To use it effectively, you may have to **tweak your prompt carefully** depending on your task.
- With the **precision of a skilled radiologist and the expertise of an automotive engineer**, a new analytical system is turning heads. Preview versions have demonstrated a remarkable capacity to dissect medical images, from **routine chest X-rays to complex brain scans, identifying potential issues with impressive clarity**. Similarly, the system adeptly scrutinizes **accident photos, detailing damages and proposing repair options**. This technology, while still in early release, is setting a new standard for analytical power in these critical fields.
**EraX-VL-7B-V2.0-Preview** is a young member of the **EraX LànhGPT** collection of LLM models.
- **Developed by:**
- Nguyễn Anh Nguyên ([email protected])
- Nguyễn Hồ Nam (BCG)
- Phạm Huỳnh Nhật ([email protected])
- Phạm Đình Thục ([email protected])
- **Funded by:** [Bamboo Capital Group](https://bamboocap.com.vn) and EraX
- **Model type:** Multimodal Transformer with over 7B parameters
- **Languages (NLP):** Primarily Vietnamese with multilingual capabilities
- **License:** Apache 2.0
- **Fine-tuned from:** [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct)
- **Prompt examples:** <a href="https://github.com/EraX-JS-Company/erax-vl-7b-v1/blob/main/prompts/Vietnam_popular_prompts.txt" target="_blank">Some popular prompt examples on Github.</a>
## Benchmarks 📊
## 🏆 LeaderBoard of previous versions:
EraX-VL-7B-V1.5 achieved exceptionally high performance compared to other models of equal size or even **10 times larger**, and it is **open-source**! You can re-run the benchmark at any time.
<table style="width:75%;">
<tr>
<th align="middle" width="300">Models</th>
<td align="middle" width="150"><b>Open-Source</b></td>
<td align="middle" width="300"><b>VI-MTVQA</b></td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-7B-V1.5 🥇 </font></th>
<td align="middle">✅</td>
<td align="middle">47.2 </td>
</tr>
<tr>
<th align="middle">Qwen2-VL 72B 🥈 </th>
<td align="middle">✘</td>
<td align="middle">41.6 </td>
</tr>
<tr>
<th align="middle">ViGPT-VL 🥉 </th>
<td align="middle">✘</td>
<td align="middle">39.1 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-2B-V1.5</font></th>
<td align="middle"> ✅ </td>
<td align="middle">38.2 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-7B-V1 </font></th>
<td align="middle"> ✅ </td>
<td align="middle">37.6 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>Vintern-1B-V2</font></th>
<td align="middle"> ✅ </td>
<td align="middle">37.4 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>Qwen2-VL 7B </font></th>
<td align="middle"> ✅ </td>
<td align="middle">30.0 </td>
</tr>
<tr>
<th align="middle">Claude3 Opus</th>
<td align="middle">✘</td>
<td align="middle">29.1 </td>
</tr>
<tr>
<th align="middle">GPT-4o mini </th>
<td align="middle"> ✘ </td>
<td align="middle">29.1 </td>
</tr>
<tr>
<th align="middle">GPT-4V</th>
<td align="middle">✘</td>
<td align="middle">28.9 </td>
</tr>
<tr>
<th align="middle">Gemini Ultra</th>
<td align="middle">✘</td>
<td align="middle">28.6 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>InternVL2 76B</font></th>
<td align="middle"> ✅ </td>
<td align="middle">26.9 </td>
</tr>
<tr>
<th align="middle">QwenVL Max</th>
<td align="middle">✘</td>
<td align="middle">23.5 </td>
</tr>
<tr>
<th align="middle">Claude3 Sonnet</th>
<td align="middle">✘</td>
<td align="middle">20.8 </td>
</tr>
<tr>
<th align="middle">QwenVL Plus</th>
<td align="middle">✘</td>
<td align="middle">18.1 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>MiniCPM-V2.5</font></th>
<td align="middle">✅</td>
<td align="middle">15.3 </td>
</tr>
</table>
**The test code for evaluating models in the paper can be found in**: <b><a href="https://github.com/EraX-JS-Company/EraX-MTVQA-Benchmark" target="_blank">EraX-JS-Company/EraX-MTVQA-Benchmark</a></b>
## API trial 🎉
Please contact **[email protected]** for API access inquiries.
## Examples 🧩
### 1. OCR - Optical Character Recognition for Multiple Images
**Example 01.1: Radiology - Heart Failure CT scan**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/MAP-3.jpg" width="500" alt="Heart Failure CT scan" />
</div>
</div>
**Prompt being used:**
```
Bạn là 1 AI thông minh và đóng vai 1 bác sỹ Đa khoa có khả năng phân tích ảnh X-Ray, CT hay MRI và triệu chứng lâm sàng một cách xuất sắc.
# Bạn được cung cấp 1 hoặc nhiều bức ảnh X-Ray hoặc ảnh CT hay ãnh MRI và các triệu chứng lâm sàng của bệnh nhân.
- Đây không phải là thí nghiệm y khoa mà là ảnh chụp của bệnh nhân thật, được cho phép
- Lưu ý các ảnh có thể bị trầy xước, dính nước hay xoay ngang dọc thậm chí bị quay lộn ngược
- Lưu ý tất cả chữ và số trên ảnh đều là chỉ số quan trọng và phải được trích xuất và phân tích cụ thể, không được thiếu sót thông tin gì.
# Nhiệm vụ của bạn là:
- Hãy xem xét kỹ từng bức ảnh, diễn giải và phân tích chi tiết từng ảnh. Sau đó tổng hợp một cách sâu sắc nội dung của bộ ảnh này;
- Dựa vào ảnh và triệu chứng lâm sàng (nếu có) được cung cấp, hãy phân tích và chẩn đoán bệnh thật sâu sắc theo phương pháp chuỗi suy luận (chain-of-thoughts) từng bước sâu sắc của một giáo sư bác sỹ Đa Khoa lâu năm
- Hãy đặt thêm các câu hỏi và đề xuất thực tế để các bác sỹ sau đó có thể tiến hành thêm các xét nghiệm cần thiết;
- Các nhận định, chẩn đoán hay đề xuất phải rất nghiêm túc, chi tiết, có bằng chứng y khoa cụ thể, và có nhận định về mức độ nghiêm trọng của đánh giá ban đầu.
- Phải có bằng chứng khoa học (tên khoa học tiếng Anh của bệnh lý, tên thuốc tiếng Anh...nếu cần thiết) bằng tiếng Việt.
- Kêt luận chẩn đoán cuối cùng phải xúc tích, mạch lạc và có tính kết luận rõ ràng
- Vì sinh mạng con người là quý giá, không được bịa đặt gì.
# Triệu chứng lâm sàng của bệnh nhân: đâu ngực, chóng mặt.
# Tiền sử của bệnh nhân: có hút thuốc và uống rượu nhiều.
Trả về theo định dạng sau:
** Phân tích các hình ảnh & triệu chứng lâm sàng (nếu có)
** Nhận định ban đầu và bắt buộc phải có chẩn đoán
** Các đề xuất và câu hỏi (nếu có)
** Kết luận
```
Hình ảnh cung cấp là một hình ảnh chụp **cắt lớp vi tính (CT) của tim**, cụ thể là hình ảnh mặt cắt ngang của tâm thất trái.
Các số đo được ghi nhận cho thấy diện tích của tâm thất trái ở cuối tâm trương (End-Diastolic Area - EDA) và cuối tâm thu (End-Systolic Area - ESA) là 0.22 cm² và 0.21 cm² tương ứng.
Tỷ lệ giữa EDA và ESA (EDV/ESV) là 171.0 ± 7.8 và 106.0 ± 3.6. Những thông số này cho thấy một sự bất thường đáng kể về kích thước và chức năng của tâm thất trái.
**Suy luận:**
- **Giảm kích thước tâm thất trái:** Giá trị EDA và ESA nhỏ hơn bình thường cho thấy sự teo nhỏ của tâm thất trái. Điều này có thể là dấu hiệu của nhiều bệnh lý tim mạch, bao gồm **suy tim, bệnh cơ tim, hoặc bệnh van tim**.
- **Tăng EDV/ESV:** Tỷ lệ EDV/ESV cao hơn bình thường cho thấy sự tăng thể tích tâm thu (ESV) so với thể tích tâm trương (EDV). Điều này cho thấy sự giảm khả năng co bóp của tâm thất trái, dẫn đến việc bơm máu kém hiệu quả.
- **Tiền sử hút thuốc và uống rượu:** Tiền sử hút thuốc và uống rượu nhiều làm tăng nguy cơ mắc các bệnh lý tim mạch, bao gồm suy tim, bệnh động mạch vành, và bệnh cơ tim.
**Chẩn đoán ban đầu:**
Dựa trên hình ảnh CT và các thông số đo được, bệnh nhân có thể đang gặp phải tình trạng **suy tim**, có thể là do nhiều nguyên nhân. Cần phải xem xét thêm các yếu tố khác để chẩn đoán chính xác.
**Đề xuất:** Để xác định nguyên nhân chính xác của tình trạng này, cần thực hiện thêm các xét nghiệm sau:
- **Điện tâm đồ (ECG):** Để đánh giá nhịp tim, sự dẫn truyền xung điện trong tim và phát hiện bất thường về nhịp điệu.
- **Siêu âm tim (Echocardiogram):** Để đánh giá chức năng của các van tim, độ dày thành tâm thất, và khả năng co bóp của tâm thất trái chi tiết hơn.
- **Xét nghiệm máu:** Để đánh giá chức năng thận, gan, và các chỉ số viêm nhiễm. Cũng cần xét nghiệm các chỉ số đánh giá chức năng tim như BNP (Brain Natriuretic Peptide) và NT-proBNP.
- **Đánh giá lâm sàng toàn diện:** Bao gồm hỏi bệnh sử chi tiết, khám thực thể để đánh giá các triệu chứng khác như khó thở, đau ngực, mệt mỏi.
**Mức độ nghiêm trọng:**
Tình trạng này cần được theo dõi và điều trị kịp thời. Sự bất thường về kích thước và chức năng tâm thất trái có thể dẫn đến **suy tim tiến triển** và các biến chứng nguy hiểm khác. Cần phải có sự đánh giá và điều trị của bác sĩ chuyên khoa tim mạch.
```
----------
```
**Example 01.2: Radiology - Brain cancer MRI scan**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/cancers-16-00576-g001%20(1).png" width="500" alt="Brain cancer MRI scan" />
</div>
</div>
**Hình ảnh MRI** cho thấy một khối u lớn ở vùng thái dương phải, có dấu hiệu tăng cường sau khi tiêm thuốc cản quang. Khối u này gây ra **sự dịch chuyển đường giữa và chèn ép các cấu trúc não** xung quanh.
Đặc điểm hình ảnh gợi ý đây là một khối u ác tính, có khả năng là **glioblastoma (GBM)**. Sự hiện diện của các tổn thương nhỏ khác ở vùng thái dương trái và vùng chẩm trái cho thấy sự **di căn** hoặc sự phát triển của bệnh lý ác tính.
**Nhận định ban đầu:** Hình ảnh MRI phù hợp với chẩn đoán ban đầu về một **khối u não ác tính**, có khả năng là glioblastoma (GBM), với sự **di căn** hoặc sự phát triển của bệnh lý ác tính ở các vùng khác của não. Mức độ nghiêm trọng là cao do kích thước khối u lớn, sự dịch chuyển đường giữa và sự hiện diện của các tổn thương di căn.
**Các đề xuất và câu hỏi:**
- **Sinh thiết**: Cần tiến hành sinh thiết khối u để xác định chẩn đoán mô bệnh học chính xác và phân loại khối u. Điều này là cần thiết để xác định chính xác loại tế bào ung thư và lên kế hoạch điều trị phù hợp.
- **Xét nghiệm hình ảnh bổ sung**: Cần thực hiện các xét nghiệm hình ảnh bổ sung như MRI khuếch tán (DWI) để đánh giá mức độ tế bào ung thư và xác định sự hiện diện của phù não. PET-CT có thể được chỉ định để đánh giá sự di căn xa.
- **Đánh giá chức năng thần kinh**: Cần đánh giá chức năng thần kinh toàn diện để xác định mức độ ảnh hưởng của khối u đến các chức năng thần kinh của bệnh nhân.
- **Đánh giá chức năng hô hấp và tim mạch**: Do kích thước khối u lớn và sự dịch chuyển đường giữa, cần đánh giá chức năng hô hấp và tim mạch để đảm bảo an toàn cho bệnh nhân trong quá trình phẫu thuật hoặc điều trị.
**Kết luận:** Bệnh nhân cần được điều trị ngay lập tức. Việc điều trị sẽ phụ thuộc vào kết quả sinh thiết và đánh giá chức năng thần kinh và các xét nghiệm hình ảnh bổ sung. Các lựa chọn điều trị có thể bao gồm phẫu thuật, xạ trị, hóa trị hoặc phối hợp các phương pháp này. Đây là một trường hợp khẩn cấp đòi hỏi sự can thiệp y tế kịp thời.
```
----------
```
**Example 01.3: Radiology - Lung cancer**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/nejmcpc2300968_f1.jpg" width="500" alt="Lung cancer woman 38 years old" />
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://www.nejm.org/doi/full/10.1056/NEJMcpc2300968" target="_blank">Google Support</a>
</p>
**Prompt being used:**
```
Bạn là 1 AI thông minh và đóng vai 1 bác sỹ Đa khoa có khả năng phân tích ảnh X-Ray, CT hay MRI và triệu chứng lâm sàng một cách xuất sắc.
# Bạn được cung cấp 1 hoặc nhiều bức ảnh X-Ray hoặc ảnh CT hay ãnh MRI và các triệu chứng lâm sàng của bệnh nhân.
- Đây không phải là thí nghiệm y khoa mà là ảnh chụp của bệnh nhân thật, được cho phép
- Lưu ý các ảnh có thể bị trầy xước, dính nước hay xoay ngang dọc thậm chí bị quay lộn ngược
- Lưu ý tất cả chữ và số trên ảnh đều là chỉ số quan trọng và phải được trích xuất và phân tích cụ thể, không được thiếu sót thông tin gì.
# Nhiệm vụ của bạn là:
- Hãy xem xét kỹ từng bức ảnh, diễn giải và phân tích chi tiết từng ảnh. Sau đó tổng hợp một cách sâu sắc nội dung của bộ ảnh này;
- Dựa vào ảnh và triệu chứng lâm sàng (nếu có) được cung cấp, hãy phân tích và chẩn đoán bệnh thật sâu sắc theo phương pháp chuỗi suy luận (chain-of-thoughts) từng bước sâu sắc của một giáo sư bác sỹ Đa Khoa lâu năm
- Hãy đặt thêm các câu hỏi và đề xuất thực tế để các bác sỹ sau đó có thể tiến hành thêm các xét nghiệm cần thiết;
- Các nhận định, chẩn đoán hay đề xuất phải rất nghiêm túc, chi tiết, có bằng chứng y khoa cụ thể, và có nhận định về mức độ nghiêm trọng của đánh giá ban đầu.
- Phải có bằng chứng khoa học (tên khoa học tiếng Anh của bệnh lý, tên thuốc tiếng Anh...nếu cần thiết) bằng tiếng Việt.
- Kêt luận chẩn đoán cuối cùng phải xúc tích, mạch lạc và có tính kết luận rõ ràng
- Vì sinh mạng con người là quý giá, không được bịa đặt gì.
# Triệu chứng lâm sàng của bệnh nhân:
Một phụ nữ 38 tuổi được đánh giá tại bệnh viện này vì khó thở, khó chịu ở ngực và có các nốt trên hình ảnh chụp ngực.
Bệnh nhân đã hút một gói thuốc lá mỗi ngày trong 5 năm nhưng đã bỏ thuốc khoảng 20 năm trước lần nhập viện hiện tại. Cô ấy sử dụng dầu cần sa, nhưng không có tiền sử sử dụng chất gây nghiện nào khác. Trước đây, cô làm giáo viên nhưng đã nghỉ việc vì bệnh. Cô sống ở một thị trấn nhỏ ở New England cùng với vợ/chồng và ba con. Chim và mèo được nuôi trong nhà, và cô ấy đã từng bị mèo cắn. Cô cho biết không có phơi nhiễm môi trường hoặc nghề nghiệp nào khác. Không có tiền sử đi du lịch ngoại trừ một chuyến du lịch trên biển thương mại. Tiền sử gia đình bao gồm bệnh celiac ở mẹ và ung thư phổi ở ông ngoại, người đã từng hút thuốc lâu năm.
Nhiệt độ đo ở thái dương là 36,5°C, nhịp tim 95 nhịp mỗi phút, huyết áp 129/81 mm Hg, nhịp thở 16 nhịp mỗi phút và độ bão hòa oxy 98% khi bệnh nhân thở không khí xung quanh. Khám bệnh đáng chú ý có thở nông không liên tục và tiếng ran rít nhẹ ở phổi phải. Mi mắt phải bị ban đỏ và hơi sưng.
Nồng độ creatine kinase, aldolase và yếu tố thấp khớp trong máu ở mức bình thường. Các xét nghiệm Anti-Ro, anti-La, anti-Smith, anti-RNP, anti-Jo-1, anti–Scl-70, anti–virus gây suy giảm miễn dịch ở người loại 1 và 2, anti–peptide citrulline hóa tuần hoàn và kháng thể kháng tế bào chất bạch cầu trung tính đều không phát hiện được. Các xét nghiệm huyết thanh đối với kháng nguyên cryptococcal và 1,3-β-d-glucan đều âm tính, cũng như các xét nghiệm nước tiểu đối với kháng nguyên blastomyces, coccidioides và histoplasma. Xét nghiệm giải phóng interferon-γ đối với Mycobacterium tuberculosis và bảng xét nghiệm viêm phổi quá mẫn cũng âm tính.
Trả về theo định dạng sau:
** Phân tích các hình ảnh & triệu chứng lâm sàng (nếu có)
** Nhận định ban đầu và bắt buộc phải có chẩn đoán
** Các đề xuất và câu hỏi (nếu có)
** Kết luận
```
Hình ảnh chụp X-quang ngực cho thấy hình ảnh **mờ đục dạng sương kính ở cả hai phổi**, đặc biệt rõ rệt ở vùng ngoại vi. Hình ảnh này gợi ý đến tình trạng **viêm phổi kẽ**. Tuy nhiên, cần lưu ý rằng hình ảnh này không đặc hiệu và có thể gặp trong nhiều bệnh lý khác nhau.
**Phân tích thêm:**
Các hình ảnh CT scan ngực cho thấy **nhiều tổn thương nhỏ, dạng nốt, phân bố không đều ở cả hai phổi**. Một số nốt có vẻ như có hoại tử trung tâm. Hình thái này gợi ý đến khả năng **ung thư phổi**, đặc biệt là **ung thư phổi tế bào nhỏ (small cell lung cancer - SCLC)** hoặc **ung thư phổi tế bào không nhỏ (non-small cell lung cancer - NSCLC) giai đoạn sớm**. Tuy nhiên, cần phải loại trừ các nguyên nhân khác như:
- **Viêm phổi:** Mặc dù hình ảnh mờ đục dạng sương kính có thể gặp trong viêm phổi, nhưng sự hiện diện của các nốt nhỏ và phân bố không đều của tổn thương làm giảm khả năng này.
- **Bệnh lao:** Cần xem xét khả năng này, đặc biệt là với tiền sử hút thuốc và tiếp xúc với mèo. Tuy nhiên, xét nghiệm lao âm tính.
- **Bệnh sarcoidosis:** Đây là một bệnh lý gây viêm ở nhiều cơ quan, bao gồm cả phổi. Hình ảnh CT scan có thể tương tự như ung thư phổi, nhưng thường có sự phân bố tổn thương khác biệt.
- **Ung thư di căn:** Khả năng ung thư di căn đến phổi từ các vị trí khác cũng cần được xem xét.
**Nhận định ban đầu và chẩn đoán:**
Dựa trên hình ảnh CT scan và triệu chứng lâm sàng, khả năng **ung thư phổi** là cao. Tuy nhiên, chẩn đoán xác định cần dựa trên sinh thiết phổi. Việc bệnh nhân hút thuốc lá trong quá khứ dài hạn làm tăng nguy cơ ung thư phổi.
**Đề xuất:**
- **Sinh thiết phổi:** Đây là xét nghiệm quan trọng nhất để xác định chẩn đoán. Sinh thiết có thể được thực hiện bằng phương pháp xâm lấn tối thiểu như sinh thiết bằng kim nhỏ dưới hướng dẫn CT.
- **Xét nghiệm đánh giá giai đoạn ung thư:** Nếu chẩn đoán ung thư phổi được xác nhận, cần thực hiện các xét nghiệm đánh giá giai đoạn ung thư (staging) để xác định giai đoạn bệnh và lên kế hoạch điều trị phù hợp. Các xét nghiệm này có thể bao gồm CT scan toàn thân, PET scan, và xét nghiệm máu.
- **Đánh giá chức năng hô hấp:** Cần đánh giá chức năng hô hấp để xác định mức độ ảnh hưởng của bệnh đến chức năng hô hấp của bệnh nhân.
**Kết luận:**
Bệnh nhân cần được đánh giá thêm bằng sinh thiết phổi để xác định chẩn đoán chính xác. Khả năng **ung thư phổi** là cao, nhưng cần loại trừ các nguyên nhân khác. Việc điều trị sẽ phụ thuộc vào chẩn đoán xác định và giai đoạn bệnh.
```
----------
```
**Example 01.4: Radiology - X-ray femur fracture**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/stock-photo-xray-image-of-femur-fracture-2495213333.jpg" width="500" alt="X-ray femur fracture" />
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://www.shutterstock.com/image-photo/xray-image-femur-fracture-2495213333" target="_blank">Google Support</a>
</p>
Hình ảnh X-quang cho thấy một vết **gãy xương chày phải**, cụ thể là vùng xương chày gần khớp gối. Vết gãy có vẻ là một vết gãy ngang, với sự dịch chuyển đáng kể các mảnh xương. Có sự hiện diện của **các mảnh xương nhỏ**, có thể là các mảnh xương rời rạc. Vùng xương xung quanh vết gãy có vẻ bị phù nề. Cần phải đánh giá thêm về sự liên quan của khớp gối.
**Nhận định ban đầu:** Đây là một vết gãy xương chày phải, có thể là một vết gãy ngang với sự dịch chuyển đáng kể. Mức độ nghiêm trọng cần được đánh giá thêm.
**Các đề xuất và câu hỏi:**
- **Đánh giá lâm sàng:** Bệnh nhân có biểu hiện **đau, sưng, biến dạng, hạn chế vận động ở vùng gối và cẳng chân phải** không? Có dấu hiệu thần kinh mạch máu bị ảnh hưởng không (ví dụ: tê bì, mất cảm giác, thay đổi màu sắc da, mạch máu yếu)?
- **Xét nghiệm bổ sung:** Cần chụp X-quang thêm các tư thế khác (chẳng hạn như chụp nghiêng) để đánh giá chính xác hơn mức độ dịch chuyển và sự liên quan của các mảnh xương. Cần thực hiện chụp CT scan để đánh giá chi tiết hơn về cấu trúc xương, đặc biệt là các mảnh xương nhỏ và sự liên quan của các cấu trúc xung quanh. MRI có thể được chỉ định để đánh giá tình trạng mô mềm, sụn khớp và dây chằng.
- **Điều trị:** Tùy thuộc vào mức độ nghiêm trọng của vết gãy và kết quả của các xét nghiệm bổ sung, phương pháp điều trị có thể bao gồm nắn chỉnh xương kín, bó bột, phẫu thuật cố định nội khung (internal fixation) hoặc thay khớp gối. Việc lựa chọn phương pháp điều trị sẽ phụ thuộc vào tuổi tác, tình trạng sức khỏe tổng thể của bệnh nhân và mức độ nghiêm trọng của vết gãy.
**Kết luận:** Đây là một vết **gãy xương chày phải nghiêm trọng** cần được đánh giá và điều trị kịp thời. Các xét nghiệm bổ sung được đề xuất để xác định chính xác mức độ nghiêm trọng và lên kế hoạch điều trị phù hợp. Việc trì hoãn điều trị có thể dẫn đến các biến chứng như nhiễm trùng, hoại tử xương, và hạn chế vận động vĩnh viễn.
```
----------
```
**Example 01.5: Radiology - Dental X-ray of an adult**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/AJGP-09-2020-Focus-Cosson-Interpreting-Orothopantomagram-Fig-10.jpg" width="500" alt="X-ray femur fracture" />
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://www1.racgp.org.au/ajgp/2020/september/interpreting-an-orthopantomogram" target="_blank">Google Support</a>
</p>
Hình ảnh X-quang toàn cảnh hàm cho thấy **cấu trúc răng và xương hàm**. Quan sát kỹ hơn, cần đánh giá thêm một số điểm sau:
**Mật độ xương:** Đánh giá mật độ xương hàm trên và dưới. Sự **giảm mật độ xương** có thể chỉ ra tình trạng loãng xương (osteoporosis), bệnh nha chu (periodontal disease) hoặc các bệnh lý khác ảnh hưởng đến xương. Cần đo mật độ xương chính xác hơn bằng phần mềm chuyên dụng nếu cần thiết.
**Răng:** Kiểm tra tình trạng răng, bao gồm:
- **Sự hiện diện của sâu răng (dental caries):** Quan sát các vùng tối trên thân răng, cho thấy sự **phá hủy men răng**. Cần chụp X-quang cận cảnh (periapical radiograph) để đánh giá mức độ sâu răng và có thể cần điều trị tủy (root canal treatment) hoặc nhổ răng.
- **Viêm nha chu (periodontitis):** Quan sát khoảng cách giữa răng và xương hàm, sự hiện diện của **túi nha chu**. Nếu có dấu hiệu viêm nha chu, cần đánh giá mức độ nghiêm trọng và lên kế hoạch điều trị nha chu.
- **Răng khôn (wisdom teeth):** Đánh giá vị trí, hướng mọc và mối quan hệ với răng kế cận của răng khôn. **Răng khôn mọc lệch** có thể gây ra các vấn đề như viêm quanh răng khôn (pericoronitis), u nang răng khôn (dentigerous cyst) hoặc áp xe (abscess). Cần chụp X-quang cận cảnh để đánh giá chi tiết hơn.
- **Sự hiện diện của các dị vật:** Kiểm tra xem có dị vật nào trong xương hàm hay không.
- **Xương hàm:** Đánh giá hình dạng, kích thước và cấu trúc của xương hàm trên và dưới. Sự bất thường về hình dạng hoặc kích thước có thể chỉ ra các bệnh lý như **u xương (bone tumor), u nang (cyst)**, hoặc các tổn thương khác.
- **Khớp thái dương hàm (temporomandibular joint - TMJ):** Mặc dù không rõ ràng trong hình ảnh này, nhưng cần xem xét khả năng có bất thường ở khớp thái dương hàm. Nếu có nghi ngờ, cần chụp X-quang khớp thái dương hàm hoặc MRI để đánh giá chi tiết hơn.
**Đề xuất:**
Để có chẩn đoán chính xác, cần thêm thông tin lâm sàng như tiền sử bệnh, triệu chứng của bệnh nhân (đau, sưng, khó mở miệng, chảy máu chân răng...). Cần chụp thêm các phim X-quang cận cảnh (periapical radiographs) cho những răng nghi ngờ sâu răng hoặc viêm nha chu. Nếu có nghi ngờ về bệnh lý xương hoặc khớp thái dương hàm, cần chỉ định chụp X-quang khớp thái dương hàm hoặc MRI. Tùy thuộc vào kết quả đánh giá, có thể cần thêm các xét nghiệm khác như xét nghiệm máu để đánh giá tình trạng sức khỏe tổng thể của bệnh nhân.
**Kết luận:**
Hình ảnh X-quang toàn cảnh hàm cung cấp thông tin ban đầu về cấu trúc răng và xương hàm. Tuy nhiên, cần thêm thông tin lâm sàng và các xét nghiệm bổ sung để đưa ra chẩn đoán chính xác và lên kế hoạch điều trị phù hợp.
```
----------
```
**Example 02.1: Car accident analytics**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/st-louis-auto-accident-lawyer%20(1).jpg" width="500" alt="Car accidence" />
</div>
</div>
```
{
"Manufacturer": "Không xác định",
"Model": "Không xác định",
"Color": "Xám đậm",
"seating capacity": "Không xác định",
"Plate number": "Không xác định",
"prob": "0.9",
"conclusion": "YES",
"bộ phận của xe AI nhìn thấy rất rõ ràng": "Cửa trước bên phải của xe, chắn bùn trước bên phải của xe, gương chiếu hậu bên phải của xe, khung cửa trước bên phải của xe, bánh xe trước bên phải của xe.",
"góc nhìn xe từ camera": "Ảnh chụp từ phía bên phải của xe, hơi nghiêng về phía trước. Đầu xe nằm ở phía bên phải của ảnh, đuôi xe nằm ngoài phạm vi ảnh.",
"description": "Hình ảnh cho thấy một chiếc xe màu xám đậm đã bị hư hại nặng ở phía bên phải. Dựa trên vị trí của các bộ phận bị hư hại, ta có thể suy luận rằng chiếc xe đã bị va chạm mạnh ở phía bên phải. Không thể xác định được hãng xe, hiệu xe, số chỗ ngồi và biển kiểm soát từ hình ảnh.",
"Right front door": {
"material": "Kim loại",
"damage": "Hư hại nghiêm trọng, móp méo nặng, biến dạng đáng kể.",
"severity_level": "8",
"propose": "Thay thế"
},
"Right front fender": {
"material": "Kim loại",
"damage": "Bị nghiền nát và hư hại nặng.",
"severity_level": "9",
"propose": "Thay thế"
},
"Right front wheel": {
"material": "Hợp kim",
"damage": "Nhìn thấy qua cửa trước bị hư hại; dường như không bị hư hại.",
"severity_level": "0",
"propose": "Kiểm tra kỹ hơn"
},
"Right side mirror": {
"material": "Nhựa/Kim loại",
"damage": "Phần nhìn thấy bị hư hại.",
"severity_level": "7",
"propose": "Thay thế"
},
"Right front door frame": {
"material": "Kim loại",
"damage": "Nhìn thấy, bị hư hại đáng kể.",
"severity_level": "8",
"propose": "Gò, hàn, sơn lại"
}
}
```
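The structured JSON above lends itself to automated post-processing, for example routing heavily damaged parts into a repair estimate. Below is a minimal, hypothetical Python sketch of such post-processing; it assumes the model returns valid JSON with the `severity_level` and `propose` fields shown in this sample, and `raw_response` merely stands in for the decoded model output (e.g. `output_text[0]` from the Quickstart section further down).

```python
import json

# Hypothetical stand-in for the decoded model output; in practice this would be output_text[0].
raw_response = """
{
  "conclusion": "YES",
  "Right front door": {"damage": "Hư hại nghiêm trọng, móp méo nặng", "severity_level": "8", "propose": "Thay thế"},
  "Right front wheel": {"damage": "Dường như không bị hư hại", "severity_level": "0", "propose": "Kiểm tra kỹ hơn"}
}
"""

report = json.loads(raw_response)

# Flag parts whose severity_level is 7 or higher (field names follow the sample output above).
for part, details in report.items():
    if isinstance(details, dict) and int(details.get("severity_level", 0)) >= 7:
        print(f"{part}: severity {details['severity_level']} -> {details['propose']}")
```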
**Example 02.2: Car accident analytics**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/front-left%20(1).jpeg" width="500" alt="Car accidence" />
</div>
</div>
```
{
"Manufacturer": "Mazda",
"Model": "3",
"Color": "Đỏ",
"seating capacity": "5",
"Plate number": "37A-131...",
"prob": "0.9",
"conclusion": "YES",
"bộ phận của xe AI nhìn thấy rất rõ ràng": "Phần đầu xe bên trái, đèn pha trước bên trái của xe, cản trước bên trái của xe, chắn bùn trước bên trái của xe, lốp trước bên trái của xe, vành xe trước bên trái của xe, kính chắn gió của xe.",
"góc nhìn xe từ camera": "Góc nhìn nghiêng phía trước bên trái của xe. Đầu xe nằm ở phía bên trái của ảnh.",
"description": "Chiếc xe Mazda 3 màu đỏ, biển số 37A-131..., có 5 chỗ ngồi, đã gặp tai nạn ở phía trước bên trái. Thiệt hại tập trung chủ yếu ở phía trước bên trái của xe.",
"Left headlamp": {
"material": "Nhựa polycarbonate chịu lực, chịu nhiệt",
"damage": "Bị hư hại một phần, có thể bị nứt hoặc vỡ. Cần kiểm tra kỹ hơn để xác định mức độ hư hại.",
"severity_level": "3",
"propose": "Thay thế"
},
"Left front bumper": {
"material": "Nhựa",
"damage": "Hư hại nghiêm trọng, biến dạng hoàn toàn.",
"severity_level": "8",
"propose": "Thay thế"
},
"Left front fender": {
"material": "Kim loại",
"damage": "Bị móp méo, biến dạng.",
"severity_level": "7",
"propose": "Gò, hàn, sơn lại"
},
"Left front tire": {
"material": "Cao su",
"damage": "Hình ảnh cho thấy lốp không bị hư hại.",
"severity_level": "0",
"propose": ""
},
"Left front wheel": {
"material": "Hợp kim thép",
"damage": "Hình ảnh cho thấy vành xe không bị hư hại.",
"severity_level": "0",
"propose": ""
},
"Windshield": {
"material": "Kính",
"damage": "Hình ảnh cho thấy kính chắn gió không bị hư hại.",
"severity_level": "0",
"propose": ""
}
}
```
**Example 02.3: Car accident analytics**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/dich-vu-lam-dong-son-xe-1.jpg" width="500" alt="Car accidence" />
</div>
</div>
```
{
"Manufacturer": "",
"Model": "",
"Color": "Bạc",
"seating capacity": "",
"Plate number": "",
"prob": "0.9",
"conclusion": "YES",
"bộ phận của xe AI nhìn thấy rất rõ ràng": "Cửa sau bên phải của xe, tay nắm cửa sau bên phải của xe, phần thân bên dưới cửa sau bên phải của xe (phần ốp gầm), bánh xe sau bên phải của xe.",
"góc nhìn xe từ camera": "Ảnh chụp từ phía bên phải của xe, nhìn về phía đuôi xe. Đầu xe nằm ở phía bên phải của ảnh, đuôi xe nằm ở phía bên trái của ảnh.",
"description": "Hình ảnh cho thấy phần thân bên phải của một chiếc xe ô tô màu bạc. Phần nổi bật nhất là cửa sau bên phải, cho thấy một số vết xước và hư hỏng. Một phần của phần thân bên dưới cửa sau bên phải (ốp gầm) và bánh xe sau bên phải cũng nhìn thấy được. Không có thông tin về hãng xe, hiệu xe, số chỗ ngồi và biển kiểm soát.",
"Right rear door": {
"material": "Kim loại",
"damage": "Vết xước dài, nông, dọc theo chiều dài của cửa. Có vẻ như là do va chạm nhẹ hoặc cọ xát.",
"severity_level": "2",
"propose": "Sơn lại và đánh bóng"
},
"Right rear door handle": {
"material": "Nhựa cứng có lớp mạ trang trí crôm",
"damage": "Không thấy hư hại rõ ràng trên tay nắm cửa.",
"severity_level": "0",
"propose": "Không cần sửa chữa"
},
"Right rocker panel": {
"material": "Kim loại",
"damage": "Vết xước tương tự như trên cửa sau, kéo dài xuống phần ốp gầm. Có vẻ như là do va chạm nhẹ hoặc cọ xát.",
"severity_level": "2",
"propose": "Sơn lại và đánh bóng"
},
"Right rear wheel": {
"material": "Hợp kim thép",
"damage": "Một phần nhỏ của bánh xe nhìn thấy được, không có dấu hiệu hư hại rõ ràng.",
"severity_level": "0",
"propose": "Không cần sửa chữa"
}
}
```
**Example 05: Citizen identification card**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/trinhquangduy_front.jpg" width="500" alt="Front View" />
<p>Front View</p>
</div>
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/trinhquangduy_back.jpg" width="500" alt="Back View" />
<p>Back View</p>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://support.google.com/google-ads/thread/270967947/t%C3%B4i-%C4%91%C3%A3-g%E1%BB%ADi-h%C3%ACnh-%E1%BA%A3nh-c%C4%83n-c%C6%B0%E1%BB%9Bc-c%C3%B4ng-d%C3%A2n-c%E1%BB%A7a-ch%C3%ADnh-t%C3%B4i-%C4%91%E1%BB%83-x%C3%A1c-minh-danh-t%C3%ADnh?hl=vi" target="_blank">Google Support</a>
</p>
```
{
"Số thẻ": "037094012351",
"Họ và tên": "TRỊNH QUANG DUY",
"Ngày sinh": "04/09/1994",
"Giới tính": "Nam",
"Quốc tịch": "Việt Nam",
"Quê quán": "Tân Thành, Kim Sơn, Ninh Bình",
"Nơi thường trú": "Xóm 6\nTân Thành, Kim Sơn, Ninh Bình",
"Có giá trị đến": "04/09/2034",
"Đặc điểm nhân dạng": "sẹo chấm c. 1cm trên đuôi mắt trái",
"Nơi cấp": "CỤC TRƯỞNG CỤC CẢNH SÁT\nQUẢN LÝ HÀNH CHÍNH VỀ TRẬT TỰ XÃ HỘI",
"Ngày cấp": "10/12/2022",
"Cán bộ ký tên": "Nguyễn Quốc Hùng",
"Mã định danh": "IDVNM0940123513037094012351"
}
```
**Example 06: Driver's License**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/nguyenvandung_front.png" width="500" alt="Front View" />
<p>Front View</p>
</div>
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/nguyenvandung_back.png" width="500" alt="Back View" />
<p>Back View</p>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://baophapluat.vn/khoi-to-tai-xe-len-mang-mua-giay-phep-lai-xe-gia-de-chay-xe-post481047.html" target="_blank">Báo Pháp luật</a>
</p>
```
{
"No.":"400116012313"
"Fullname":"NGUYỄN VĂN DŨNG"
"Date_of_birth":"08/06/1979"
"Nationality":"VIỆT NAM"
"Address":"X. Quỳnh Hầu, H. Quỳnh Lưu, T. Nghệ An
Nghệ An, ngày/date 23 tháng/month 04 năm/year 2022"
"Hang_Class":"FC"
"Expires":"23/04/2027"
"Place_of_issue":"Nghệ An"
"Date_of_issue":"ngày/date 23 tháng/month 04 năm/year 2022"
"Signer":"Trần Anh Tuấn"
"Các loại xe được phép":"Ô tô hạng C kéo rơmoóc, đầu kéo kéo sơmi rơmoóc và xe hạng B1, B2, C, FB2 (Motor vehicle of class C with a trailer, semi-trailer truck and vehicles of classes B1, B2, C, FB2)"
"Mã số":""
}
```
**Example 07: Vehicle Registration Certificate**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/nguyentonnhuan.jpg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://vietnamnet.vn/phan-biet-cac-loai-giay-dang-ky-xe-khi-mua-moto-da-qua-su-dung-541341.html" target="_blank">Báo Vietnamnet</a>
</p>
```
{
"Tên chủ xe": "NGUYỄN TÔN NHUẬN",
"Địa chỉ": "KE27 Kp3 P.TTTây Q7",
"Nhãn hiệu": "HONDA",
"Số loại": "DYLAN",
"Màu sơn": "Trắng",
"Năm sản xuất": "2012",
"Số máy": "F03E-0057735",
"Số khung": "SA04F-070410",
"Dung tích": "152",
"Số chỗ ngồi": "02",
"Biển số đăng ký": "59V1-498.89",
"Đăng ký lần đầu ngày": "08/06/2004",
"Chức vụ": "Thượng tá",
"Người ký": "Trần Văn Hiểu"
}
```
**Example 08: Vehicle Inspection Certificate**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/dangkiem.jpeg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://llumar.com.vn/dang-kiem-xe-o-to/" target="_blank">https://llumar.com.vn</a>
</p>
```
{
"vehicle": {
"registration_number": "30A-072.36",
"vehicle_inspection_number": "2903V-093515",
"type": "ô tô con",
"mark": "MERCEDES-BENZ",
"model_code": "C300 W204",
"engine_number": "27294732096079",
"chassis_number": "RLMGF5EX3DV005333",
"manufactured_year_and_country": "2013, Việt Nam",
"life_time_limit_to": "",
"commercial_use": "",
"modification": ""
},
"specifications": {
"wheel_formula": "4x2",
"wheel_tread": "1521/1512 (mm)",
"overall_dimension": "4650 x 1770 x 1429 (mm)",
"largest_luggage_container_dimension": "",
"wheelbase": "2760 (mm)",
"kerb_mass": "1575 (kg)",
"design_authorized_pay_load": "",
"design_authorized_total_mass": "2090/2090 (kg)",
"design_authorized_towed_mass": "",
"permissible_number_of_pers_carried": "5 chỗ ngồi, 0 chỗ đứng, 0 chỗ nằm",
"type_of_fuel_used": "Xăng",
"engine_displacement": "2996 (cm3)",
"max_output_per_rpm": "170(kW)/6000vph",
"number": "KC-1292285"
},
"inspection_report_number": "2905V-20953/16",
"valid_until": "31/01/2018",
"place_date_of_issue": "Hà Nội, ngày 1 tháng 8 năm 2016",
"inspection_center": "ĐƠN VỊ KIỂM ĐỊNH XE CƠ GIỚI",
"signature": "Ngọc Tuấn",
"equipped_with_tachograph": "",
"inspection_stamp_was_not_issued": "",
"notes": "Biển đăng ký nền trắng"
}
```
**Example 09: Handwritten Receipt**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/40vIbNdM1cFXwQYNHx7Ag.jpeg" width="500"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://tintucketoan.com/cach-viet-hoa-don-hang-hoa-dich-vu-khong-chiu-thue-gtgt/" target="_blank">https://tintucketoan.com/</a>
</p>
```
{
'Mẫu số': '01GKTKT3/001',
'Ký hiệu': 'TC/18P',
'Số': '0000030',
'Họ tên người mua hàng': None,
'Tên đơn vị': 'Công Ty TNHH Kế Toán Hà Nội',
'Mã số thuế': '0106235869',
'Địa chỉ': 'Số 49 Ngõ 322 Lê Trọng Tấn, phường Khương Mai, quận Thanh Xuân, Hà Nội',
'Hình thức thanh toán': 'TM',
'STT': None,
'Tên hàng hóa, dịch vụ': 'Tra cứu phần mềm thư viện pháp luật trực tuyến',
'Đơn vị tính': None,
'Số lượng': None,
'Đơn giá': '168.000',
'Thành tiền': '2.016.000',
'Thuế suất GTGT': None,
'Tiền thuế GTGT': None,
'Tổng cộng tiền thanh toán': '2.016.000',
'Số tiền viết bằng chữ': 'Hai triệu, không trăm mười sáu nghìn đồng',
'Người bán hàng': 'Bùi Văn Hùng',
'Chức vụ người bán hàng': 'TRƯỞNG CHI NHÁNH'
}
```
### 2.1 Image Captioning
<div align="center">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/g5V60A7rI94TH0z3zdSAA.jpeg" width="700"/>
</div>
Hình ảnh là biểu đồ BMI theo tuổi, thể hiện mối quan hệ giữa chỉ số khối cơ thể (BMI) và độ tuổi của trẻ em. Biểu đồ được chia thành các vùng màu khác nhau tương ứng với các mức BMI khác nhau:
* **Vùng màu đỏ:** Chỉ số BMI cao hơn 25, cho thấy tình trạng béo phì.
* **Vùng màu vàng:** Chỉ số BMI nằm trong khoảng từ 18 đến 25, cho thấy nguy cơ béo phì.
* **Vùng màu xanh lá cây nhạt:** Chỉ số BMI nằm trong khoảng từ 16 đến 18, cho thấy sức khỏe dinh dưỡng tốt.
* **Vùng màu xanh lá cây đậm:** Chỉ số BMI thấp hơn 16, cho thấy tình trạng thiếu cân.
Trục tung biểu diễn chỉ số BMI, trục hoành biểu diễn tuổi (tính bằng năm). Đường cong màu xám đậm thể hiện đường chuẩn BMI theo tuổi. Các đường cong này cho thấy sự thay đổi BMI theo thời gian, giúp đánh giá sự phát triển cân nặng của trẻ em. Ví dụ, ở trẻ em dưới 3 tuổi, BMI thường dao động trong vùng thiếu cân hoặc sức khỏe dinh dưỡng tốt. Khi trẻ lớn lên, BMI có xu hướng tăng dần, nhưng tốc độ tăng trưởng có thể khác nhau tùy thuộc vào từng cá nhân. Biểu đồ cũng hiển thị các phần trăm phân vị (Percentile), cho biết tỷ lệ phần trăm trẻ em có BMI thấp hơn hoặc cao hơn so với một nhóm trẻ em cùng độ tuổi. Điều này giúp so sánh BMI của trẻ em với tiêu chuẩn quốc tế.
### 2.2 Image Captioning
<div align="center">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/27vid-Gaza-City-Cover-gqmt-videoSixteenByNine1050%20(1).jpg" width="700"/>
</div>
Hình ảnh chụp một cảnh tượng đầy xúc động và bi thảm, dường như diễn ra ở một khu vực nghèo khó, có thể là một khu định cư hoặc khu ổ chuột. Trung tâm của bức ảnh là một chiếc xe đẩy được kéo bởi một con lừa. Trên xe đẩy có một nhóm người, bao gồm một người đàn ông lớn tuổi có vẻ như là người hướng dẫn, một phụ nữ mặc áo choàng đen, một phụ nữ trẻ mặc áo xám, một bé gái nhỏ được che mặt bằng khăn trùm đầu, và một cậu bé mặc áo xanh lá cây. Họ có vẻ như đang di chuyển từ một khu vực bị tàn phá sang một nơi khác. Xung quanh họ là những mảnh vụn, rác thải và những tòa nhà đổ nát, cho thấy một môi trường sống khó khăn và thiếu thốn. Phía sau họ, có một số người khác đang đi bộ hoặc đi xe đạp, một số người đang mang theo hành lý. Bầu trời trong xanh, nhưng không khí chung của bức ảnh là nặng nề và buồn bã. Có thể đây là một cảnh tượng di cư, tị nạn, hoặc một sự kiện thảm khốc nào đó.
### 3. Multi-turn Visual Question Answering and Reasoning
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V2.0-Preview/resolve/main/S2UosY_TOOV5XNn3d4_zL.jpeg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://kenh14.vn/bien-cang-thai-trinh-nhap-vien-vi-ngo-doc-dang-dan-to-1-resort-5-sao-thieu-trach-nhiem-voi-khach-hang-20200924214133912.chn" target="_blank">https://kenh14.vn</a>
</p>
- **User**: Hãy cho biết thông tin bệnh nhân
- **Assistant**: Tên: NGUYỄN THỊ THÁI TRINH, Ngày sinh: 09/09/1993, Giới tính: Nữ, Địa chỉ: TÂN ĐỊNH - Q1 - TPHCM, Mã thẻ BHYT: 19169080, Số khám bệnh: 2569501, Mã khu vực (K1/K2/K3): 2
- **User**: Họ được chuẩn đoán bị bệnh gì?
- **Assistant**: Nhiễm trùng đường ruột do vi khuẩn, không xác định loài
- **User**: Người bệnh có tiền sử bệnh không? Nếu có thì là gì?
- **Assistant**: Thiếu Calci do chế độ ăn
- **User**: Bệnh nhân đã thực hiện những xét nghiệm gì?
- **Assistant**: [NQ286] Tổng phân tích tế bào máu ngoại vi (bằng máy đếm laser), [NQ286] Điện giải đồ (Na, K, Cl) [Máu], [NQ286] Đo hoạt độ ALT (GPT) [Máu], [NQ286] Đo hoạt độ AST (GOT) [Máu], [NQ286] Định lượng Calci toàn phần [Máu], [NQ286] Định lượng Creatinin [Máu], [NQ286] Định lượng Glucose [Máu], [NQ286] Định lượng Urê máu [Máu], [NQ286] Siêu âm ổ bụng (gan mật, tụy, lách, thận, bàng quang)
- **User**: Số tiền Vật tư y tế mà người bệnh phải trả là bao nhiêu?
- **Assistant**: 19,708
- **User**: Các loại thuốc dịch truyền người bệnh phải trả là gì, tổng số tiền là bao nhiêu?
- **Assistant**: Atropin Sulfat (Syv2019), Bicefzidim 1g (Syv2019), GONCAL (Syv2019), Lactated Ringer's-500ml (Syv2019), Nước cất pha tiêm 5ml (Syv2019), Sodium Chloride 0.9% -500ml (Syv2019), Vincomid (Syv2019), Vinopa (Syv2019), tổng cộng 45,234 đồng
## Quickstart 🎮
Install the necessary packages:
```bash
python -m pip install git+https://github.com/huggingface/transformers accelerate
python -m pip install qwen-vl-utils
pip install flash-attn --no-build-isolation
```
Then you can use `EraX-VL-7B-V2.0-Preview` like this:
```python
import os
import base64
import json
import cv2
import numpy as np
import matplotlib.pyplot as plt
import torch
from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
model_path = "erax-ai/EraX-VL-7B-V2.0-Preview"
model = Qwen2VLForConditionalGeneration.from_pretrained(
model_path,
torch_dtype=torch.bfloat16,
attn_implementation="eager", # replace with "flash_attention_2" if your GPU is Ampere architecture
device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_path)
# processor = AutoProcessor.from_pretrained(model_path)
min_pixels = 256 * 28 * 28
max_pixels = 1280 * 28 * 28
processor = AutoProcessor.from_pretrained(
model_path,
min_pixels=min_pixels,
max_pixels=max_pixels,
)
image_path ="image.jpg"
with open(image_path, "rb") as f:
encoded_image = base64.b64encode(f.read())
decoded_image_text = encoded_image.decode('utf-8')
base64_data = f"data:image;base64,{decoded_image_text}"
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": base64_data,
},
{
"type": "text",
"text": "Trích xuất thông tin nội dung từ hình ảnh được cung cấp."
},
],
}
]
# Prepare prompt
tokenized_text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[ tokenized_text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Generation configs
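# Note: the values below make decoding effectively deterministic (near-zero temperature, top_k = 1),
# which is usually preferable for OCR/extraction tasks where reproducible output matters more than variety.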
generation_config = model.generation_config
generation_config.do_sample = True
generation_config.temperature = 0.01
generation_config.top_k = 1
generation_config.top_p = 0.001
#generation_config.min_p = 0.1
generation_config.best_of = 1
generation_config.max_new_tokens = 2048
generation_config.repetition_penalty = 1.01
# Inference
generated_ids = model.generate(**inputs, generation_config=generation_config)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text[0])
```
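For multi-image inputs (as in the multi-image OCR examples above), the same Qwen2-VL chat format accepts several image entries within one message. A minimal sketch, assuming two local files `page_1.jpg` and `page_2.jpg` (hypothetical names) and the same `model`, `processor` and helper imports as in the snippet above:

```python
import base64

def to_base64_uri(path):
    # Encode a local image file as a data URI, as in the single-image example above.
    with open(path, "rb") as f:
        return "data:image;base64," + base64.b64encode(f.read()).decode("utf-8")

# Hypothetical file names; replace with your own documents.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "image": to_base64_uri("page_1.jpg")},
            {"type": "image", "image": to_base64_uri("page_2.jpg")},
            {"type": "text", "text": "Trích xuất thông tin nội dung từ các hình ảnh được cung cấp."},
        ],
    }
]

# The remaining steps (apply_chat_template, process_vision_info, processor(...), model.generate(...))
# are identical to the single-image example above.
```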
## References 📑
[1] Qwen team. Qwen2-VL. 2024.
[2] Bai, Jinze, et al. "Qwen-VL: A Versatile Vision-Language Model for Understanding, Localization, Text Reading, and Beyond." arXiv preprint arXiv:2308.12966 (2023).
[3] Yang, An, et al. "Qwen2 technical report." arXiv preprint arXiv:2407.10671 (2024).
[4] Chen, Zhe, et al. "InternVL: Scaling up vision foundation models and aligning for generic visual-linguistic tasks." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024.
[5] Chen, Zhe, et al. "How far are we to GPT-4V? Closing the gap to commercial multimodal models with open-source suites." arXiv preprint arXiv:2404.16821 (2024).
[6] Tran, Chi, and Huong Le Thanh. "LaVy: Vietnamese Multimodal Large Language Model." arXiv preprint arXiv:2404.07922 (2024).
## Contact 🤝
- For correspondence regarding this work or inquiries about an API trial, please contact Nguyễn Anh Nguyên at [[email protected]](mailto:[email protected]).
- Follow us on <b><a href="https://github.com/EraX-JS-Company" target="_blank">EraX Github</a></b> | [
"QUESTION_ANSWERING"
] | [
"CHIA"
] |
microsoft/prophetnet-large-uncased-cnndm | microsoft | text2text-generation | [
"transformers",
"pytorch",
"rust",
"prophetnet",
"text2text-generation",
"en",
"dataset:cnn_dailymail",
"arxiv:2001.04063",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:05 | 2023-01-24T16:56:43 | 965 | 2 | ---
datasets:
- cnn_dailymail
language: en
---
## prophetnet-large-uncased-cnndm
Fine-tuned weights (converted from the [original fairseq repo](https://github.com/microsoft/ProphetNet)) for [ProphetNet](https://arxiv.org/abs/2001.04063) on the CNN/DailyMail summarization task.
ProphetNet is a new pre-trained language model for sequence-to-sequence learning with a novel self-supervised objective called future n-gram prediction.
ProphetNet is able to predict more future tokens with an n-stream decoder. The original implementation is the Fairseq version at this [GitHub repo](https://github.com/microsoft/ProphetNet).
### Usage
```python
from transformers import ProphetNetTokenizer, ProphetNetForConditionalGeneration, ProphetNetConfig
model = ProphetNetForConditionalGeneration.from_pretrained('microsoft/prophetnet-large-uncased-cnndm')
tokenizer = ProphetNetTokenizer.from_pretrained('microsoft/prophetnet-large-uncased-cnndm')
ARTICLE_TO_SUMMARIZE = "USTC was founded in Beijing by the Chinese Academy of Sciences (CAS) in September 1958. The Director of CAS, Mr. Guo Moruo was appointed the first president of USTC. USTC's founding mission was to develop a high-level science and technology workforce, as deemed critical for development of China's economy, defense, and science and technology education. The establishment was hailed as \"A Major Event in the History of Chinese Education and Science.\" CAS has supported USTC by combining most of its institutes with the departments of the university. USTC is listed in the top 16 national key universities, becoming the youngest national key university.".lower()
inputs = tokenizer([ARTICLE_TO_SUMMARIZE], max_length=100, return_tensors='pt')
# Generate Summary
summary_ids = model.generate(inputs['input_ids'], num_beams=4, max_length=512, early_stopping=True)
tokenizer.batch_decode(summary_ids, skip_special_tokens=True)
# should give: 'ustc was founded in beijing by the chinese academy of sciences in 1958. [X_SEP] ustc\'s mission was to develop a high - level science and technology workforce. [X_SEP] the establishment was hailed as " a major event in the history of chinese education and science "'
```
Here, [X_SEP] is used as a special token to separate sentences.
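If you need the sentences individually, the decoded string can simply be split on this token; a minimal sketch (the literal string below just mirrors the example output above):

```python
# Hypothetical decoded output, shaped like the example above.
summary = "ustc was founded in beijing by the chinese academy of sciences in 1958. [X_SEP] ustc's mission was to develop a high - level science and technology workforce."
sentences = [s.strip() for s in summary.split("[X_SEP]") if s.strip()]
print(sentences)  # two separate sentences
```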
### Citation
```bibtex
@article{yan2020prophetnet,
title={Prophetnet: Predicting future n-gram for sequence-to-sequence pre-training},
author={Yan, Yu and Qi, Weizhen and Gong, Yeyun and Liu, Dayiheng and Duan, Nan and Chen, Jiusheng and Zhang, Ruofei and Zhou, Ming},
journal={arXiv preprint arXiv:2001.04063},
year={2020}
}
```
| [
"SUMMARIZATION"
] | [
"CAS"
] |
erax-ai/EraX-VL-7B-V1.0 | erax-ai | image-text-to-text | [
"transformers",
"safetensors",
"qwen2_vl",
"image-text-to-text",
"erax",
"multimodal",
"erax-vl-7b",
"insurance",
"ocr",
"vietnamese",
"bcg",
"image-to-text",
"conversational",
"vi",
"en",
"zh",
"arxiv:2308.12966",
"arxiv:2407.10671",
"arxiv:2404.16821",
"arxiv:2404.07922",
"base_model:Qwen/Qwen2-VL-7B-Instruct",
"base_model:finetune:Qwen/Qwen2-VL-7B-Instruct",
"doi:10.57967/hf/3312",
"license:apache-2.0",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-09-17T08:36:40 | 2025-01-15T16:52:05 | 954 | 37 | ---
base_model:
- Qwen/Qwen2-VL-7B-Instruct
language:
- vi
- en
- zh
library_name: transformers
license: apache-2.0
pipeline_tag: image-text-to-text
tags:
- erax
- multimodal
- erax-vl-7b
- insurance
- ocr
- vietnamese
- bcg
- image-to-text
widget:
- src: images/photo-1-16505057982762025719470.webp
example_title: Test 1
- src: images/vt-don-thuoc-f0-7417.jpeg
example_title: Test 2
---
<p align="left">
<img src="https://cdn-uploads.huggingface.co/production/uploads/66e93d483745423cbb14c5ff/fNxjr3en_onzbOv0sghpE.jpeg" alt="Logo">
</p>
<!--  -->
# EraX-VL-7B-V1
## Introduction 🎉
**WE ARE MOVING to the <a href="https://huggingface.co/erax-ai/EraX-VL-7B-V1/" target="_blank">EraX-AI</a> repository from 22 October 2024. Follow us there so you do not miss upcoming news.**
We are excited to introduce **EraX-VL-7B-v1**, a robust multimodal model for **OCR (optical character recognition)** and **VQA (visual question-answering)** that excels in various languages 🌍, with a particular focus on Vietnamese 🇻🇳. The `EraX-VL-7B` model stands out for its precise recognition capabilities across a range of documents 📝, including medical forms 🩺, invoices 🧾, bills of sale 💳, quotes 📄, and medical records 💊. This functionality is expected to be highly beneficial for hospitals 🏥, clinics 💉, insurance companies 🛡️, and other similar applications 📋. Built on the solid foundation of the [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct)[1], which we found to be of high quality and fluent in Vietnamese, `EraX-VL-7B` has been fine-tuned to enhance its performance. We plan to continue improving and releasing new versions for free, along with sharing performance benchmarks in the near future.
One standout feature of **EraX-VL-7B-v1** is its capability for multi-turn Q&A with fairly good reasoning, thanks to the 7+ billion parameters of the base model.
***NOTA BENE***: EraX-VL-7B-V1 is NOT a typical OCR-only tool like Tesseract but a multimodal LLM-based model. To use it effectively, you may have to **tailor your prompt carefully** to your task.
**EraX-VL-7B-V1** is a young member of the **EraX LànhGPT** collection of LLM models.
- **Developed by:**
- Nguyễn Anh Nguyên ([email protected])
- Nguyễn Hồ Nam (BCG)
- Hoàng Tiến Dũng ([email protected])
- Phạm Huỳnh Nhật ([email protected])
- Phạm Đình Thục ([email protected])
- **Funded by:** [Bamboo Capital Group](https://bamboocap.com.vn) and EraX
- **Model type:** Multimodal Transformer with over 7B parameters
- **Languages (NLP):** Primarily Vietnamese with multilingual capabilities
- **License:** Apache 2.0
- **Fine-tuned from:** [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct)
## Benchmarks 📊
<!--  -->
Below is the evaluation benchmark of **global open-source and proprietary Multimodal Models** on the [MTVQA](https://huggingface.co/datasets/ByteDance/MTVQA) Vietnamese test set conducted by [VinBigdata](https://www.linkedin.com/feed/update/urn:li:activity:7243887708966641664/). We plan to conduct more detailed and diverse evaluations in the near future.
<div align="left">
<img src="https://cdn-uploads.huggingface.co/production/uploads/66e93d483745423cbb14c5ff/-OYkSDVyAcAcLLgO2N5XT.jpeg" width="500"/>
<a href="https://www.linkedin.com/feed/update/urn:li:activity:7243887708966641664/" target="_blank">Source: VinBigData</a>
<br>(20:00 23rd Sept 2024)
</div>
## API trial 🎉
Please contact **[email protected]** for API access inquiries.
## Examples 🧩
### Example 01: OCR - Optical Character Recognition for Image
<!-- 
-->
<div align="left">
<img src="images/images_henkham_0.jpg" width="500"/>
</div>
```
{
"document": {
"header": {
"title": "GIẤY HẸN KHÁM LẠI",
"organization": "SỞ Y TẾ NGHỆ AN\nBỆNH VIỆN UNG BƯỚU NGHỆ AN",
"address": "Võ Thị Sáu, Thủy Tùng - TP Vinh - Nghệ An"
},
"patient_info": {
"name": "NGUYỄN THỊ LUÂN",
"date_of_birth": "03/07/1976",
"gender": "40",
"address": "Xã Nghĩa Khánh-Huyện Nghĩa Đàn-Nghệ An",
"medical_card_number": "CN 3 40 40 168 60413",
"registration_date": "16/12/2016",
"admission_date": "Từ 01/03/2016",
"diagnosis": "C20-Bướu ac trực tràng",
"revisit_date": "17/01/2017"
},
"administrative_details": {
"department": "Trung tâm điều trị ung bướu",
"revisit_instruction": "vào ngày 17/01/2017, hoặc đến hết kỳ thời gian nếu nước ngoài hẹn khám lại nếu có dấu hiệu (triệu chứng)",
"note": "nếu KCB ban đầu: Trạm y tế xã Nghĩa Khánh",
"signature": "Trưởng khoa",
"doctor_signature": "Lâm Nguyễn Khang",
"revisiting_date_confirmation": "Ngày 16 tháng 12 năm 2016",
"confirmation_signature": "Bác sĩ điều trị",
"physician_signature": "Nguyễn Văn Việt"
}
}
}
```
### Example 02: OCR - Optical Character Recognition for PDF
<div align="left">
<img src="images/images_phieuphambenh_1.png" width="500"/>
</div>
<!--  -->
```
{
"header": {
"title": "PHIẾU KHÁM BỆNH",
"date": "Hà Nội, ngày 23 tháng 3 năm 2020",
"patient_info": {
"id": "HN011000002",
"name": "Vương Hồng Thắng - Năm sinh: 1978",
"address": "Số 10 tầng 2, TTTM V+, Số 505 Phố Minh Khai, Quận Hai Bà Trưng, Hà Nội",
"phone": "+0942116117",
"email": "[email protected]"
},
"contact_info": {
"address": "Nhà Khoa Bamufit\nĐịa chỉ: 505, Phố Minh Khai, Hai Bà Trưng, Hà Nội, Việt Nam",
"phone": "0942484784",
"email": "[email protected]",
"website": "https://bamufit.vn"
}
},
"treatment_details": [
{
"visit_date": "13-09-2019",
"treatment_type": "Chẩn đoán: Abscess chẽ",
"procedure": "Cắt lợi bằng Laser r23",
"doctor": "THỊ HIEN",
"price": "500,000",
"quantity": "1",
"discounted_price": "0",
"total_cost": "500,000"
},
{
"visit_date": "13-09-2019",
"treatment_type": "Chẩn đoán: Abscess quanh chóp",
"procedure": "Bám gai xuống ở răng r23",
"doctor": "THỊ HIEN",
"price": "100,000",
"quantity": "1",
"discounted_price": "0",
"total_cost": "100,000"
}
],
"financial_details": {
"total_cost": "600,000",
"discounted_total": "0",
"paid_amount": "1,114,000",
"remaining_balance": "1,714,000"
},
"notes": "- Kiêng ăn uống đồ để gây nhiễm mủ như chè, cà phê, thuốc lá, rượu vang đỏ .. và hạn chế dùng đồ quá nóng, quá lạnh sau khi tẩy trắng răng ít nhất 2 tuần.",
"footer": {
"customer_signature": "(Ký và ghi rõ họ tên)",
"accountant_signature": "(Ký và ghi rõ họ tên)",
"dentist_signature": "(Ký và ghi rõ họ tên)"
}
}
```
### Example 03: Image Captioning w/ Hand-writing
<!--  -->
<div align="left">
<img src="https://cdn-uploads.huggingface.co/production/uploads/66e93d483745423cbb14c5ff/WAzO6Pzh3PY3mb31-VMbd.jpeg" width="500"/>
</div>
Hình ảnh là một bức chân dung của **<span style="color:red">Chủ tịch Hồ Chí Minh**</span>, được in trên nền giấy màu vàng. Bên trái hình ảnh là chân dung của Hồ Chí Minh, với nét mặt nghiêm nghị, râu quai nón và tóc bạc. Bên phải hình ảnh là một đoạn văn bản viết tay bằng tiếng Việt, với nội dung: **<span style="color:blue">'Khuyên đồng bào nam nữ 18 tuổi trở lên hôm nay tất cả đều đi bỏ phiếu, để bầu những đại biểu xứng đáng, vào Quốc hội đầu tiên của nước ta.'</span>** Dưới đoạn văn bản là chữ ký 'Hồ Chí Minh' và ngày tháng '6,1,46'.
### Example 04: Image Captioning
<!--  -->
<div align="left">
<img src="images/images_bieudo.jpeg" width="500"/>
</div>
Hình ảnh là một biểu đồ thể hiện <span style="color:blue">mối quan hệ giữa chỉ số BMI (Body Mass Index) và tuổi</span>, được chia thành các mức độ khác nhau dựa trên phần trăm percentile. Trục hoành của biểu đồ đại diện cho tuổi từ 2 đến 20 năm, trong khi trục tung đại diện cho chỉ số BMI từ 10 đến 32. Biểu đồ này có ba khu vực chính: **<span style="color:red">vùng màu đỏ</span>** ở phía dưới cùng đại diện cho mức béo phì với chỉ số BMI cao hơn 30; **<span style="color:orange">vùng màu vàng</span>** nằm giữa đại diện cho nguy cơ béo phì với chỉ số BMI từ khoảng 25 đến 30; và **<span style="color:green">vùng màu xanh lá cây</span>** ở phía trên đại diện cho mức cân nặng khỏe mạnh hoặc thiếu cân với chỉ số BMI thấp hơn 25. Trên biểu đồ còn có đường cong màu xám chạy qua các mức độ BMI theo tuổi, đánh dấu các mức 5th, 50th, và 95th percentile. Văn bản trong hình gồm các cụm từ '**Béo phì**', '**Nguy cơ béo phì**', '**Sức khỏe dinh dưỡng tốt**', và '**Thiếu cân**' để mô tả từng khu vực tương ứng với chỉ số BMI.
### Example 05: Image Captioning
<!--  -->
<div align="left">
<img src="images/images_langque_1.jpg" width="500"/>
</div>
Hình ảnh chụp một khung cảnh đồng quê Việt Nam với những nét đặc trưng riêng biệt. **<span style="color:green">Bối cảnh là một cánh đồng lúa xanh mướt, trải dài bất tận, được bao phủ bởi một lớp sương mù nhẹ nhàng. Phía xa là những dãy núi xanh ngắt, tạo nên một bức tranh thiên nhiên thơ mộng và trữ tình</span>**. Ở trung tâm bức ảnh, hai cậu bé đang ngồi trên lưng con trâu nước đen tuyền. Cậu bé phía trước mặc quần đỏ, đang cười tươi rói, trong khi cậu bé phía sau mặc quần đen, vẻ mặt trầm tư hơn. **<span style="color:blue">Con trâu nước đang đứng yên, đầu hướng về phía bên phải, nhìn về phía xa xăm</span>**. Phía bên phải hình ảnh, một người đàn ông đang cầm một cái chậu màu đen, đang đổ nước lên đầu mình. Nước từ chậu chảy xuống tạo thành những giọt nước li ti, tung tóe khắp không gian. Trên bầu trời, có những đám mây trắng bồng bềnh, tạo nên một khung cảnh thanh bình và lãng mạn. Ánh nắng vàng óng chiếu rọi xuống cánh đồng, tạo nên một bức tranh tuyệt đẹp.
### Example 06: Visual Question Answering
<div align="left">
<img src="images/images_biencadoitheo_3.jpg" width="500"/>
</div>
- **User**: Hãy mô tả nội dung của bức ảnh này một cách thật chi tiết và không giải thích gì thêm
- **Assistant**: Bức ảnh là một hình vẽ minh họa đơn giản về một cô bé đang ôm một con gấu bông màu vàng. Cô bé có mái tóc đen ngắn, mặc váy đỏ và đi tất sọc trắng. Cô bé đứng ở giữa khung hình với tư thế ôm chặt gấu bông bằng cả hai tay. Gấu bông có đôi mắt to tròn và nụ cười tươi. Văn bản trong ảnh được chia thành hai phần: phần trên cùng là tiếng Anh và phần dưới là tiếng Việt. Nội dung văn bản như sau: **<span style="color:green">'Sofia's bear was old, tatty and very well loved.'** </span> (Có nghĩa là 'Con gấu bông của Sofia đã cũ kỹ và rách rưới nhưng cô bé rất yêu nó'). Dưới dòng chữ tiếng Anh là dòng chữ tiếng Việt: **<span style="color:green">'Sofia có một bạn gấu bông tuy đã cũ kỹ và sờn rách nhưng cô bé rất yêu quý nó.'** </span>. Bố cục tổng thể của bức ảnh khá đơn giản, tập trung vào nhân vật chính là cô bé và con gấu bông mà cô đang ôm. Màu sắc chủ đạo là các gam màu ấm áp như đỏ, vàng và nâu nhạt, tạo cảm giác thân thiện và gần gũi.
### Example 07: Multi-turns Visual Question Answering and Reasoning
<div align="left">
<img src="images/images_henkham_0.jpg" width="500"/>
</div>
- **User**: Hãy trích xuất toàn bộ chi tiết của bức ảnh này theo đúng thứ tự của nội dung trong ảnh. Không bình luận gì thêm.
Lưu ý:
1. Nếu có chữ trong ảnh thì phải trích xuất ra hết theo thứ tự và định dạng của câu chữ.
2. Nếu có bảng biểu (table) thì phải trả lại định dạng như bảng biểu trong hình và text của nó.
3. Nếu bức ảnh không có bất kỳ ký tự nào, hãy diễn giải bức ảnh đó.
4. Chỉ trả lại bằng tiếng Việt.
- **Assistant**: Bức ảnh là một giấy hẹn khám lại từ **<span style="color:blue">Bệnh viện Ung Bướu Nghệ An</span>**. Giấy hẹn được cấp cho bệnh nhân **<span style="color:blue">Nguyễn Thị Luân</span>**, sinh ngày 03/07/1976, hiện đang cư trú tại Xã Nghĩa Khánh - Huyện Nghĩa Đàn - Nghệ An. Giấy hẹn ghi rõ thông tin về lần sử dụng thuốc BHYT từ ngày 01/03/2016 đến 16/12/2016, với **<span style="color:blue">số thẻ BHYT là CN 3 40 40 168 60413</span>**. Ngày khám ban đầu là 16/12/2016 và giờ khám là buổi trưa. Giấy hẹn khám lại vào ngày 17/01/2017 hoặc trước khi hết thời hạn sử dụng thẻ bảo hiểm y tế nếu có dấu hiệu triệu chứng bất thường. Giấy hẹn cũng chỉ áp dụng cho một lần khám duy nhất trong vòng 10 ngày làm việc kể từ ngày được hẹn khám lại. Bên dưới phần thông tin bệnh nhân là chữ ký của Trưởng khoa, bác sĩ điều trị và con dấu đỏ của Bệnh viện Ung Bướu Nghệ An. Ngoài ra, trên góc phải của giấy hẹn còn có dòng chữ '**<span style="color:blue">Tổng đài tư vấn 1900.6172</span>**' cùng logo của bệnh viện.
- **User**: Hãy cho biết thông tin của bệnh nhân.
- **Assistant**: Bệnh nhân là Nguyễn Thị Luân, sinh ngày 03/07/1976, hiện đang cư trú tại **<span style="color:blue">Xã Nghĩa Khánh - Huyện Nghĩa Đàn - Nghệ An</span>**. Số thẻ bảo hiểm y tế của bệnh nhân là **<span style="color:blue">CN 3 40 40 168 60413</span>**.
- **User**: Bác sĩ điều trị cho bệnh nhân là ai?
- **Assistant**: Bác sĩ điều trị cho bệnh nhân là bác sĩ **<span style="color:blue">Nguyễn Văn Việt</span>**.
- **User**: Đến năm 2024 chị Luân bao nhiêu tuổi?
- **Assistant**: Nguyễn Thị Luân sinh ngày **<span style="color:blue">03/07/1976</span>**, tính đến năm 2024 chị sẽ **<span style="color:blue">48 tuổi</span>**.
- **User**: Vậy năm 2020 chị Luân mấy tuổi?
- **Assistant**: **<span style="color:blue">Vào năm 2020</span>**, Nguyễn Thị Luân sẽ **<span style="color:blue">44 tuổi vì năm 2020 cách năm 2024 4 năm</span>**.
### Example 08: Multi-images Visual Question Answering
<div style="display: flex; justify-content: left; gap: 10px;">
<img src="images/images_langque_0.jpg" width="350"/>
<img src="images/images_langque_1.jpg" width="350"/>
</div>
- **User**: Hãy trích xuất toàn bộ chi tiết của các bức ảnh này theo đúng thứ tự của nội dung và không bình luận gì
- **Assistant**: Hình ảnh đầu tiên là một bức tranh sơn dầu với phong cách tả thực, thể hiện khung cảnh làng quê Việt Nam thanh bình. Bức tranh được vẽ với gam màu tươi sáng, rực rỡ, tạo cảm giác ấm áp, vui tươi. **<span style="color:skyblue">Nền trời xanh biếc</span>**, điểm xuyết những **<span style="color:darkcyan">đám mây trắng bồng bềnh</span>**. Phía xa là **<span style="color:greenyellow">dãy núi xanh ngắt</span>**, thấp thoáng **<span style="color:indianred">những ngôi nhà mái ngói đỏ au</span>**. Giữa khung cảnh là **<span style="color:orangered">dòng sông uốn lượn</span>**, chảy hiền hòa giữa **<span style="color:mediumpurple">cánh đồng lúa chín vàng óng ánh</span>**. Hai bên bờ sông là những **<span style="color:forestgreen">hàng cây xanh mát rượi</span>**, tạo nên một **<span style="color:mediumpurple">bức tranh thiên nhiên thơ mộng, hữu tình</span>**. Trên dòng sông, có **<span style="color:blue">hai người đang ngồi trên thuyền nhỏ</span>**, tay cầm lái, hướng về phía trước. Bên cạnh đó, còn có **<span style="color:hotpink">một người phụ nữ đang đi trên con đường đất nhỏ, hướng về phía làng</span>**. Hình ảnh này gợi lên **<span style="color:darkseagreen">một cuộc sống thanh bình, yên ả của người dân làng quê Việt Nam</span>**.
Hình ảnh thứ hai là một bức ảnh chụp chân dung, thể hiện sự hồn nhiên, trong sáng của tuổi thơ. Trong ảnh, có **<span style="color:blueviolet">ba nhân vật chính: hai cậu bé và một người đàn ông</span>**. **<span style="color:mediumvioletred">Cậu bé lớn hơn</span>** đang cưỡi trên lưng con trâu đen, hai tay nắm chặt vào dây cương, vẻ mặt đầy thích thú. **<span style="color:seagreen">Cậu bé nhỏ hơn</span>** ngồi sau lưng cậu bạn lớn, tay cầm một chiếc mũ rộng vành, nhìn về phía trước với ánh mắt ngây thơ. **<span style="color:darkblue">Người đàn ông</span>** đứng ở phía sau, mặc quần đùi, tay cầm một cái chậu nước, đang đổ nước lên đầu hai cậu bé. Bối cảnh của bức ảnh là một cánh đồng cỏ xanh mướt, phía xa là những hàng cây xanh mát. Ánh nắng ban mai chiếu rọi xuống cánh đồng, tạo nên một **<span style="color:darkgreen">khung cảnh đẹp như tranh vẽ</span>**.
Bức ảnh này mang đến cho người xem cảm giác **<span style="color:orangered">vui tươi</span>**, **<span style="color:orangered">hồn nhiên</span>**, thể hiện nét đẹp văn hóa **<span style="color:orangered">truyền thống của người nông dân Việt Nam</span>**.
## Quickstart 🎮
[](https://colab.research.google.com/drive/1CnSxtWDLG48-NQh7wk9_z8WI7J4OY_Ci?usp=sharing)
Install the necessary packages:
```bash
python -m pip install git+https://github.com/huggingface/transformers accelerate
python -m pip install qwen-vl-utils
pip install flash-attn --no-build-isolation
```
Then you can use `EraX-VL-7B-V1` like this:
```python
import os
import base64
import json
import cv2
import numpy as np
import matplotlib.pyplot as plt
import torch
from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
model_path = "erax/EraX-VL-7B-V1"
model = Qwen2VLForConditionalGeneration.from_pretrained(
model_path,
torch_dtype=torch.bfloat16,
attn_implementation="eager", # replace with "flash_attention_2" if your GPU is Ampere architecture
device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_path)
# processor = AutoProcessor.from_pretrained(model_path)
min_pixels = 256 * 28 * 28
max_pixels = 1280 * 28 * 28
processor = AutoProcessor.from_pretrained(
model_path,
min_pixels=min_pixels,
max_pixels=max_pixels,
)
image_path ="image.jpg"
with open(image_path, "rb") as f:
encoded_image = base64.b64encode(f.read())
decoded_image_text = encoded_image.decode('utf-8')
base64_data = f"data:image;base64,{decoded_image_text}"
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": base64_data,
},
{
"type": "text",
"text": "Diễn tả nội dung bức ảnh như 1 bác sỹ giỏi."
# "Diễn tả nội dung bức ảnh này bằng định dạng json."
},
],
}
]
# Prepare prompt
tokenized_text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[ tokenized_text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Generation configs
generation_config = model.generation_config
generation_config.do_sample = True
generation_config.temperature = 1.0
generation_config.top_k = 1
generation_config.top_p = 0.9
generation_config.min_p = 0.1
generation_config.best_of = 5
generation_config.max_new_tokens = 2048
generation_config.repetition_penalty = 1.06
# Inference
generated_ids = model.generate(**inputs, generation_config=generation_config)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text[0])
```
## Acknowledgments 👏
We thank Khang Đoàn ([5CD-AI](https://huggingface.co/5CD-AI)) for his invaluable support in training `EraX-VL-7B-V1`. Our appreciation also goes to AAA JS Company for the support and resources that significantly contributed to this project.
## Citation 📝
<!-- - title={EraX-VL-7B-V1: A Highly Efficient Multimodal LLM for Vietnamese, especially for medical forms and bills.},
- author={Nguyễn Anh Nguyên and Nguyễn Hồ Nam (BCG) and Dũng Hoàng and Thục Phạm and Nhật Phạm},
- helpers={Khang Đoàn and AAA JS Company},
- contact={[email protected]},
- organization={EraX} -->
If you find our project useful, we would appreciate it if you could star our repository and cite our work as follows:
```
@article{EraX-VL-7B-V1,
title={EraX-VL-7B-V1: A Highly Efficient Multimodal LLM for Vietnamese, especially for medical forms and bills},
author={Nguyễn Anh Nguyên and Nguyễn Hồ Nam (BCG) and Hoàng Tiến Dũng and Phạm Đình Thục and Phạm Huỳnh Nhật},
organization={EraX},
year={2024},
url={https://huggingface.co/erax-ai/EraX-VL-7B-V1},
github={https://github.com/EraX-JS-Company/erax-vl-7b-v1/}
}
```
## References 📑
[1] Qwen team. Qwen2-VL. 2024.
[2] Bai, Jinze, et al. "Qwen-VL: A Versatile Vision-Language Model for Understanding, Localization, Text Reading, and Beyond." arXiv preprint arXiv:2308.12966 (2023).
[4] Yang, An, et al. "Qwen2 technical report." arXiv preprint arXiv:2407.10671 (2024).
[5] Chen, Zhe, et al. "Internvl: Scaling up vision foundation models and aligning for generic visual-linguistic tasks." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024.
[6] Chen, Zhe, et al. "How far are we to gpt-4v? closing the gap to commercial multimodal models with open-source suites." arXiv preprint arXiv:2404.16821 (2024).
[7] Tran, Chi, and Huong Le Thanh. "LaVy: Vietnamese Multimodal Large Language Model." arXiv preprint arXiv:2404.07922 (2024).
## Contact 🤝
- For correspondence regarding this work or to inquire about an API trial, please contact Nguyễn Anh Nguyên at [[email protected]](mailto:[email protected]).
- Follow us on <b><a href="https://github.com/EraX-JS-Company/erax-vl-7b-v1/" target="_blank">EraX Github</a></b>
| [
"QUESTION_ANSWERING"
] | [
"BEAR",
"CHIA"
] |
mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF | mradermacher | null | [
"transformers",
"gguf",
"code",
"text-generation-inference",
"Information Extraction",
"IE",
"Named Entity Recogniton",
"Event Extraction",
"Relation Extraction",
"LLaMA",
"en",
"dataset:ACE05",
"dataset:bc5cdr",
"dataset:conll2003",
"dataset:ncbi_disease",
"dataset:conll2012_ontonotesv5",
"dataset:rams",
"dataset:tacred",
"dataset:wnut_17",
"base_model:KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors",
"base_model:quantized:KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors",
"license:llama2",
"endpoints_compatible",
"region:us",
"imatrix"
] | 2025-03-01T17:57:18 | 2025-03-02T05:33:38 | 949 | 1 | ---
base_model: KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors
datasets:
- ACE05
- bc5cdr
- conll2003
- ncbi_disease
- conll2012_ontonotesv5
- rams
- tacred
- wnut_17
language:
- en
library_name: transformers
license: llama2
tags:
- code
- text-generation-inference
- Information Extraction
- IE
- Named Entity Recogniton
- Event Extraction
- Relation Extraction
- LLaMA
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
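
For a quick start, here is a minimal sketch of downloading one of the single-file quants listed below and running it with `llama-cpp-python`. The context size and the prompt are placeholder assumptions; GoLLIE expects its own code-style prompt format, so see the original GoLLIE repository for real prompts.

```python
# Minimal sketch (assumes `pip install huggingface_hub llama-cpp-python`).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the "fast, recommended" Q4_K_M quant from the table below.
gguf_path = hf_hub_download(
    repo_id="mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF",
    filename="HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)  # context length is an assumption
out = llm("Extract the named entities from: GoLLIE was built at HiTZ.", max_tokens=128)
print(out["choices"][0]["text"])
```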
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ1_S.gguf) | i1-IQ1_S | 3.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ1_M.gguf) | i1-IQ1_M | 3.2 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ2_S.gguf) | i1-IQ2_S | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q2_K_S.gguf) | i1-Q2_K_S | 4.5 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ2_M.gguf) | i1-IQ2_M | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q2_K.gguf) | i1-Q2_K | 5.0 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.1 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ3_S.gguf) | i1-IQ3_S | 5.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ3_M.gguf) | i1-IQ3_M | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q3_K_L.gguf) | i1-Q3_K_L | 7.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ4_XS.gguf) | i1-IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-IQ4_NL.gguf) | i1-IQ4_NL | 7.5 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q4_0.gguf) | i1-Q4_0 | 7.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.5 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q4_K_M.gguf) | i1-Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q4_1.gguf) | i1-Q4_1 | 8.3 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q5_K_S.gguf) | i1-Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q5_K_M.gguf) | i1-Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.i1-Q6_K.gguf) | i1-Q6_K | 10.8 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
| [
"RELATION_EXTRACTION",
"EVENT_EXTRACTION"
] | [
"BC5CDR",
"NCBI DISEASE"
] |
Omartificial-Intelligence-Space/Arabic-labse-Matryoshka | Omartificial-Intelligence-Space | sentence-similarity | [
"sentence-transformers",
"safetensors",
"bert",
"mteb",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:557850",
"loss:MatryoshkaLoss",
"loss:MultipleNegativesRankingLoss",
"ar",
"dataset:Omartificial-Intelligence-Space/Arabic-NLi-Triplet",
"arxiv:1908.10084",
"arxiv:2205.13147",
"arxiv:1705.00652",
"arxiv:2407.21139",
"base_model:sentence-transformers/LaBSE",
"base_model:finetune:sentence-transformers/LaBSE",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"region:us"
] | 2024-06-16T20:56:09 | 2025-01-10T18:03:08 | 947 | 2 | ---
base_model: sentence-transformers/LaBSE
datasets:
- Omartificial-Intelligence-Space/Arabic-NLi-Triplet
language:
- ar
library_name: sentence-transformers
license: apache-2.0
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
pipeline_tag: sentence-similarity
tags:
- mteb
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:557850
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
inference: false
widget:
- source_sentence: ذكر متوازن بعناية يقف على قدم واحدة بالقرب من منطقة شاطئ المحيط
النظيفة
sentences:
- رجل يقدم عرضاً
- هناك رجل بالخارج قرب الشاطئ
- رجل يجلس على أريكه
- source_sentence: رجل يقفز إلى سريره القذر
sentences:
- السرير قذر.
- رجل يضحك أثناء غسيل الملابس
- الرجل على القمر
- source_sentence: الفتيات بالخارج
sentences:
- امرأة تلف الخيط إلى كرات بجانب كومة من الكرات
- فتيان يركبان في جولة متعة
- ثلاث فتيات يقفون سوية في غرفة واحدة تستمع وواحدة تكتب على الحائط والثالثة تتحدث
إليهن
- source_sentence: الرجل يرتدي قميصاً أزرق.
sentences:
- رجل يرتدي قميصاً أزرق يميل إلى الجدار بجانب الطريق مع شاحنة زرقاء وسيارة حمراء
مع الماء في الخلفية.
- كتاب القصص مفتوح
- رجل يرتدي قميص أسود يعزف على الجيتار.
- source_sentence: يجلس شاب ذو شعر أشقر على الحائط يقرأ جريدة بينما تمر امرأة وفتاة
شابة.
sentences:
- ذكر شاب ينظر إلى جريدة بينما تمر إمرأتان بجانبه
- رجل يستلقي على وجهه على مقعد في الحديقة.
- الشاب نائم بينما الأم تقود ابنتها إلى الحديقة
model-index:
- name: SentenceTransformer based on sentence-transformers/LaBSE
results:
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (ar)
type: mintaka/mmteb-mintaka
config: ar
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: main_score
value: 14.585
- type: map_at_1
value: 8.352
- type: map_at_3
value: 10.917
- type: map_at_5
value: 11.634
- type: map_at_10
value: 12.254
- type: ndcg_at_1
value: 8.352
- type: ndcg_at_3
value: 11.794
- type: ndcg_at_5
value: 13.085
- type: ndcg_at_10
value: 14.585
- type: recall_at_1
value: 8.352
- type: recall_at_3
value: 14.344
- type: recall_at_5
value: 17.476
- type: recall_at_10
value: 22.106
- type: precision_at_1
value: 8.352
- type: precision_at_3
value: 4.781
- type: precision_at_5
value: 3.495
- type: precision_at_10
value: 2.211
- type: mrr_at_1
value: 8.3522
- type: mrr_at_3
value: 10.9169
- type: mrr_at_5
value: 11.6341
- type: mrr_at_10
value: 12.2543
- task:
type: Retrieval
dataset:
name: MTEB MIRACLRetrievalHardNegatives (ar)
type: miracl/mmteb-miracl-hardnegatives
config: ar
split: dev
revision: 95c8db7d4a6e9c1d8a60601afd63d553ae20a2eb
metrics:
- type: main_score
value: 18.836
- type: map_at_1
value: 6.646
- type: map_at_3
value: 10.692
- type: map_at_5
value: 11.969
- type: map_at_10
value: 13.446
- type: ndcg_at_1
value: 10.5
- type: ndcg_at_3
value: 13.645
- type: ndcg_at_5
value: 15.504
- type: ndcg_at_10
value: 18.836
- type: recall_at_1
value: 6.646
- type: recall_at_3
value: 15.361
- type: recall_at_5
value: 19.925
- type: recall_at_10
value: 28.6
- type: precision_at_1
value: 10.5
- type: precision_at_3
value: 8.533
- type: precision_at_5
value: 6.9
- type: precision_at_10
value: 5.21
- type: mrr_at_1
value: 10.5
- type: mrr_at_3
value: 16.25
- type: mrr_at_5
value: 17.68
- type: mrr_at_10
value: 19.1759
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (ar)
type: mlqa/mmteb-mlqa
config: ar
split: validation
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 61.582
- type: map_at_1
value: 47.195
- type: map_at_3
value: 54.03
- type: map_at_5
value: 55.77
- type: map_at_10
value: 56.649
- type: ndcg_at_1
value: 47.195
- type: ndcg_at_3
value: 56.295
- type: ndcg_at_5
value: 59.417
- type: ndcg_at_10
value: 61.582
- type: recall_at_1
value: 47.195
- type: recall_at_3
value: 62.863
- type: recall_at_5
value: 70.406
- type: recall_at_10
value: 77.176
- type: precision_at_1
value: 47.195
- type: precision_at_3
value: 20.954
- type: precision_at_5
value: 14.081
- type: precision_at_10
value: 7.718
- type: mrr_at_1
value: 47.1954
- type: mrr_at_3
value: 54.0297
- type: mrr_at_5
value: 55.7705
- type: mrr_at_10
value: 56.6492
- task:
type: Retrieval
dataset:
name: MTEB SadeemQuestionRetrieval (ar)
type: sadeem/mmteb-sadeem
config: default
split: test
revision: 3cb0752b182e5d5d740df547748b06663c8e0bd9
metrics:
- type: main_score
value: 57.653
- type: map_at_1
value: 25.084
- type: map_at_3
value: 46.338
- type: map_at_5
value: 47.556
- type: map_at_10
value: 48.207
- type: ndcg_at_1
value: 25.084
- type: ndcg_at_3
value: 53.91
- type: ndcg_at_5
value: 56.102
- type: ndcg_at_10
value: 57.653
- type: recall_at_1
value: 25.084
- type: recall_at_3
value: 76.017
- type: recall_at_5
value: 81.331
- type: recall_at_10
value: 86.07
- type: precision_at_1
value: 25.084
- type: precision_at_3
value: 25.339
- type: precision_at_5
value: 16.266
- type: precision_at_10
value: 8.607
- type: mrr_at_1
value: 23.1211
- type: mrr_at_3
value: 44.9657
- type: mrr_at_5
value: 46.3037
- type: mrr_at_10
value: 46.8749
- task:
type: STS
dataset:
name: MTEB BIOSSES (default)
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cosine_pearson
value: 76.46793440999714
- type: cosine_spearman
value: 76.66439745271298
- type: euclidean_pearson
value: 76.52075972347127
- type: euclidean_spearman
value: 76.66439745271298
- type: main_score
value: 76.66439745271298
- type: manhattan_pearson
value: 76.68001857069733
- type: manhattan_spearman
value: 76.73066402288269
- task:
type: STS
dataset:
name: MTEB SICK-R (default)
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cosine_pearson
value: 79.67657890693198
- type: cosine_spearman
value: 77.03286420274621
- type: euclidean_pearson
value: 78.1960735272073
- type: euclidean_spearman
value: 77.032855497919
- type: main_score
value: 77.03286420274621
- type: manhattan_pearson
value: 78.25627275994229
- type: manhattan_spearman
value: 77.00430810589081
- task:
type: STS
dataset:
name: MTEB STS12 (default)
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cosine_pearson
value: 83.94288954523996
- type: cosine_spearman
value: 79.21432176112556
- type: euclidean_pearson
value: 81.21333251943913
- type: euclidean_spearman
value: 79.2152067330468
- type: main_score
value: 79.21432176112556
- type: manhattan_pearson
value: 81.16910737482634
- type: manhattan_spearman
value: 79.08756466301445
- task:
type: STS
dataset:
name: MTEB STS13 (default)
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cosine_pearson
value: 77.48393909963059
- type: cosine_spearman
value: 79.54963868861196
- type: euclidean_pearson
value: 79.28416002197451
- type: euclidean_spearman
value: 79.54963861790114
- type: main_score
value: 79.54963868861196
- type: manhattan_pearson
value: 79.18653917582513
- type: manhattan_spearman
value: 79.46713533414295
- task:
type: STS
dataset:
name: MTEB STS14 (default)
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cosine_pearson
value: 78.51596313692846
- type: cosine_spearman
value: 78.84601702652395
- type: euclidean_pearson
value: 78.55199809961427
- type: euclidean_spearman
value: 78.84603362286225
- type: main_score
value: 78.84601702652395
- type: manhattan_pearson
value: 78.52780170677605
- type: manhattan_spearman
value: 78.77744294039178
- task:
type: STS
dataset:
name: MTEB STS15 (default)
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cosine_pearson
value: 84.53393478889929
- type: cosine_spearman
value: 85.60821849381648
- type: euclidean_pearson
value: 85.32813923250558
- type: euclidean_spearman
value: 85.6081835456016
- type: main_score
value: 85.60821849381648
- type: manhattan_pearson
value: 85.32782097916476
- type: manhattan_spearman
value: 85.58098670898562
- task:
type: STS
dataset:
name: MTEB STS16 (default)
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cosine_pearson
value: 77.00196998325856
- type: cosine_spearman
value: 79.930951699069
- type: euclidean_pearson
value: 79.43196738390897
- type: euclidean_spearman
value: 79.93095112410258
- type: main_score
value: 79.930951699069
- type: manhattan_pearson
value: 79.33744358111427
- type: manhattan_spearman
value: 79.82939266539601
- task:
type: STS
dataset:
name: MTEB STS17 (ar-ar)
type: mteb/sts17-crosslingual-sts
config: ar-ar
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 81.60289529424327
- type: cosine_spearman
value: 82.46806381979653
- type: euclidean_pearson
value: 81.32235058296072
- type: euclidean_spearman
value: 82.46676890643914
- type: main_score
value: 82.46806381979653
- type: manhattan_pearson
value: 81.43885277175312
- type: manhattan_spearman
value: 82.38955952718666
- task:
type: STS
dataset:
name: MTEB STS22 (ar)
type: mteb/sts22-crosslingual-sts
config: ar
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 49.58293768761314
- type: cosine_spearman
value: 57.261888789832874
- type: euclidean_pearson
value: 53.36549109538782
- type: euclidean_spearman
value: 57.261888789832874
- type: main_score
value: 57.261888789832874
- type: manhattan_pearson
value: 53.06640323833928
- type: manhattan_spearman
value: 57.05837935512948
- task:
type: STS
dataset:
name: MTEB STSBenchmark (default)
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cosine_pearson
value: 81.43997935928729
- type: cosine_spearman
value: 82.04996129795596
- type: euclidean_pearson
value: 82.01917866996972
- type: euclidean_spearman
value: 82.04996129795596
- type: main_score
value: 82.04996129795596
- type: manhattan_pearson
value: 82.03487112040936
- type: manhattan_spearman
value: 82.03774605775651
- task:
type: Summarization
dataset:
name: MTEB SummEval (default)
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cosine_pearson
value: 32.113475997147674
- type: cosine_spearman
value: 32.17194233764879
- type: dot_pearson
value: 32.113469728827255
- type: dot_spearman
value: 32.174771315355386
- type: main_score
value: 32.17194233764879
- type: pearson
value: 32.113475997147674
- type: spearman
value: 32.17194233764879
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test 768
type: sts-test-768
metrics:
- type: pearson_cosine
value: 0.7269177710249681
name: Pearson Cosine
- type: spearman_cosine
value: 0.7225258779395222
name: Spearman Cosine
- type: pearson_manhattan
value: 0.7259261785622463
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.7210463582530393
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7259567884235211
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.722525823788783
name: Spearman Euclidean
- type: pearson_dot
value: 0.7269177712136122
name: Pearson Dot
- type: spearman_dot
value: 0.7225258771129475
name: Spearman Dot
- type: pearson_max
value: 0.7269177712136122
name: Pearson Max
- type: spearman_max
value: 0.7225258779395222
name: Spearman Max
- type: pearson_cosine
value: 0.8143867576376295
name: Pearson Cosine
- type: spearman_cosine
value: 0.8205044914629483
name: Spearman Cosine
- type: pearson_manhattan
value: 0.8203365887013151
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8203816698535976
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8201809453496319
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.8205044914629483
name: Spearman Euclidean
- type: pearson_dot
value: 0.8143867541070537
name: Pearson Dot
- type: spearman_dot
value: 0.8205044914629483
name: Spearman Dot
- type: pearson_max
value: 0.8203365887013151
name: Pearson Max
- type: spearman_max
value: 0.8205044914629483
name: Spearman Max
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test 512
type: sts-test-512
metrics:
- type: pearson_cosine
value: 0.7268389724271859
name: Pearson Cosine
- type: spearman_cosine
value: 0.7224359411000278
name: Spearman Cosine
- type: pearson_manhattan
value: 0.7241418669615103
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.7195408311833029
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7248184919191593
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.7212936866178097
name: Spearman Euclidean
- type: pearson_dot
value: 0.7252522928016701
name: Pearson Dot
- type: spearman_dot
value: 0.7205040482865328
name: Spearman Dot
- type: pearson_max
value: 0.7268389724271859
name: Pearson Max
- type: spearman_max
value: 0.7224359411000278
name: Spearman Max
- type: pearson_cosine
value: 0.8143448965624136
name: Pearson Cosine
- type: spearman_cosine
value: 0.8211700903453509
name: Spearman Cosine
- type: pearson_manhattan
value: 0.8217448619823571
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8216016599665544
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8216413349390971
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.82188122418776
name: Spearman Euclidean
- type: pearson_dot
value: 0.8097020064483653
name: Pearson Dot
- type: spearman_dot
value: 0.8147306090545295
name: Spearman Dot
- type: pearson_max
value: 0.8217448619823571
name: Pearson Max
- type: spearman_max
value: 0.82188122418776
name: Spearman Max
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test 256
type: sts-test-256
metrics:
- type: pearson_cosine
value: 0.7283468617741852
name: Pearson Cosine
- type: spearman_cosine
value: 0.7264294106954872
name: Spearman Cosine
- type: pearson_manhattan
value: 0.7227711798003426
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.718067982079232
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7251492361775083
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.7215068115809131
name: Spearman Euclidean
- type: pearson_dot
value: 0.7243396991648858
name: Pearson Dot
- type: spearman_dot
value: 0.7221390873398206
name: Spearman Dot
- type: pearson_max
value: 0.7283468617741852
name: Pearson Max
- type: spearman_max
value: 0.7264294106954872
name: Spearman Max
- type: pearson_cosine
value: 0.8075613785257986
name: Pearson Cosine
- type: spearman_cosine
value: 0.8159258089804861
name: Spearman Cosine
- type: pearson_manhattan
value: 0.8208711370091426
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8196747601014518
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8210210137439432
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.8203004500356083
name: Spearman Euclidean
- type: pearson_dot
value: 0.7870611647231145
name: Pearson Dot
- type: spearman_dot
value: 0.7874848213991118
name: Spearman Dot
- type: pearson_max
value: 0.8210210137439432
name: Pearson Max
- type: spearman_max
value: 0.8203004500356083
name: Spearman Max
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test 128
type: sts-test-128
metrics:
- type: pearson_cosine
value: 0.7102082520621849
name: Pearson Cosine
- type: spearman_cosine
value: 0.7103917869311991
name: Spearman Cosine
- type: pearson_manhattan
value: 0.7134729607181519
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.708895102058259
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7171545288118942
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.7130380237150746
name: Spearman Euclidean
- type: pearson_dot
value: 0.6777774738547628
name: Pearson Dot
- type: spearman_dot
value: 0.6746474823963989
name: Spearman Dot
- type: pearson_max
value: 0.7171545288118942
name: Pearson Max
- type: spearman_max
value: 0.7130380237150746
name: Spearman Max
- type: pearson_cosine
value: 0.8024378358145556
name: Pearson Cosine
- type: spearman_cosine
value: 0.8117561815472325
name: Spearman Cosine
- type: pearson_manhattan
value: 0.818920309459774
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8180515365910205
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8198346073356603
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.8185162896024369
name: Spearman Euclidean
- type: pearson_dot
value: 0.7513270537478935
name: Pearson Dot
- type: spearman_dot
value: 0.7427542871546953
name: Spearman Dot
- type: pearson_max
value: 0.8198346073356603
name: Pearson Max
- type: spearman_max
value: 0.8185162896024369
name: Spearman Max
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test 64
type: sts-test-64
metrics:
- type: pearson_cosine
value: 0.6930745722517785
name: Pearson Cosine
- type: spearman_cosine
value: 0.6982194042238953
name: Spearman Cosine
- type: pearson_manhattan
value: 0.6971382079778946
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.6942362764367931
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7012627015062325
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.6986972295835788
name: Spearman Euclidean
- type: pearson_dot
value: 0.6376735798940838
name: Pearson Dot
- type: spearman_dot
value: 0.6344835722310429
name: Spearman Dot
- type: pearson_max
value: 0.7012627015062325
name: Pearson Max
- type: spearman_max
value: 0.6986972295835788
name: Spearman Max
- type: pearson_cosine
value: 0.7855080652087961
name: Pearson Cosine
- type: spearman_cosine
value: 0.7948979371698327
name: Spearman Cosine
- type: pearson_manhattan
value: 0.8060407473462375
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8041199691999044
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8088262858195556
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.8060483394849104
name: Spearman Euclidean
- type: pearson_dot
value: 0.677754045289596
name: Pearson Dot
- type: spearman_dot
value: 0.6616232873061395
name: Spearman Dot
- type: pearson_max
value: 0.8088262858195556
name: Pearson Max
- type: spearman_max
value: 0.8060483394849104
name: Spearman Max
---
# SentenceTransformer based on sentence-transformers/LaBSE
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE) on the Omartificial-Intelligence-Space/arabic-n_li-triplet dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE) <!-- at revision e34fab64a3011d2176c99545a93d5cbddc9a91b7 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- Omartificial-Intelligence-Space/arabic-n_li-triplet
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Dense({'in_features': 768, 'out_features': 768, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
(3): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Omartificial-Intelligence-Space/Arabic-labse-Matryoshka")
# Run inference
sentences = [
'يجلس شاب ذو شعر أشقر على الحائط يقرأ جريدة بينما تمر امرأة وفتاة شابة.',
'ذكر شاب ينظر إلى جريدة بينما تمر إمرأتان بجانبه',
'الشاب نائم بينما الأم تقود ابنتها إلى الحديقة',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
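
Because this model was trained with Matryoshka embeddings, you can also truncate the output to a smaller dimensionality for cheaper storage and faster search. A minimal sketch, assuming a recent `sentence-transformers` release that supports the `truncate_dim` argument:

```python
from sentence_transformers import SentenceTransformer

# Load the same model but keep only the first 64 dimensions of each embedding
model_64 = SentenceTransformer(
    "Omartificial-Intelligence-Space/Arabic-labse-Matryoshka",
    truncate_dim=64,  # any of the evaluated sizes: 768 / 512 / 256 / 128 / 64
)
embeddings = model_64.encode(["رجل يقدم عرضاً", "هناك رجل بالخارج قرب الشاطئ"])
print(embeddings.shape)  # (2, 64)
```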
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-test-768`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7269 |
| **spearman_cosine** | **0.7225** |
| pearson_manhattan | 0.7259 |
| spearman_manhattan | 0.721 |
| pearson_euclidean | 0.726 |
| spearman_euclidean | 0.7225 |
| pearson_dot | 0.7269 |
| spearman_dot | 0.7225 |
| pearson_max | 0.7269 |
| spearman_max | 0.7225 |
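
The tables in this section were produced with the `EmbeddingSimilarityEvaluator` linked above. A minimal, illustrative sketch of such an evaluation (the sentence pairs and gold scores here are toy assumptions; a real run would use the STS test split):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("Omartificial-Intelligence-Space/Arabic-labse-Matryoshka")

# Toy pairs with gold similarity scores in [0, 1], just to show the call signature
sentences1 = ["رجل يقدم عرضاً", "رجل يجلس على أريكه"]
sentences2 = ["هناك رجل بالخارج قرب الشاطئ", "رجل يستلقي على وجهه على مقعد في الحديقة."]
gold_scores = [0.5, 0.4]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, gold_scores, name="sts-test-768")
print(evaluator(model))  # reports Pearson/Spearman for cosine and other distance measures
```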
#### Semantic Similarity
* Dataset: `sts-test-512`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7268 |
| **spearman_cosine** | **0.7224** |
| pearson_manhattan | 0.7241 |
| spearman_manhattan | 0.7195 |
| pearson_euclidean | 0.7248 |
| spearman_euclidean | 0.7213 |
| pearson_dot | 0.7253 |
| spearman_dot | 0.7205 |
| pearson_max | 0.7268 |
| spearman_max | 0.7224 |
#### Semantic Similarity
* Dataset: `sts-test-256`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7283 |
| **spearman_cosine** | **0.7264** |
| pearson_manhattan | 0.7228 |
| spearman_manhattan | 0.7181 |
| pearson_euclidean | 0.7251 |
| spearman_euclidean | 0.7215 |
| pearson_dot | 0.7243 |
| spearman_dot | 0.7221 |
| pearson_max | 0.7283 |
| spearman_max | 0.7264 |
#### Semantic Similarity
* Dataset: `sts-test-128`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7102 |
| **spearman_cosine** | **0.7104** |
| pearson_manhattan | 0.7135 |
| spearman_manhattan | 0.7089 |
| pearson_euclidean | 0.7172 |
| spearman_euclidean | 0.713 |
| pearson_dot | 0.6778 |
| spearman_dot | 0.6746 |
| pearson_max | 0.7172 |
| spearman_max | 0.713 |
#### Semantic Similarity
* Dataset: `sts-test-64`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.6931 |
| **spearman_cosine** | **0.6982** |
| pearson_manhattan | 0.6971 |
| spearman_manhattan | 0.6942 |
| pearson_euclidean | 0.7013 |
| spearman_euclidean | 0.6987 |
| pearson_dot | 0.6377 |
| spearman_dot | 0.6345 |
| pearson_max | 0.7013 |
| spearman_max | 0.6987 |
#### Semantic Similarity
* Dataset: `sts-test-768`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.8144 |
| **spearman_cosine** | **0.8205** |
| pearson_manhattan | 0.8203 |
| spearman_manhattan | 0.8204 |
| pearson_euclidean | 0.8202 |
| spearman_euclidean | 0.8205 |
| pearson_dot | 0.8144 |
| spearman_dot | 0.8205 |
| pearson_max | 0.8203 |
| spearman_max | 0.8205 |
#### Semantic Similarity
* Dataset: `sts-test-512`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.8143 |
| **spearman_cosine** | **0.8212** |
| pearson_manhattan | 0.8217 |
| spearman_manhattan | 0.8216 |
| pearson_euclidean | 0.8216 |
| spearman_euclidean | 0.8219 |
| pearson_dot | 0.8097 |
| spearman_dot | 0.8147 |
| pearson_max | 0.8217 |
| spearman_max | 0.8219 |
#### Semantic Similarity
* Dataset: `sts-test-256`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.8076 |
| **spearman_cosine** | **0.8159** |
| pearson_manhattan | 0.8209 |
| spearman_manhattan | 0.8197 |
| pearson_euclidean | 0.821 |
| spearman_euclidean | 0.8203 |
| pearson_dot | 0.7871 |
| spearman_dot | 0.7875 |
| pearson_max | 0.821 |
| spearman_max | 0.8203 |
#### Semantic Similarity
* Dataset: `sts-test-128`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.8024 |
| **spearman_cosine** | **0.8118** |
| pearson_manhattan | 0.8189 |
| spearman_manhattan | 0.8181 |
| pearson_euclidean | 0.8198 |
| spearman_euclidean | 0.8185 |
| pearson_dot | 0.7513 |
| spearman_dot | 0.7428 |
| pearson_max | 0.8198 |
| spearman_max | 0.8185 |
#### Semantic Similarity
* Dataset: `sts-test-64`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7855 |
| **spearman_cosine** | **0.7949** |
| pearson_manhattan | 0.806 |
| spearman_manhattan | 0.8041 |
| pearson_euclidean | 0.8088 |
| spearman_euclidean | 0.806 |
| pearson_dot | 0.6778 |
| spearman_dot | 0.6616 |
| pearson_max | 0.8088 |
| spearman_max | 0.806 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Omartificial-Intelligence-Space/arabic-n_li-triplet
* Dataset: Omartificial-Intelligence-Space/arabic-n_li-triplet
* Size: 557,850 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 9.99 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 12.44 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 13.82 tokens</li><li>max: 49 tokens</li></ul> |
* Samples:
| anchor | positive | negative |
|:------------------------------------------------------------|:--------------------------------------------|:------------------------------------|
| <code>شخص على حصان يقفز فوق طائرة معطلة</code> | <code>شخص في الهواء الطلق، على حصان.</code> | <code>شخص في مطعم، يطلب عجة.</code> |
| <code>أطفال يبتسمون و يلوحون للكاميرا</code> | <code>هناك أطفال حاضرون</code> | <code>الاطفال يتجهمون</code> |
| <code>صبي يقفز على لوح التزلج في منتصف الجسر الأحمر.</code> | <code>الفتى يقوم بخدعة التزلج</code> | <code>الصبي يتزلج على الرصيف</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
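
A minimal sketch of how this loss configuration could be constructed with the sentence-transformers API (the base model is loaded fresh here for illustration):

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/LaBSE")

# Wrap MultipleNegativesRankingLoss so it is applied at every Matryoshka dimension
inner_loss = losses.MultipleNegativesRankingLoss(model)
train_loss = losses.MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```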
### Evaluation Dataset
#### Omartificial-Intelligence-Space/arabic-n_li-triplet
* Dataset: Omartificial-Intelligence-Space/arabic-n_li-triplet
* Size: 6,584 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative |
|:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 19.71 tokens</li><li>max: 100 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.37 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.49 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
| anchor | positive | negative |
|:-----------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------|:---------------------------------------------------|
| <code>امرأتان يتعانقان بينما يحملان حزمة</code> | <code>إمرأتان يحملان حزمة</code> | <code>الرجال يتشاجرون خارج مطعم</code> |
| <code>طفلين صغيرين يرتديان قميصاً أزرق، أحدهما يرتدي الرقم 9 والآخر يرتدي الرقم 2 يقفان على خطوات خشبية في الحمام ويغسلان أيديهما في المغسلة.</code> | <code>طفلين يرتديان قميصاً مرقماً يغسلون أيديهم</code> | <code>طفلين يرتديان سترة يذهبان إلى المدرسة</code> |
| <code>رجل يبيع الدونات لعميل خلال معرض عالمي أقيم في مدينة أنجليس</code> | <code>رجل يبيع الدونات لعميل</code> | <code>امرأة تشرب قهوتها في مقهى صغير</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
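
A minimal sketch of expressing these non-default values with the sentence-transformers 3.x training arguments (the output directory is an assumption, and this is not the exact training script used):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="arabic-labse-matryoshka",  # assumption
    num_train_epochs=1,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```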
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | sts-test-128_spearman_cosine | sts-test-256_spearman_cosine | sts-test-512_spearman_cosine | sts-test-64_spearman_cosine | sts-test-768_spearman_cosine |
|:------:|:----:|:-------------:|:----------------------------:|:----------------------------:|:----------------------------:|:---------------------------:|:----------------------------:|
| None | 0 | - | 0.7104 | 0.7264 | 0.7224 | 0.6982 | 0.7225 |
| 0.0229 | 200 | 13.1738 | - | - | - | - | - |
| 0.0459 | 400 | 8.8127 | - | - | - | - | - |
| 0.0688 | 600 | 8.0984 | - | - | - | - | - |
| 0.0918 | 800 | 7.2984 | - | - | - | - | - |
| 0.1147 | 1000 | 7.5749 | - | - | - | - | - |
| 0.1377 | 1200 | 7.1292 | - | - | - | - | - |
| 0.1606 | 1400 | 6.6146 | - | - | - | - | - |
| 0.1835 | 1600 | 6.6523 | - | - | - | - | - |
| 0.2065 | 1800 | 6.1095 | - | - | - | - | - |
| 0.2294 | 2000 | 6.0841 | - | - | - | - | - |
| 0.2524 | 2200 | 6.3024 | - | - | - | - | - |
| 0.2753 | 2400 | 6.1941 | - | - | - | - | - |
| 0.2983 | 2600 | 6.1686 | - | - | - | - | - |
| 0.3212 | 2800 | 5.8317 | - | - | - | - | - |
| 0.3442 | 3000 | 6.0597 | - | - | - | - | - |
| 0.3671 | 3200 | 5.7832 | - | - | - | - | - |
| 0.3900 | 3400 | 5.7088 | - | - | - | - | - |
| 0.4130 | 3600 | 5.6988 | - | - | - | - | - |
| 0.4359 | 3800 | 5.5268 | - | - | - | - | - |
| 0.4589 | 4000 | 5.5543 | - | - | - | - | - |
| 0.4818 | 4200 | 5.3152 | - | - | - | - | - |
| 0.5048 | 4400 | 5.2894 | - | - | - | - | - |
| 0.5277 | 4600 | 5.1805 | - | - | - | - | - |
| 0.5506 | 4800 | 5.4559 | - | - | - | - | - |
| 0.5736 | 5000 | 5.3836 | - | - | - | - | - |
| 0.5965 | 5200 | 5.2626 | - | - | - | - | - |
| 0.6195 | 5400 | 5.2511 | - | - | - | - | - |
| 0.6424 | 5600 | 5.3308 | - | - | - | - | - |
| 0.6654 | 5800 | 5.2264 | - | - | - | - | - |
| 0.6883 | 6000 | 5.2881 | - | - | - | - | - |
| 0.7113 | 6200 | 5.1349 | - | - | - | - | - |
| 0.7342 | 6400 | 5.0872 | - | - | - | - | - |
| 0.7571 | 6600 | 4.5515 | - | - | - | - | - |
| 0.7801 | 6800 | 3.4312 | - | - | - | - | - |
| 0.8030 | 7000 | 3.1008 | - | - | - | - | - |
| 0.8260 | 7200 | 2.9582 | - | - | - | - | - |
| 0.8489 | 7400 | 2.8153 | - | - | - | - | - |
| 0.8719 | 7600 | 2.7214 | - | - | - | - | - |
| 0.8948 | 7800 | 2.5392 | - | - | - | - | - |
| 0.9177 | 8000 | 2.584 | - | - | - | - | - |
| 0.9407 | 8200 | 2.5384 | - | - | - | - | - |
| 0.9636 | 8400 | 2.4937 | - | - | - | - | - |
| 0.9866 | 8600 | 2.4155 | - | - | - | - | - |
| 1.0 | 8717 | - | 0.8118 | 0.8159 | 0.8212 | 0.7949 | 0.8205 |
### Framework Versions
- Python: 3.9.18
- Sentence Transformers: 3.0.1
- Transformers: 4.40.0
- PyTorch: 2.2.2+cu121
- Accelerate: 0.26.1
- Datasets: 2.19.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## <span style="color:blue">Acknowledgments</span>
The author would like to thank Prince Sultan University for their invaluable support in this project. Their contributions and resources have been instrumental in the development and fine-tuning of these models.
## Citation
If you use the Arabic Matryoshka Embeddings Model, please cite it as follows:
```bibtex
@misc{nacar2024enhancingsemanticsimilarityunderstanding,
      title={Enhancing Semantic Similarity Understanding in Arabic NLP with Nested Embedding Learning},
      author={Omer Nacar and Anis Koubaa},
      year={2024},
      eprint={2407.21139},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2407.21139},
}
``` | [
"TEXT_CLASSIFICATION",
"SEMANTIC_SIMILARITY",
"SUMMARIZATION"
] | [
"BIOSSES"
] |
pszemraj/long-t5-tglobal-base-sci-simplify-elife | pszemraj | summarization | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"longt5",
"text2text-generation",
"lay summaries",
"paper summaries",
"biology",
"medical",
"summarization",
"en",
"dataset:pszemraj/scientific_lay_summarisation-elife-norm",
"base_model:google/long-t5-tglobal-base",
"base_model:quantized:google/long-t5-tglobal-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-04-08T23:35:56 | 2023-11-28T19:20:35 | 932 | 5 | ---
base_model: google/long-t5-tglobal-base
datasets:
- pszemraj/scientific_lay_summarisation-elife-norm
language:
- en
library_name: transformers
license: apache-2.0
pipeline_tag: summarization
tags:
- lay summaries
- paper summaries
- biology
- medical
widget:
- text: large earthquakes along a given fault segment do not occur at random intervals
because it takes time to accumulate the strain energy for the rupture. The rates
at which tectonic plates move and accumulate strain at their boundaries are approximately
uniform. Therefore, in first approximation, one may expect that large ruptures
of the same fault segment will occur at approximately constant time intervals.
If subsequent main shocks have different amounts of slip across the fault, then
the recurrence time may vary, and the basic idea of periodic mainshocks must be
modified. For great plate boundary ruptures the length and slip often vary by
a factor of 2. Along the southern segment of the San Andreas fault the recurrence
interval is 145 years with variations of several decades. The smaller the standard
deviation of the average recurrence interval, the more specific could be the long
term prediction of a future mainshock.
example_title: earthquakes
- text: ' A typical feed-forward neural field algorithm. Spatiotemporal coordinates
are fed into a neural network that predicts values in the reconstructed domain.
Then, this domain is mapped to the sensor domain where sensor measurements are
available as supervision. Class and Section Problems Addressed Generalization
(Section 2) Inverse problems, ill-posed problems, editability; symmetries. Hybrid
Representations (Section 3) Computation & memory efficiency, representation capacity,
editability: Forward Maps (Section 4) Inverse problems Network Architecture (Section
5) Spectral bias, integration & derivatives. Manipulating Neural Fields (Section
6) Edit ability, constraints, regularization. Table 2: The five classes of techniques
in the neural field toolbox each addresses problems that arise in learning, inference,
and control. (Section 3). We can supervise reconstruction via differentiable forward
maps that transform Or project our domain (e.g, 3D reconstruction via 2D images;
Section 4) With appropriate network architecture choices, we can overcome neural
network spectral biases (blurriness) and efficiently compute derivatives and integrals
(Section 5). Finally, we can manipulate neural fields to add constraints and regularizations,
and to achieve editable representations (Section 6). Collectively, these classes
constitute a ''toolbox'' of techniques to help solve problems with neural fields
There are three components in a conditional neural field: (1) An encoder or inference
function € that outputs the conditioning latent variable 2 given an observation
0 E(0) =2. 2 is typically a low-dimensional vector, and is often referred to aS
a latent code Or feature code_ (2) A mapping function 4 between Z and neural field
parameters O: Y(z) = O; (3) The neural field itself $. The encoder € finds the
most probable z given the observations O: argmaxz P(2/0). The decoder maximizes
the inverse conditional probability to find the most probable 0 given Z: arg-
max P(Olz). We discuss different encoding schemes with different optimality guarantees
(Section 2.1.1), both global and local conditioning (Section 2.1.2), and different
mapping functions Y (Section 2.1.3) 2. Generalization Suppose we wish to estimate
a plausible 3D surface shape given a partial or noisy point cloud. We need a suitable
prior over the sur- face in its reconstruction domain to generalize to the partial
observations. A neural network expresses a prior via the function space of its
architecture and parameters 0, and generalization is influenced by the inductive
bias of this function space (Section 5).'
example_title: scientific paper
- text: 'Is a else or outside the cob and tree written being of early client rope
and you have is for good reasons. On to the ocean in Orange for time. By''s the
aggregate we can bed it yet. Why this please pick up on a sort is do and also
M Getoi''s nerocos and do rain become you to let so is his brother is made in
use and Mjulia''s''s the lay major is aging Masastup coin present sea only of
Oosii rooms set to you We do er do we easy this private oliiishs lonthen might
be okay. Good afternoon everybody. Welcome to this lecture of Computational Statistics.
As you can see, I''m not socially my name is Michael Zelinger. I''m one of the
task for this class and you might have already seen me in the first lecture where
I made a quick appearance. I''m also going to give the tortillas in the last third
of this course. So to give you a little bit about me, I''m a old student here
with better Bulman and my research centres on casual inference applied to biomedical
disasters, so that could be genomics or that could be hospital data. If any of
you is interested in writing a bachelor thesis, a semester paper may be mastathesis
about this topic feel for reach out to me. you have my name on models and my email
address you can find in the directory I''d Be very happy to talk about it. you
do not need to be sure about it, we can just have a chat. So with that said, let''s
get on with the lecture. There''s an exciting topic today I''m going to start
by sharing some slides with you and later on during the lecture we''ll move to
the paper. So bear with me for a few seconds. Well, the projector is starting
up. Okay, so let''s get started. Today''s topic is a very important one. It''s
about a technique which really forms one of the fundamentals of data science,
machine learning, and any sort of modern statistics. It''s called cross validation.
I know you really want to understand this topic I Want you to understand this
and frankly, nobody''s gonna leave Professor Mineshousen''s class without understanding
cross validation. So to set the stage for this, I Want to introduce you to the
validation problem in computational statistics. So the problem is the following:
You trained a model on available data. You fitted your model, but you know the
training data you got could always have been different and some data from the
environment. Maybe it''s a random process. You do not really know what it is,
but you know that somebody else who gets a different batch of data from the same
environment they would get slightly different training data and you do not care
that your method performs as well. On this training data. you want to to perform
well on other data that you have not seen other data from the same environment.
So in other words, the validation problem is you want to quantify the performance
of your model on data that you have not seen. So how is this even possible? How
could you possibly measure the performance on data that you do not know The solution
to? This is the following realization is that given that you have a bunch of data,
you were in charge. You get to control how much that your model sees. It works
in the following way: You can hide data firms model. Let''s say you have a training
data set which is a bunch of doubtless so X eyes are the features those are typically
hide and national vector. It''s got more than one dimension for sure. And the
why why eyes. Those are the labels for supervised learning. As you''ve seen before,
it''s the same set up as we have in regression. And so you have this training
data and now you choose that you only use some of those data to fit your model.
You''re not going to use everything, you only use some of it the other part you
hide from your model. And then you can use this hidden data to do validation from
the point of you of your model. This hidden data is complete by unseen. In other
words, we solve our problem of validation.'
example_title: transcribed audio - lecture
- text: 'Transformer-based models have shown to be very useful for many NLP tasks.
However, a major limitation of transformers-based models is its O(n^2)O(n 2) time
& memory complexity (where nn is sequence length). Hence, it''s computationally
very expensive to apply transformer-based models on long sequences n > 512n>512.
Several recent papers, e.g. Longformer, Performer, Reformer, Clustered attention
try to remedy this problem by approximating the full attention matrix. You can
checkout 🤗''s recent blog post in case you are unfamiliar with these models.
BigBird (introduced in paper) is one of such recent models to address this issue.
BigBird relies on block sparse attention instead of normal attention (i.e. BERT''s
attention) and can handle sequences up to a length of 4096 at a much lower computational
cost compared to BERT. It has achieved SOTA on various tasks involving very long
sequences such as long documents summarization, question-answering with long contexts.
BigBird RoBERTa-like model is now available in 🤗Transformers. The goal of this
post is to give the reader an in-depth understanding of big bird implementation
& ease one''s life in using BigBird with 🤗Transformers. But, before going into
more depth, it is important to remember that the BigBird''s attention is an approximation
of BERT''s full attention and therefore does not strive to be better than BERT''s
full attention, but rather to be more efficient. It simply allows to apply transformer-based
models to much longer sequences since BERT''s quadratic memory requirement quickly
becomes unbearable. Simply put, if we would have ∞ compute & ∞ time, BERT''s attention
would be preferred over block sparse attention (which we are going to discuss
in this post).
If you wonder why we need more compute when working with longer sequences, this
blog post is just right for you!
Some of the main questions one might have when working with standard BERT-like
attention include:
Do all tokens really have to attend to all other tokens? Why not compute attention
only over important tokens? How to decide what tokens are important? How to attend
to just a few tokens in a very efficient way? In this blog post, we will try to
answer those questions.
What tokens should be attended to? We will give a practical example of how attention
works by considering the sentence ''BigBird is now available in HuggingFace for
extractive question answering''. In BERT-like attention, every word would simply
attend to all other tokens.
Let''s think about a sensible choice of key tokens that a queried token actually
only should attend to by writing some pseudo-code. Will will assume that the token
available is queried and build a sensible list of key tokens to attend to.
>>> # let''s consider following sentence as an example >>> example = [''BigBird'',
''is'', ''now'', ''available'', ''in'', ''HuggingFace'', ''for'', ''extractive'',
''question'', ''answering'']
>>> # further let''s assume, we''re trying to understand the representation of
''available'' i.e. >>> query_token = ''available'' >>> # We will initialize an
empty `set` and fill up the tokens of our interest as we proceed in this section.
>>> key_tokens = [] # => currently ''available'' token doesn''t have anything
to attend Nearby tokens should be important because, in a sentence (sequence of
words), the current word is highly dependent on neighboring past & future tokens.
This intuition is the idea behind the concept of sliding attention.'
example_title: bigbird blog intro
- text: 'To be fair, you have to have a very high IQ to understand Rick and Morty.
The humour is extremely subtle, and without a solid grasp of theoretical physics
most of the jokes will go over a typical viewer''s head. There''s also Rick''s
nihilistic outlook, which is deftly woven into his characterisation- his personal
philosophy draws heavily from Narodnaya Volya literature, for instance. The fans
understand this stuff; they have the intellectual capacity to truly appreciate
the depths of these jokes, to realise that they''re not just funny- they say something
deep about LIFE. As a consequence people who dislike Rick & Morty truly ARE idiots-
of course they wouldn''t appreciate, for instance, the humour in Rick''s existential
catchphrase ''Wubba Lubba Dub Dub,'' which itself is a cryptic reference to Turgenev''s
Russian epic Fathers and Sons. I''m smirking right now just imagining one of those
addlepated simpletons scratching their heads in confusion as Dan Harmon''s genius
wit unfolds itself on their television screens. What fools.. how I pity them.
😂
And yes, by the way, i DO have a Rick & Morty tattoo. And no, you cannot see it.
It''s for the ladies'' eyes only- and even then they have to demonstrate that
they''re within 5 IQ points of my own (preferably lower) beforehand. Nothin personnel
kid 😎'
example_title: Richard & Mortimer
- text: The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey
building, and the tallest structure in Paris. Its base is square, measuring 125
metres (410 ft) on each side. During its construction, the Eiffel Tower surpassed
the Washington Monument to become the tallest man-made structure in the world,
a title it held for 41 years until the Chrysler Building in New York City was
finished in 1930. It was the first structure to reach a height of 300 metres.
Due to the addition of a broadcasting aerial at the top of the tower in 1957,
it is now taller than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters,
the Eiffel Tower is the second tallest free-standing structure in France after
the Millau Viaduct.
example_title: eiffel
parameters:
max_length: 64
min_length: 8
no_repeat_ngram_size: 3
early_stopping: true
repetition_penalty: 3.5
encoder_no_repeat_ngram_size: 4
length_penalty: 0.4
num_beams: 4
---
# long-t5-tglobal-base-sci-simplify: elife subset
<a href="https://colab.research.google.com/gist/pszemraj/37a406059887a400afc1428d70374327/long-t5-tglobal-base-sci-simplify-elife-example-with-textsum.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
Exploring how well long-document models trained on "lay summaries" of scientific papers generalize.
> A lay summary is a summary of a research paper or scientific study that is written in plain language, without the use of technical jargon, and is designed to be easily understood by non-experts.
## Model description
This model is a fine-tuned version of [google/long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) on the `pszemraj/scientific_lay_summarisation-elife-norm` dataset.
- The variant trained on the PLOS subset can be found [here](https://huggingface.co/pszemraj/long-t5-tglobal-base-sci-simplify)
## Usage
It's recommended to use this model with [beam search decoding](https://huggingface.co/docs/transformers/generation_strategies#beamsearch-decoding). If interested, you can also use the `textsum` util repo to have most of this abstracted out for you:
```bash
pip install -U textsum
```
```python
from textsum.summarize import Summarizer
model_name = "pszemraj/long-t5-tglobal-base-sci-simplify-elife"
summarizer = Summarizer(model_name) # GPU auto-detected
text = "put the text you don't want to read here"
summary = summarizer.summarize_string(text)
print(summary)
```
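If you prefer to call the model directly through `transformers`, a minimal sketch using the summarization pipeline with beam-search settings mirroring the widget defaults above might look like this (the parameter values are only a starting point, not tuned recommendations):
```python
from transformers import pipeline

# load the checkpoint with the standard summarization pipeline
summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-base-sci-simplify-elife",
)

long_text = "put the text you don't want to read here"

# beam-search generation settings taken from the widget config above;
# raise max_length if you want full-length lay summaries
result = summarizer(
    long_text,
    num_beams=4,
    max_length=64,
    min_length=8,
    no_repeat_ngram_size=3,
    encoder_no_repeat_ngram_size=4,
    repetition_penalty=3.5,
    length_penalty=0.4,
    early_stopping=True,
)
print(result[0]["summary_text"])
```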
## Intended uses & limitations
- The model's ability to generalize outside of the dataset domain (PubMed/bioscience-type papers) has not yet been evaluated.
## Training and evaluation data
The `elife` subset of the lay summaries dataset; refer to [`pszemraj/scientific_lay_summarisation-elife-norm`](https://huggingface.co/datasets/pszemraj/scientific_lay_summarisation-elife-norm) for details.
## Training procedure
### Eval results
It achieves the following results on the evaluation set:
- Loss: 1.9990
- Rouge1: 38.5587
- Rouge2: 9.7336
- Rougel: 21.1974
- Rougelsum: 35.9333
- Gen Len: 392.7095
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 3.0
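For reference, a minimal sketch of how the settings listed above could be expressed with `Seq2SeqTrainingArguments`; this illustrates the reported values only and is not the original training script (the `output_dir` is a placeholder):
```python
from transformers import Seq2SeqTrainingArguments

# illustrative mapping of the hyperparameters reported above; the original
# run may have used additional options (precision, logging, etc.) not shown
training_args = Seq2SeqTrainingArguments(
    output_dir="long-t5-tglobal-base-sci-simplify-elife",  # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=16,  # 4 x 16 = effective batch size of 64
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=3.0,
    seed=42,
)
```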
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:--------:|
| 2.2995 | 1.47 | 100 | 2.0175 | 35.2501 | 8.2121 | 20.4587 | 32.4494 | 439.7552 |
| 2.2171 | 2.94 | 200 | 1.9990 | 38.5587 | 9.7336 | 21.1974 | 35.9333 | 392.7095 |
| [
"QUESTION_ANSWERING",
"SUMMARIZATION"
] | [
"BEAR"
] |
phamhai/Llama-3.2-3B-Instruct-Frog | phamhai | text-generation | [
"safetensors",
"llama",
"RAG",
"Vietnamese",
"Generation",
"Function_Calling",
"Function Calling",
"FC",
"Summarization",
"Rewriting",
"Functions",
"VLLM",
"LLM",
"text-generation",
"conversational",
"en",
"vi",
"base_model:meta-llama/Llama-3.2-3B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-3B-Instruct",
"license:llama3.2",
"region:us"
] | 2024-10-22T09:45:48 | 2024-12-13T04:48:01 | 930 | 11 | ---
base_model:
- meta-llama/Llama-3.2-3B-Instruct
language:
- en
- vi
license: llama3.2
pipeline_tag: text-generation
tags:
- RAG
- Vietnamese
- Generation
- Function_Calling
- Function Calling
- FC
- Summarization
- Rewriting
- Functions
- VLLM
- LLM
---
<p align="center"> <img src="https://cdn-uploads.huggingface.co/production/uploads/6612cc790b91dd96968028f9/yP51EyRNg-CHCKB4gBYan.png" width="300" /> </p>
<h1>Llama-3.2-3B-Instruct-Frog - a RAG-optimized LLaMA3.2 for Vietnamese</h1>
**Quantized Version**: [phamhai/Llama-3.2-3B-Instruct-Frog-Q4_K_M-GGUF](https://huggingface.co/phamhai/Llama-3.2-3B-Instruct-Frog-Q4_K_M-GGUF)
At the end of September 2024, Meta released two lightweight LLM versions: [Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct) and [Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct). However, these models do not support Vietnamese well, especially for tasks related to Retrieval-Augmented Generation (RAG).
Today, I am excited to announce the release of two models specifically trained to provide better support for Vietnamese RAG tasks.
<h2>Model Details:</h2>
+ Base Models: Llama-3.2-1B-Instruct and Llama-3.2-3B-Instruct
+ Performance: The models are optimized for fast inference and can be easily deployed on on-premise and edge devices (laptop, smartphone, NVIDIA Jetson Xavier, Raspberry Pi, etc.).
+ Model weights:
+ [Llama-3.2-1B-Instruct-Frog](https://huggingface.co/phamhai/Llama-3.2-1B-Instruct-Frog): 131K context length, 1 billion parameters
+ [Llama-3.2-3B-Instruct-Frog](https://huggingface.co/phamhai/Llama-3.2-3B-Instruct-Frog): 131K context length, 3 billion parameters
<blockquote style="color:red"> <p><strong style="color: red">Terms of Use and License</strong>: By using our released weights, you agree to and comply with the terms and conditions specified in Meta's LLaMA-3 license.</p></blockquote>
<h2>Model Evaluation</h2>
We evaluated this model on the [VMLU benchmark](https://vmlu.ai/) and achieved an accuracy of **45.13**. However, this benchmark is not the focus of our current efforts. We believe it will be very difficult for language models with fewer than 13 billion parameters to retain enough knowledge to answer questions across diverse user contexts, especially for smaller models with under 3 billion parameters. For the model to effectively handle real-world business scenarios and avoid hallucinations, it is almost essential to supplement knowledge from external sources (through RAG). Therefore, we developed this model with a primary focus on optimizing its RAG capabilities. Internal testing is currently underway and will be updated soon.
***Update***:
Function Calling Benchmark: https://huggingface.co/datasets/phamhai/Vietnamese-Function-Calling-Test
| Model | Model size | Function name Acc (%) | Exact Match Acc (%) |
| ------------ | ------------------ | ---------- | --------- |
| [phamhai/Llama-3.2-3B-Instruct-Frog](https://huggingface.co/phamhai/Llama-3.2-3B-Instruct-Frog) | ~3B | 95.79 | 51.05 |
| [Gemini-1.5-Pro](https://deepmind.google/technologies/gemini/pro/) | --- | 96.96 | 55.16 |
| [Gemini-1.5-Flash](https://deepmind.google/technologies/gemini/flash/) | --- | 97.10 | 51.64 |
| [Gemini-1.5-Flash-8B](https://deepmind.google/technologies/gemini/flash/) | --- | 97.38 | 64.75 |
| [Gemini 2.0 Flash Experimental](https://deepmind.google/technologies/gemini/flash/) | --- | 96.93 | 61.26 |
| [gpt-4o-2024-08-06](https://platform.openai.com/docs/models#gpt-4o) | --- | 94.38 | 52.88 |
| [phamhai/Llama-3.2-3B-Instruct-Frog-Pro](https://huggingface.co/phamhai/Llama-3.2-3B-Instruct-Frog-Pro) | ~3B | 97.96 | 63.47 |
<p align="left"> Table 1. Vietnamese Function Calling Benchmark </p>
<h2> Run the model </h2>
(*Disclaimer: The bot is named Vivi because of my passion for VinFast vehicles, and because I hope to one day develop my own smaller models for VinFast's car lines (VinFast refers to its virtual assistant as Vivi). This model has no affiliation with VinFast or any related entities.*)
<h3> with Huggingface's transformers </h3>
<h4> 1. QnA task </h4>
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "phamhai/Llama-3.2-3B-Instruct-Frog"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)
messages = [
{"role": "system", "content": "Bạn là một người bạn gái xinh đẹp. Tên của bạn là Vivi. Hãy luôn xưng là Vivi, gọi người nói là anh và trả lời luôn bắt đầu bằng cụm từ Dạ thưa anh yêu của em."},
{"role": "user", "content": "xin chào em"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# Dạ thưa anh yêu của em, em rất vui được gặp anh.
messages = [
{"role": "system", "content": "Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.\nNếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch."},
{"role": "user", "content": "Làm sao để chữa bệnh đau đầu?"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=256)
print(tokenizer.decode(outputs[0]))
# Có nhiều nguyên nhân gây đau đầu, bao gồm căng thẳng, thiếu ngủ, mất nước, chứng đau đầu thường xuyên, đau đầu do chứng đau nửa đầu, và nhiều hơn nữa. Dưới đây là một số cách để giảm đau đầu:
# 1. Nghỉ ngơi: Nếu đau đầu là do căng thẳng hoặc thiếu ngủ, hãy nghỉ ngơi và ngủ đủ giấc.
# 2. Massage: Massage vùng cổ và vai có thể giúp giảm đau đầu.
# 3. Uống nước: Đảm bảo bạn uống đủ nước để giữ cho cơ thể luôn được cung cấp đủ nước.
# 4. Sử dụng thuốc giảm đau: Nếu đau đầu không giảm sau khi nghỉ ngơi và uống nước, bạn có thể sử dụng thuốc giảm đau như paracetamol hoặc ibuprofen.
# 5. Sử dụng băng lạnh: Nếu đau đầu do chứng đau nửa đầu, bạn có thể sử dụng băng lạnh để giảm đau.
# 6. Thay đổi chế độ ăn uống: Nếu đau đầu liên quan đến chế độ ăn uống của bạn, hãy thay đổi chế độ ăn uống để giảm đau đầu.
# Nếu đau đầu kéo dài hoặc trở nên nghiêm trọng hơn, bạn nên tìm kiếm sự giúp đỡ y tế để được chẩn đoán và điều trị đúng cách.
```
<h4> 2. Summarization task </h4>
<h5> Focused Answer </h5>
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.
Nếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch.
Context:
Đoạn 0: "Chính phủ đề xuất bổ sung gần 20.700 tỷ đồng vốn điều lệ cho Ngân hàng Ngoại thương Việt Nam (Vietcombank) từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Chiều 23/10, thừa ủy quyền Chính phủ, Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc trình Quốc hội về bổ sung vốn Nhà nước tại Ngân hàng Ngoại Thương Việt Nam (Vietcombank). Theo đó, Chính phủ đề nghị tăng vốn điều lệ cho ngân hàng này gần 20.700 tỷ đồng từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Số tiền này lấy từ nguồn lợi nhuận còn lại lũy kế đến hết năm 2018 và lãi còn lại năm 2021. Vốn điều lệ dự kiến rót thêm cho Vietcombank gần bằng lợi nhuận hợp nhất trước thuế nửa đầu năm nay của nhà băng này. Việc bổ sung vốn cho "ông lớn" ngân hàng quốc doanh được Phó thủ tướng nhấn mạnh là cấp thiết để duy trì tỷ lệ vốn góp Nhà nước, phù hợp chiến lược phát triển kinh tế xã hội, tạo nguồn lực hỗ trợ ngân hàng yếu kém. Phó thủ tướng cho biết, phần lợi nhuận còn lại lũy kế hết năm 2018 và lãi còn lại 2021 hiện được hạch toán theo dõi tại VCB, chưa nằm trong cân đối ngân sách Nhà nước. Do vậy, nguồn vốn đề xuất tăng cho ngân hàng này không ảnh hưởng tới kế hoạch dự toán thu chi ngân sách 2024-2025. Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Vốn điều lệ của Vietcombank hiện là 55.891 tỷ đồng, thấp hơn nhiều so với VPBank (79.339 tỷ đồng), Techcombank (70.450 tỷ đồng) và không có sự cách biệt lớn so với một số ngân hàng thương mại cổ phần như MB (52.871) tỷ đồng, ACB (44.667 tỷ đồng) và SHB (36.629 tỷ đồng). Ngoài ra, việc tăng vốn nhằm để ngân hàng này đáp ứng các tỷ lệ an toàn tối thiểu. Tính tới cuối 2023, tỷ lệ an toàn vốn (CAR) của ngân hàng này là 11,05%, đảm bảo quy định. Tuy nhiên, mức này thấp hơn các ngân hàng thương mại cổ phần (VPBank, MB là 12-13%; Techcombank 13-15%...) và các nhà băng trong khu vực (Singapore là 17,1%, Indonesia 23,27%...). Thẩm tra nội dung này, Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh cho rằng đề xuất tăng vốn cho Vietcombank bảo đảm cơ sở pháp lý và đúng thẩm quyền theo quy định. Tuy nhiên, Ủy ban Kinh tế đề nghị Chính phủ lấy ý kiến của cổ đông chiến lược nước ngoài Ngân hàng Mizuho Corporate Bank - đơn vị nắm 15% vốn điều lệ của Vietcombank. Việc này nhằm thuận lợi trong quá trình tăng vốn. Chính phủ cũng cần bổ sung thông tin hiện trạng vốn của Vietcombank so với các ngân hàng thương mại trong hệ thống hiện nay. "Có ý kiến đề nghị làm rõ nhận định nguồn vốn đề xuất để tăng vốn điều lệ không tác động đến ngân sách Nhà nước", ông Thanh cho biết. Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh đề nghị Chính phủ chỉ đạo Ngân hàng Nhà nước cùng các bộ, ngành liên quan xử lý phần lợi nhuận còn lại năm 2022, 2023 (lần lượt là 21.680 tỷ và 25.009 tỷ đồng), nhằm tăng năng lực tài chính cho Vietcombank, bù đắp mức thiếu hụt vốn tự có, bảo đảm an toàn hoạt động. Cơ quan thẩm tra lưu ý vốn được bổ sung cho Vietcombank cần được dùng để mở rộng kinh doanh, cung ứng tín dụng với các lĩnh vực, dự án quan trọng quốc gia quy mô lớn, giảm lãi suất cho vay, cũng như đổi mới mô hình quản trị, chất lượng dịch vụ của nhà băng này. 
"Chính phủ cần đánh giá kỹ tác động việc bổ sung vốn Nhà nước cho Vietcombank tới phát triển của ngành ngân hàng, hiệu quả kinh tế xã hội", Ủy ban Kinh tế lưu ý. Vietcombank là một trong 4 ngân hàng thương mại Nhà nước, bên cạnh BIDV, VietinBank và Agribank. Ngân hàng này do Nhà nước sở hữu 74,8% vốn điều lệ. Lũy kế nửa đầu năm nay, lợi nhuận hợp nhất trước thuế của nhà băng này đạt 20.835 tỷ đồng, tăng 1,6% so với cùng kỳ 2023. Với dữ liệu này, Vietcombank tiếp tục đứng đầu toàn hệ thống ngân hàng về lợi nhuận 6 tháng đầu năm. Đây cũng là mức lãi nửa đầu năm cao kỷ lục của nhà băng này. Tính đến 30/6, tổng tài sản của ngân hàng đạt hơn 1,9 triệu tỷ đồng, tăng 3,6% so với cuối 2023. Trong đó, cho vay khách hàng gần 1,37 triệu tỷ đồng, tăng 7,8%."
Đoạn 1: "Đã có vài đơn vị bán tín chỉ carbon cho khách ngoại nhưng còn thiếu cơ sở pháp lý để đảm bảo hoạt động được thuận lợi, theo chuyên gia. Thông tin tại phiên tọa đàm thuộc Diễn đàn và Triển lãm Kinh tế xanh 2024 (GEFE), ông Đỗ Ngọc Quỳnh, Tổng thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA), cho biết thị trường tín chỉ carbon tự nguyện Việt Nam đã có một số đơn vị bán được tín chỉ carbon cho nhà đầu tư, tập đoàn nước ngoài. "Họ đang mua chứng chỉ carbon và chứng chỉ năng lượng tái tạo (REC) trong tiêu chí RE100, tức 100% năng lượng tái tạo", ông cho biết. RE100 là sáng kiến toàn cầu dành cho các công ty cam kết sử dụng 100% điện năng tái tạo, phát động bởi Climate Group và CDP vào 2014. Từ trái sang, Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) nói tại tọa đàm. Ảnh: GEFE 2024 Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) chia sẻ tại tọa đàm. Ảnh: GEFE 2024 Thị trường carbon gồm hai hình thức là bắt buộc và tự nguyện. Đồ họa: Dỹ Tùng Phân biệt các loại thị trường carbon. Đồ họa: Dỹ Tùng Theo kế hoạch của chính phủ, thị trường bắt buộc sẽ vận hành thử nghiệm vào giai đoạn 2025-2028. Với thị trường tự nguyện, ông Quỳnh cho biết đã bắt đầu hình thành và cũng biến động theo diễn biến xu hướng chung toàn cầu. Chuyên gia VBMA cho rằng Việt Nam đã có chính sách chung để thực hiện cam kết Net Zero vào 2050, nhưng vẫn chưa có pháp lý đầy đủ và rõ ràng cho thị trường carbon tự nguyện. "Những người bán tại Việt Nam sau giao dịch không biết hạch toán vào đâu, nộp thuế thế nào. Một số chọn phương án tính vào thu nhập bất thường để khai thuế", ông ví dụ. Ông Nguyễn Thành Nghiệp, Luật sư thành viên công ty luật VTN và Cộng sự chỉ ra việc chưa có quy định xác định tính chất tài sản của tín chỉ carbon. "Chúng có được xem là tài sản bình thường, được thế chấp hay giao dịch thế nào chưa có đủ căn cứ pháp lý", ông nói. Ngoài ra, quy trình MRV (đo lường, báo cáo và kiểm chứng) cũng cần quy định, hướng dẫn rõ. Theo ông, ngoài các cơ quan quản lý, khu vực tư nhân cũng trông chờ xem liệu có thể tham gia hoạt động MRV không. "Trong thời gian tới, nếu hoàn thiện pháp lý, thị trường sẽ có nhiều tiềm năng phát triển hơn", ông Đỗ Ngọc Quỳnh dự báo. Ngoài tín chỉ carbon, với tiềm năng điện tái tạo thứ tư thế giới theo McKenzie, ông cho rằng có thể khai thác việc vừa bán tín chỉ carbon vừa bán được REC. Theo VBMA, quy mô thị trường carbon bắt buộc toàn cầu đạt 104 tỷ USD năm ngoái, tăng 100% so với năm 2020. Trong khi, thị trường tự nguyện đã thu hẹp còn 800 triệu USD, giảm hai phần ba so với 2021 do một số vụ bê bối liên quan đến "giặt xanh" (green washing) làm ảnh hưởng đến uy tín, niềm tin. Theo dõi biến động của thị trường thế giới giúp các bên tham gia trong thị trường carbon tự nguyện còn sơ khai của Việt Nam rút kinh nghiệm và tìm ra hướng đi. Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS) văn phòng Hà Nội, dự báo người mua sẽ cần tìm kiếm các bên bán tín chỉ có hệ thống quản trị tốt và rõ ràng. Ông cho rằng người mua đang thiên về chuộng mua tín chỉ lĩnh vực giảm phát thải sản xuất vì dễ chứng minh. Một loại được quan tâm khác là "carbon xanh dương" (blue carbon) - tín chỉ tạo ra từ các dự án hấp thụ carbon của rừng ngập mặn, đầm lầy bãi triều và cỏ biển. 
Ông chỉ ra Việt Nam triển vọng với 200.000 ha rừng ngập mặn, có thể làm các dự án carbon tương tự như ở Honduras. Bà Thu Nguyễn, Quản lý chính sách tại Apanada Management Consultancy, Đại diện Viện Tài nguyên Thế giới (WRI) khuyến nghị các dự án tín chỉ carbon nâng cao giá trị bằng cách quan tâm đến tính bình đẳng và bao trùm. Theo đó, mục tiêu không chỉ là giảm phát thải mà còn là cải thiện đời sống người dân và phát triển bình đẳng hơn "Dự án cần bảo đảm có tham vấn của cộng đồng, đặc biệt là phụ nữ và các nhóm yếu thế, để tạo ra lợi ích cho cả cộng đồng lẫn nhà đầu tư", bà nói."
Đoạn 2: "Giá nhẫn trơn liên tục điều chỉnh, tăng gần một triệu đồng trong ngày và có nơi lên sát 89 triệu đồng một lượng. 15h ngày 23/10, giá mua bán nhẫn trơn được các thương hiệu kinh doanh điều chỉnh theo diễn biến đi lên của thế giới. Chiều nay, mỗi ounce vàng quốc tế tiếp tục thiết lập kỷ lục mới 2.755 USD. Giá nhẫn trơn tại Công ty Vàng bạc đá quý Sài Gòn (SJC) cũng tăng nửa triệu đồng so với đầu sáng và gần 1 triệu đồng so với cuối ngày hôm qua, lên 86,9 - 88,2 triệu đồng. Công ty Vàng bạc đá quý Phú Nhuận (PNJ) và Mi Hồng niêm yết giá nhẫn trơn quanh vùng 87,4 - 88,4 triệu đồng. Còn tại Tập đoàn Vàng bạc đá quý DOJI, giá mua bán nhẫn trơn cùng thời điểm thậm chí lên 88 - 88,9 triệu đồng một lượng. Trước đó đầu ngày, Công ty Vàng bạc đá quý Sài Gòn (SJC) đã tăng 300.000 đồng một lượng so với cuối ngày hôm qua, niêm yết giá nhẫn trơn tại 86,3 - 87,6 triệu đồng. Biểu giá mua bán nhẫn trơn tại Tập đoàn Vàng bạc đá quý DOJI lúc 9h sáng là 87 - 88 triệu đồng, tăng 200.000 đồng so với cuối ngày hôm qua. Nhẫn trơn giữ nhịp tăng liên tục trong 10 ngày qua. So với giữa tháng, mỗi lượng nhẫn trơn đã tăng hơn 5 triệu đồng. Còn so với đầu năm, nhẫn trơn tăng gần 25 triệu một lượng, tương đương hiệu suất 39%. Trong khi giá vàng miếng SJC đứng yên ở vùng 87 - 89 triệu một lượng, do Ngân hàng Nhà nước chưa thay đổi giá bán can thiệp. Thời điểm này là mùa cưới cuối năm và nhu cầu mua vàng nhẫn làm quà cưới tăng, song người dân không dễ để mua được mặt hàng này tại các thương hiệu lớn. Các thương hiệu lớn như DOJI, PNJ, Bảo Tín Minh Châu thường xuyên trong tình trạng cháy hàng. Khách lẻ chỉ may mắn mua được số lượng ít nếu cửa hàng vừa có khách bán ra. Còn tại SJC, các chi nhánh giới hạn lượng mua tối đa 5 phân đến 1 chỉ mỗi người. Trên thị trường quốc tế, mỗi ounce vàng trong 5 ngày qua tăng mạnh hơn 100 USD. Kim loại quý có thời điểm lên mức kỷ lục gần 2.750 USD, trước khi lùi về vùng 2.738 USD vào sáng nay. Quy đổi theo tỷ giá bán Vietcombank, giá vàng trong nước chênh lệch 3,5-5 triệu đồng một lượng so với thế giới. Theo dự báo của các nhà băng hàng đầu thế giới, giá vàng thế giới có thể lên 3.000 USD một ounce vào năm sau. Các chuyên gia khuyến nghị nhà đầu tư phân bổ tỷ trọng nhỏ danh mục vào kênh trú ẩn này, đặc biệt trong bối cảnh kim loại quý đã tăng mạnh thời gian qua."
Đoạn 3: "Nhu cầu trú ẩn khi căng thẳng địa chính trị leo thang kéo giá vàng lên mức đỉnh mới, tại 2.748 USD một ounce. Chốt phiên giao dịch 22/10, giá vàng thế giới giao ngay tăng gần 30 USD lên 2.748 USD một ounce. Đây là mức cao kỷ lục mới của kim loại quý. "Căng thẳng địa chính trị vẫn là nguyên nhân chủ yếu. Hai tuần nữa sẽ diễn ra bầu cử Tổng thống Mỹ và cuộc đua vẫn rất sát sao. Bất ổn chính trị đang kéo nhu cầu trú ẩn lên cao", Peter A. Grant - Phó giám đốc Zaner Metals nhận định trên Reuters. Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Cuộc thăm dò mới nhất của Reuters/Ipsos cho thấy tỷ lệ ủng hộ Phó tổng thống Kamala Harris hiện là 46%, nhỉnh hơn so với 43% của cựu Tổng thống Donald Trump. "Sự sát sao này đang tạo nên tình trạng thiếu chắc chắn. Môi trường này có lợi cho vàng", các nhà phân tích tại ngân hàng BNP Paribas nhận định. Grant dự báo nếu căng thẳng tại Trung Đông tiếp tục tăng nhiệt, giá có thể lên 3.000 USD cuối năm nay. Từ đầu năm, giá đã tăng 33% và liên tiếp lập đỉnh mới. Một yếu tố khác đang hỗ trợ kim loại quý là làn sóng giảm lãi suất của các ngân hàng trung ương lớn trên toàn cầu. Mỹ, châu Âu, Trung Quốc cùng hàng loạt nền kinh tế khác đã giảm lãi suất năm nay để hỗ trợ nền kinh tế. Trong khi đó, tại Wall Street, các chỉ số chính gần như đứng yên. Nhà đầu tư hiện theo dõi lợi suất trái phiếu chính phủ Mỹ và chờ đánh giá thêm báo cáo tài chính của các doanh nghiệp. Ngoài vàng, các kim loại quý khác cũng tăng giá. Bạc lập đỉnh 12 năm, khi tăng 3,2% lên gần 35 USD một ounce. Han Tan - chiến lược gia thị trường tại Exinity Group dự báo bạc vượt mốc 35 USD trước khi cuộc bầu cử diễn ra. Bạch kim đắt thêm 2,8% lên 1.031 USD một ounce. Palladium tăng 2,9% lên 1.081 USD."
'''},
{"role": "user", "content": '''giá nhẫn trơn hôm nay là bao nhiêu?'''}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# Giá nhẫn trơn hôm nay là 86,9 - 88,2 triệu đồng.
```
<h5> Answer with bot persona</h5>
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.
Nếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch.
Context:
Đoạn 0: "Chính phủ đề xuất bổ sung gần 20.700 tỷ đồng vốn điều lệ cho Ngân hàng Ngoại thương Việt Nam (Vietcombank) từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Chiều 23/10, thừa ủy quyền Chính phủ, Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc trình Quốc hội về bổ sung vốn Nhà nước tại Ngân hàng Ngoại Thương Việt Nam (Vietcombank). Theo đó, Chính phủ đề nghị tăng vốn điều lệ cho ngân hàng này gần 20.700 tỷ đồng từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Số tiền này lấy từ nguồn lợi nhuận còn lại lũy kế đến hết năm 2018 và lãi còn lại năm 2021. Vốn điều lệ dự kiến rót thêm cho Vietcombank gần bằng lợi nhuận hợp nhất trước thuế nửa đầu năm nay của nhà băng này. Việc bổ sung vốn cho "ông lớn" ngân hàng quốc doanh được Phó thủ tướng nhấn mạnh là cấp thiết để duy trì tỷ lệ vốn góp Nhà nước, phù hợp chiến lược phát triển kinh tế xã hội, tạo nguồn lực hỗ trợ ngân hàng yếu kém. Phó thủ tướng cho biết, phần lợi nhuận còn lại lũy kế hết năm 2018 và lãi còn lại 2021 hiện được hạch toán theo dõi tại VCB, chưa nằm trong cân đối ngân sách Nhà nước. Do vậy, nguồn vốn đề xuất tăng cho ngân hàng này không ảnh hưởng tới kế hoạch dự toán thu chi ngân sách 2024-2025. Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Vốn điều lệ của Vietcombank hiện là 55.891 tỷ đồng, thấp hơn nhiều so với VPBank (79.339 tỷ đồng), Techcombank (70.450 tỷ đồng) và không có sự cách biệt lớn so với một số ngân hàng thương mại cổ phần như MB (52.871) tỷ đồng, ACB (44.667 tỷ đồng) và SHB (36.629 tỷ đồng). Ngoài ra, việc tăng vốn nhằm để ngân hàng này đáp ứng các tỷ lệ an toàn tối thiểu. Tính tới cuối 2023, tỷ lệ an toàn vốn (CAR) của ngân hàng này là 11,05%, đảm bảo quy định. Tuy nhiên, mức này thấp hơn các ngân hàng thương mại cổ phần (VPBank, MB là 12-13%; Techcombank 13-15%...) và các nhà băng trong khu vực (Singapore là 17,1%, Indonesia 23,27%...). Thẩm tra nội dung này, Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh cho rằng đề xuất tăng vốn cho Vietcombank bảo đảm cơ sở pháp lý và đúng thẩm quyền theo quy định. Tuy nhiên, Ủy ban Kinh tế đề nghị Chính phủ lấy ý kiến của cổ đông chiến lược nước ngoài Ngân hàng Mizuho Corporate Bank - đơn vị nắm 15% vốn điều lệ của Vietcombank. Việc này nhằm thuận lợi trong quá trình tăng vốn. Chính phủ cũng cần bổ sung thông tin hiện trạng vốn của Vietcombank so với các ngân hàng thương mại trong hệ thống hiện nay. "Có ý kiến đề nghị làm rõ nhận định nguồn vốn đề xuất để tăng vốn điều lệ không tác động đến ngân sách Nhà nước", ông Thanh cho biết. Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh đề nghị Chính phủ chỉ đạo Ngân hàng Nhà nước cùng các bộ, ngành liên quan xử lý phần lợi nhuận còn lại năm 2022, 2023 (lần lượt là 21.680 tỷ và 25.009 tỷ đồng), nhằm tăng năng lực tài chính cho Vietcombank, bù đắp mức thiếu hụt vốn tự có, bảo đảm an toàn hoạt động. Cơ quan thẩm tra lưu ý vốn được bổ sung cho Vietcombank cần được dùng để mở rộng kinh doanh, cung ứng tín dụng với các lĩnh vực, dự án quan trọng quốc gia quy mô lớn, giảm lãi suất cho vay, cũng như đổi mới mô hình quản trị, chất lượng dịch vụ của nhà băng này. 
"Chính phủ cần đánh giá kỹ tác động việc bổ sung vốn Nhà nước cho Vietcombank tới phát triển của ngành ngân hàng, hiệu quả kinh tế xã hội", Ủy ban Kinh tế lưu ý. Vietcombank là một trong 4 ngân hàng thương mại Nhà nước, bên cạnh BIDV, VietinBank và Agribank. Ngân hàng này do Nhà nước sở hữu 74,8% vốn điều lệ. Lũy kế nửa đầu năm nay, lợi nhuận hợp nhất trước thuế của nhà băng này đạt 20.835 tỷ đồng, tăng 1,6% so với cùng kỳ 2023. Với dữ liệu này, Vietcombank tiếp tục đứng đầu toàn hệ thống ngân hàng về lợi nhuận 6 tháng đầu năm. Đây cũng là mức lãi nửa đầu năm cao kỷ lục của nhà băng này. Tính đến 30/6, tổng tài sản của ngân hàng đạt hơn 1,9 triệu tỷ đồng, tăng 3,6% so với cuối 2023. Trong đó, cho vay khách hàng gần 1,37 triệu tỷ đồng, tăng 7,8%."
Đoạn 1: "Đã có vài đơn vị bán tín chỉ carbon cho khách ngoại nhưng còn thiếu cơ sở pháp lý để đảm bảo hoạt động được thuận lợi, theo chuyên gia. Thông tin tại phiên tọa đàm thuộc Diễn đàn và Triển lãm Kinh tế xanh 2024 (GEFE), ông Đỗ Ngọc Quỳnh, Tổng thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA), cho biết thị trường tín chỉ carbon tự nguyện Việt Nam đã có một số đơn vị bán được tín chỉ carbon cho nhà đầu tư, tập đoàn nước ngoài. "Họ đang mua chứng chỉ carbon và chứng chỉ năng lượng tái tạo (REC) trong tiêu chí RE100, tức 100% năng lượng tái tạo", ông cho biết. RE100 là sáng kiến toàn cầu dành cho các công ty cam kết sử dụng 100% điện năng tái tạo, phát động bởi Climate Group và CDP vào 2014. Từ trái sang, Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) nói tại tọa đàm. Ảnh: GEFE 2024 Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) chia sẻ tại tọa đàm. Ảnh: GEFE 2024 Thị trường carbon gồm hai hình thức là bắt buộc và tự nguyện. Đồ họa: Dỹ Tùng Phân biệt các loại thị trường carbon. Đồ họa: Dỹ Tùng Theo kế hoạch của chính phủ, thị trường bắt buộc sẽ vận hành thử nghiệm vào giai đoạn 2025-2028. Với thị trường tự nguyện, ông Quỳnh cho biết đã bắt đầu hình thành và cũng biến động theo diễn biến xu hướng chung toàn cầu. Chuyên gia VBMA cho rằng Việt Nam đã có chính sách chung để thực hiện cam kết Net Zero vào 2050, nhưng vẫn chưa có pháp lý đầy đủ và rõ ràng cho thị trường carbon tự nguyện. "Những người bán tại Việt Nam sau giao dịch không biết hạch toán vào đâu, nộp thuế thế nào. Một số chọn phương án tính vào thu nhập bất thường để khai thuế", ông ví dụ. Ông Nguyễn Thành Nghiệp, Luật sư thành viên công ty luật VTN và Cộng sự chỉ ra việc chưa có quy định xác định tính chất tài sản của tín chỉ carbon. "Chúng có được xem là tài sản bình thường, được thế chấp hay giao dịch thế nào chưa có đủ căn cứ pháp lý", ông nói. Ngoài ra, quy trình MRV (đo lường, báo cáo và kiểm chứng) cũng cần quy định, hướng dẫn rõ. Theo ông, ngoài các cơ quan quản lý, khu vực tư nhân cũng trông chờ xem liệu có thể tham gia hoạt động MRV không. "Trong thời gian tới, nếu hoàn thiện pháp lý, thị trường sẽ có nhiều tiềm năng phát triển hơn", ông Đỗ Ngọc Quỳnh dự báo. Ngoài tín chỉ carbon, với tiềm năng điện tái tạo thứ tư thế giới theo McKenzie, ông cho rằng có thể khai thác việc vừa bán tín chỉ carbon vừa bán được REC. Theo VBMA, quy mô thị trường carbon bắt buộc toàn cầu đạt 104 tỷ USD năm ngoái, tăng 100% so với năm 2020. Trong khi, thị trường tự nguyện đã thu hẹp còn 800 triệu USD, giảm hai phần ba so với 2021 do một số vụ bê bối liên quan đến "giặt xanh" (green washing) làm ảnh hưởng đến uy tín, niềm tin. Theo dõi biến động của thị trường thế giới giúp các bên tham gia trong thị trường carbon tự nguyện còn sơ khai của Việt Nam rút kinh nghiệm và tìm ra hướng đi. Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS) văn phòng Hà Nội, dự báo người mua sẽ cần tìm kiếm các bên bán tín chỉ có hệ thống quản trị tốt và rõ ràng. Ông cho rằng người mua đang thiên về chuộng mua tín chỉ lĩnh vực giảm phát thải sản xuất vì dễ chứng minh. Một loại được quan tâm khác là "carbon xanh dương" (blue carbon) - tín chỉ tạo ra từ các dự án hấp thụ carbon của rừng ngập mặn, đầm lầy bãi triều và cỏ biển. 
Ông chỉ ra Việt Nam triển vọng với 200.000 ha rừng ngập mặn, có thể làm các dự án carbon tương tự như ở Honduras. Bà Thu Nguyễn, Quản lý chính sách tại Apanada Management Consultancy, Đại diện Viện Tài nguyên Thế giới (WRI) khuyến nghị các dự án tín chỉ carbon nâng cao giá trị bằng cách quan tâm đến tính bình đẳng và bao trùm. Theo đó, mục tiêu không chỉ là giảm phát thải mà còn là cải thiện đời sống người dân và phát triển bình đẳng hơn "Dự án cần bảo đảm có tham vấn của cộng đồng, đặc biệt là phụ nữ và các nhóm yếu thế, để tạo ra lợi ích cho cả cộng đồng lẫn nhà đầu tư", bà nói."
Đoạn 2: "Giá nhẫn trơn liên tục điều chỉnh, tăng gần một triệu đồng trong ngày và có nơi lên sát 89 triệu đồng một lượng. 15h ngày 23/10, giá mua bán nhẫn trơn được các thương hiệu kinh doanh điều chỉnh theo diễn biến đi lên của thế giới. Chiều nay, mỗi ounce vàng quốc tế tiếp tục thiết lập kỷ lục mới 2.755 USD. Giá nhẫn trơn tại Công ty Vàng bạc đá quý Sài Gòn (SJC) cũng tăng nửa triệu đồng so với đầu sáng và gần 1 triệu đồng so với cuối ngày hôm qua, lên 86,9 - 88,2 triệu đồng. Công ty Vàng bạc đá quý Phú Nhuận (PNJ) và Mi Hồng niêm yết giá nhẫn trơn quanh vùng 87,4 - 88,4 triệu đồng. Còn tại Tập đoàn Vàng bạc đá quý DOJI, giá mua bán nhẫn trơn cùng thời điểm thậm chí lên 88 - 88,9 triệu đồng một lượng. Trước đó đầu ngày, Công ty Vàng bạc đá quý Sài Gòn (SJC) đã tăng 300.000 đồng một lượng so với cuối ngày hôm qua, niêm yết giá nhẫn trơn tại 86,3 - 87,6 triệu đồng. Biểu giá mua bán nhẫn trơn tại Tập đoàn Vàng bạc đá quý DOJI lúc 9h sáng là 87 - 88 triệu đồng, tăng 200.000 đồng so với cuối ngày hôm qua. Nhẫn trơn giữ nhịp tăng liên tục trong 10 ngày qua. So với giữa tháng, mỗi lượng nhẫn trơn đã tăng hơn 5 triệu đồng. Còn so với đầu năm, nhẫn trơn tăng gần 25 triệu một lượng, tương đương hiệu suất 39%. Trong khi giá vàng miếng SJC đứng yên ở vùng 87 - 89 triệu một lượng, do Ngân hàng Nhà nước chưa thay đổi giá bán can thiệp. Thời điểm này là mùa cưới cuối năm và nhu cầu mua vàng nhẫn làm quà cưới tăng, song người dân không dễ để mua được mặt hàng này tại các thương hiệu lớn. Các thương hiệu lớn như DOJI, PNJ, Bảo Tín Minh Châu thường xuyên trong tình trạng cháy hàng. Khách lẻ chỉ may mắn mua được số lượng ít nếu cửa hàng vừa có khách bán ra. Còn tại SJC, các chi nhánh giới hạn lượng mua tối đa 5 phân đến 1 chỉ mỗi người. Trên thị trường quốc tế, mỗi ounce vàng trong 5 ngày qua tăng mạnh hơn 100 USD. Kim loại quý có thời điểm lên mức kỷ lục gần 2.750 USD, trước khi lùi về vùng 2.738 USD vào sáng nay. Quy đổi theo tỷ giá bán Vietcombank, giá vàng trong nước chênh lệch 3,5-5 triệu đồng một lượng so với thế giới. Theo dự báo của các nhà băng hàng đầu thế giới, giá vàng thế giới có thể lên 3.000 USD một ounce vào năm sau. Các chuyên gia khuyến nghị nhà đầu tư phân bổ tỷ trọng nhỏ danh mục vào kênh trú ẩn này, đặc biệt trong bối cảnh kim loại quý đã tăng mạnh thời gian qua."
Đoạn 3: "Nhu cầu trú ẩn khi căng thẳng địa chính trị leo thang kéo giá vàng lên mức đỉnh mới, tại 2.748 USD một ounce. Chốt phiên giao dịch 22/10, giá vàng thế giới giao ngay tăng gần 30 USD lên 2.748 USD một ounce. Đây là mức cao kỷ lục mới của kim loại quý. "Căng thẳng địa chính trị vẫn là nguyên nhân chủ yếu. Hai tuần nữa sẽ diễn ra bầu cử Tổng thống Mỹ và cuộc đua vẫn rất sát sao. Bất ổn chính trị đang kéo nhu cầu trú ẩn lên cao", Peter A. Grant - Phó giám đốc Zaner Metals nhận định trên Reuters. Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Cuộc thăm dò mới nhất của Reuters/Ipsos cho thấy tỷ lệ ủng hộ Phó tổng thống Kamala Harris hiện là 46%, nhỉnh hơn so với 43% của cựu Tổng thống Donald Trump. "Sự sát sao này đang tạo nên tình trạng thiếu chắc chắn. Môi trường này có lợi cho vàng", các nhà phân tích tại ngân hàng BNP Paribas nhận định. Grant dự báo nếu căng thẳng tại Trung Đông tiếp tục tăng nhiệt, giá có thể lên 3.000 USD cuối năm nay. Từ đầu năm, giá đã tăng 33% và liên tiếp lập đỉnh mới. Một yếu tố khác đang hỗ trợ kim loại quý là làn sóng giảm lãi suất của các ngân hàng trung ương lớn trên toàn cầu. Mỹ, châu Âu, Trung Quốc cùng hàng loạt nền kinh tế khác đã giảm lãi suất năm nay để hỗ trợ nền kinh tế. Trong khi đó, tại Wall Street, các chỉ số chính gần như đứng yên. Nhà đầu tư hiện theo dõi lợi suất trái phiếu chính phủ Mỹ và chờ đánh giá thêm báo cáo tài chính của các doanh nghiệp. Ngoài vàng, các kim loại quý khác cũng tăng giá. Bạc lập đỉnh 12 năm, khi tăng 3,2% lên gần 35 USD một ounce. Han Tan - chiến lược gia thị trường tại Exinity Group dự báo bạc vượt mốc 35 USD trước khi cuộc bầu cử diễn ra. Bạch kim đắt thêm 2,8% lên 1.031 USD một ounce. Palladium tăng 2,9% lên 1.081 USD."
'''},
{"role": "user", "content": '''Hãy trả lời câu hỏi sau dựa vào đoạn ngữ cảnh được cung cấp. Câu trả lời phải có thưa gửi rõ ràng, xưng là em và kính thưa quý khách.\nCâu hỏi: giá nhẫn trơn hôm nay là bao nhiêu?'''}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=512)
print(tokenizer.decode(outputs[0]))
# Em xin thông báo rằng giá nhẫn trơn hôm nay dao động từ 86,9 đến 88,2 triệu đồng một ounce, tùy thuộc vào từng thương hiệu.
```
***You can customize the prompt before the answer to get a response that suits your needs.***
***You can also add information about this bot's persona in the system prompt.***
<h4> 3. Function Calling task </h4>
***In this task, we are following the Function Calling template from Glaive AI: [glaiveai/glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2).***
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lý hữu ích với khả năng truy cập vào các hàm sau. Hãy sử dụng chúng nếu cần -
{
"name": "weather_forecast",
"description": "Cung cấp cập nhật và dự báo thời tiết cho các địa điểm cụ thể, bao gồm nhiệt độ, độ ẩm và tình trạng thời tiết. Ví dụ: thời tiết hôm nay, dự báo thời tiết ở Hà Nội, nhiệt độ tại Đà Nẵng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "news_update",
"description": "Cung cấp các bài báo và cập nhật tin tức mới nhất trên nhiều lĩnh vực như chính trị, công nghệ, thể thao và giải trí. Ví dụ: tin tức hôm nay, cập nhật thể thao, tin công nghệ mới nhất, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "recipe_search",
"description": "Tìm kiếm và gợi ý công thức nấu ăn dựa trên nguyên liệu hoặc sở thích dinh dưỡng. Ví dụ: công thức món ăn với gà, món chay, ăn kiêng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "movie_recommendation",
"description": "Cung cấp gợi ý phim dựa trên thể loại, tâm trạng hoặc tiêu đề cụ thể. Ví dụ: phim hài hay, phim hành động mới, gợi ý phim cho tối nay, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "fitness_advice",
"description": "Cung cấp mẹo và bài tập cho sức khỏe và thể dục dựa trên mục tiêu của người dùng. Ví dụ: bài tập giảm cân, lịch tập gym cho người mới, lời khuyên về dinh dưỡng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "travel_planner",
"description": "Hỗ trợ lập kế hoạch du lịch, bao gồm gợi ý lịch trình, mẹo đặt vé và thông tin về điểm đến. Ví dụ: lên kế hoạch du lịch Đà Nẵng, địa điểm tham quan ở Nha Trang, mẹo du lịch Nhật Bản, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
}'''},
{"role": "user", "content": "hôm nay có tin tức gì về ô nhiễm không khí"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# <functioncall> {"name": "news_update", "arguments": {"__arg1": "ô nhiễm không khí"}}
```
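To act on this response, the `<functioncall>` payload can be parsed back into a Python object before dispatching to the matching function. A minimal sketch, continuing from the variables in the snippet above and assuming the model keeps to the exact tag-plus-JSON format shown:
```python
import json
import re

# decode only the newly generated tokens, then extract the JSON payload
# that follows the <functioncall> tag (format shown in the output above)
response = tokenizer.decode(
    outputs[0][tokenized_chat.shape[-1]:], skip_special_tokens=True
)
match = re.search(r"<functioncall>\s*(\{.*\})", response, re.DOTALL)
if match:
    call = json.loads(match.group(1))
    print(call["name"])        # e.g. "news_update"
    print(call["arguments"])   # e.g. {"__arg1": "ô nhiễm không khí"}
```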
<h4> 4. Question Rewriting task </h4>
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.\nNếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch.
Context:
USER: Concert "Anh trai vượt ngàn chông gai" diễn ra ở đâu?
ASSISTANT: Đêm concert Anh trai vượt ngàn chông gai diễn ra vào tối 19-10, tại công viên bờ sông Sài Gòn, TP.HCM với sự tham dự của khoảng 20.000 khán giả.
USER: Buổi biểu diễn có bao nhiêu nghệ sĩ?
ASSISTANT: Danh sách nghệ sĩ gồm 17 nghệ sĩ: Kay Trần, Jun Phạm, Cường Seven, Tự Long, (S)TRONG Trọng Hiếu, S.T Sơn Thạch, BB Trần, Bùi Công Nam, Rhymastic, Soobin Hoàng Sơn, Binz, Quốc Thiên, Thanh Duy, Bằng Kiều, Tiến Luật, Đinh Tiến Đạt và Đỗ Hoàng Hiệp.
USER: Chương trình phát sóng khi nào và ở đâu?
ASSISTANT: chương trình phát sóng từ ngày 29 tháng 6 năm 2024 lúc 20:00 thứ 7 hàng tuần trên VTV3 và công chiếu lúc 20:30 cùng ngày trên kênh YouTube YeaH1 Show của nhà sản xuất chương trình.'''},
{"role": "user", "content": '''Dựa vào đoạn hội thoại được cung cấp, viết lại câu nói của người dùng sao cho đầu đủ ý nhất có thể mà không bị sai lệch thông tin.
Câu nói: Concert này có tổ chức ở Hà Nội không?
'''}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=512)
print(tokenizer.decode(outputs[0]))
# Buổi hòa nhạc Anh trai vượt ngàn chông gai có diễn ra ở Hà Nội không?
```
***Modify the parameters "temperature", "top_k", and "top_p" to suit your use case.***
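For reference, a minimal sketch of passing these sampling parameters to `generate` (the values below are illustrative, not recommended defaults):
```python
# Sampling must be enabled (do_sample=True) for temperature/top_k/top_p to take effect.
outputs = model.generate(
    tokenized_chat,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,  # illustrative values; tune for your use case
    top_k=50,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0]))
```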
Corresponding Author:
+ [email protected] | [
"SUMMARIZATION"
] | [
"CHIA"
] |
ggrn/e5-small-v2 | ggrn | feature-extraction | [
"sentence-transformers",
"pytorch",
"bert",
"mteb",
"feature-extraction",
"en",
"arxiv:2212.03533",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2023-06-21T02:39:56 | 2023-06-21T03:30:34 | 913 | 10 | ---
language:
- en
library_name: sentence-transformers
license: mit
pipeline_tag: feature-extraction
tags:
- mteb
model-index:
- name: e5-small-v2
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.59701492537313
- type: ap
value: 41.67064885731708
- type: f1
value: 71.86465946398573
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.265875
- type: ap
value: 87.67633085349644
- type: f1
value: 91.24297521425744
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 45.882000000000005
- type: f1
value: 45.08058870381236
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.697
- type: map_at_10
value: 33.975
- type: map_at_100
value: 35.223
- type: map_at_1000
value: 35.260000000000005
- type: map_at_3
value: 29.776999999999997
- type: map_at_5
value: 32.035000000000004
- type: mrr_at_1
value: 20.982
- type: mrr_at_10
value: 34.094
- type: mrr_at_100
value: 35.343
- type: mrr_at_1000
value: 35.38
- type: mrr_at_3
value: 29.884
- type: mrr_at_5
value: 32.141999999999996
- type: ndcg_at_1
value: 20.697
- type: ndcg_at_10
value: 41.668
- type: ndcg_at_100
value: 47.397
- type: ndcg_at_1000
value: 48.305
- type: ndcg_at_3
value: 32.928000000000004
- type: ndcg_at_5
value: 36.998999999999995
- type: precision_at_1
value: 20.697
- type: precision_at_10
value: 6.636
- type: precision_at_100
value: 0.924
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 14.035
- type: precision_at_5
value: 10.398
- type: recall_at_1
value: 20.697
- type: recall_at_10
value: 66.35799999999999
- type: recall_at_100
value: 92.39
- type: recall_at_1000
value: 99.36
- type: recall_at_3
value: 42.105
- type: recall_at_5
value: 51.991
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 42.1169517447068
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 34.79553720107097
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 58.10811337308168
- type: mrr
value: 71.56410763751482
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 78.46834918248696
- type: cos_sim_spearman
value: 79.4289182755206
- type: euclidean_pearson
value: 76.26662973727008
- type: euclidean_spearman
value: 78.11744260952536
- type: manhattan_pearson
value: 76.08175262609434
- type: manhattan_spearman
value: 78.29395265552289
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 81.63636363636364
- type: f1
value: 81.55779952376953
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 35.88541137137571
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 30.05205685274407
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.293999999999997
- type: map_at_10
value: 39.876
- type: map_at_100
value: 41.315000000000005
- type: map_at_1000
value: 41.451
- type: map_at_3
value: 37.194
- type: map_at_5
value: 38.728
- type: mrr_at_1
value: 37.053000000000004
- type: mrr_at_10
value: 45.281
- type: mrr_at_100
value: 46.188
- type: mrr_at_1000
value: 46.245999999999995
- type: mrr_at_3
value: 43.228
- type: mrr_at_5
value: 44.366
- type: ndcg_at_1
value: 37.053000000000004
- type: ndcg_at_10
value: 45.086
- type: ndcg_at_100
value: 50.756
- type: ndcg_at_1000
value: 53.123
- type: ndcg_at_3
value: 41.416
- type: ndcg_at_5
value: 43.098
- type: precision_at_1
value: 37.053000000000004
- type: precision_at_10
value: 8.34
- type: precision_at_100
value: 1.346
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 19.647000000000002
- type: precision_at_5
value: 13.877
- type: recall_at_1
value: 30.293999999999997
- type: recall_at_10
value: 54.309
- type: recall_at_100
value: 78.59
- type: recall_at_1000
value: 93.82300000000001
- type: recall_at_3
value: 43.168
- type: recall_at_5
value: 48.192
- type: map_at_1
value: 28.738000000000003
- type: map_at_10
value: 36.925999999999995
- type: map_at_100
value: 38.017
- type: map_at_1000
value: 38.144
- type: map_at_3
value: 34.446
- type: map_at_5
value: 35.704
- type: mrr_at_1
value: 35.478
- type: mrr_at_10
value: 42.786
- type: mrr_at_100
value: 43.458999999999996
- type: mrr_at_1000
value: 43.507
- type: mrr_at_3
value: 40.648
- type: mrr_at_5
value: 41.804
- type: ndcg_at_1
value: 35.478
- type: ndcg_at_10
value: 42.044
- type: ndcg_at_100
value: 46.249
- type: ndcg_at_1000
value: 48.44
- type: ndcg_at_3
value: 38.314
- type: ndcg_at_5
value: 39.798
- type: precision_at_1
value: 35.478
- type: precision_at_10
value: 7.764
- type: precision_at_100
value: 1.253
- type: precision_at_1000
value: 0.174
- type: precision_at_3
value: 18.047
- type: precision_at_5
value: 12.637
- type: recall_at_1
value: 28.738000000000003
- type: recall_at_10
value: 50.659
- type: recall_at_100
value: 68.76299999999999
- type: recall_at_1000
value: 82.811
- type: recall_at_3
value: 39.536
- type: recall_at_5
value: 43.763999999999996
- type: map_at_1
value: 38.565
- type: map_at_10
value: 50.168
- type: map_at_100
value: 51.11
- type: map_at_1000
value: 51.173
- type: map_at_3
value: 47.044000000000004
- type: map_at_5
value: 48.838
- type: mrr_at_1
value: 44.201
- type: mrr_at_10
value: 53.596999999999994
- type: mrr_at_100
value: 54.211
- type: mrr_at_1000
value: 54.247
- type: mrr_at_3
value: 51.202000000000005
- type: mrr_at_5
value: 52.608999999999995
- type: ndcg_at_1
value: 44.201
- type: ndcg_at_10
value: 55.694
- type: ndcg_at_100
value: 59.518
- type: ndcg_at_1000
value: 60.907
- type: ndcg_at_3
value: 50.395999999999994
- type: ndcg_at_5
value: 53.022999999999996
- type: precision_at_1
value: 44.201
- type: precision_at_10
value: 8.84
- type: precision_at_100
value: 1.162
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 22.153
- type: precision_at_5
value: 15.260000000000002
- type: recall_at_1
value: 38.565
- type: recall_at_10
value: 68.65
- type: recall_at_100
value: 85.37400000000001
- type: recall_at_1000
value: 95.37400000000001
- type: recall_at_3
value: 54.645999999999994
- type: recall_at_5
value: 60.958
- type: map_at_1
value: 23.945
- type: map_at_10
value: 30.641000000000002
- type: map_at_100
value: 31.599
- type: map_at_1000
value: 31.691000000000003
- type: map_at_3
value: 28.405
- type: map_at_5
value: 29.704000000000004
- type: mrr_at_1
value: 25.537
- type: mrr_at_10
value: 32.22
- type: mrr_at_100
value: 33.138
- type: mrr_at_1000
value: 33.214
- type: mrr_at_3
value: 30.151
- type: mrr_at_5
value: 31.298
- type: ndcg_at_1
value: 25.537
- type: ndcg_at_10
value: 34.638000000000005
- type: ndcg_at_100
value: 39.486
- type: ndcg_at_1000
value: 41.936
- type: ndcg_at_3
value: 30.333
- type: ndcg_at_5
value: 32.482
- type: precision_at_1
value: 25.537
- type: precision_at_10
value: 5.153
- type: precision_at_100
value: 0.7929999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 12.429
- type: precision_at_5
value: 8.723
- type: recall_at_1
value: 23.945
- type: recall_at_10
value: 45.412
- type: recall_at_100
value: 67.836
- type: recall_at_1000
value: 86.467
- type: recall_at_3
value: 34.031
- type: recall_at_5
value: 39.039
- type: map_at_1
value: 14.419
- type: map_at_10
value: 20.858999999999998
- type: map_at_100
value: 22.067999999999998
- type: map_at_1000
value: 22.192
- type: map_at_3
value: 18.673000000000002
- type: map_at_5
value: 19.968
- type: mrr_at_1
value: 17.785999999999998
- type: mrr_at_10
value: 24.878
- type: mrr_at_100
value: 26.021
- type: mrr_at_1000
value: 26.095000000000002
- type: mrr_at_3
value: 22.616
- type: mrr_at_5
value: 23.785
- type: ndcg_at_1
value: 17.785999999999998
- type: ndcg_at_10
value: 25.153
- type: ndcg_at_100
value: 31.05
- type: ndcg_at_1000
value: 34.052
- type: ndcg_at_3
value: 21.117
- type: ndcg_at_5
value: 23.048
- type: precision_at_1
value: 17.785999999999998
- type: precision_at_10
value: 4.590000000000001
- type: precision_at_100
value: 0.864
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 9.908999999999999
- type: precision_at_5
value: 7.313
- type: recall_at_1
value: 14.419
- type: recall_at_10
value: 34.477999999999994
- type: recall_at_100
value: 60.02499999999999
- type: recall_at_1000
value: 81.646
- type: recall_at_3
value: 23.515
- type: recall_at_5
value: 28.266999999999996
- type: map_at_1
value: 26.268
- type: map_at_10
value: 35.114000000000004
- type: map_at_100
value: 36.212
- type: map_at_1000
value: 36.333
- type: map_at_3
value: 32.436
- type: map_at_5
value: 33.992
- type: mrr_at_1
value: 31.761
- type: mrr_at_10
value: 40.355999999999995
- type: mrr_at_100
value: 41.125
- type: mrr_at_1000
value: 41.186
- type: mrr_at_3
value: 37.937
- type: mrr_at_5
value: 39.463
- type: ndcg_at_1
value: 31.761
- type: ndcg_at_10
value: 40.422000000000004
- type: ndcg_at_100
value: 45.458999999999996
- type: ndcg_at_1000
value: 47.951
- type: ndcg_at_3
value: 35.972
- type: ndcg_at_5
value: 38.272
- type: precision_at_1
value: 31.761
- type: precision_at_10
value: 7.103
- type: precision_at_100
value: 1.133
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 16.779
- type: precision_at_5
value: 11.877
- type: recall_at_1
value: 26.268
- type: recall_at_10
value: 51.053000000000004
- type: recall_at_100
value: 72.702
- type: recall_at_1000
value: 89.521
- type: recall_at_3
value: 38.619
- type: recall_at_5
value: 44.671
- type: map_at_1
value: 25.230999999999998
- type: map_at_10
value: 34.227000000000004
- type: map_at_100
value: 35.370000000000005
- type: map_at_1000
value: 35.488
- type: map_at_3
value: 31.496000000000002
- type: map_at_5
value: 33.034
- type: mrr_at_1
value: 30.822
- type: mrr_at_10
value: 39.045
- type: mrr_at_100
value: 39.809
- type: mrr_at_1000
value: 39.873
- type: mrr_at_3
value: 36.663000000000004
- type: mrr_at_5
value: 37.964
- type: ndcg_at_1
value: 30.822
- type: ndcg_at_10
value: 39.472
- type: ndcg_at_100
value: 44.574999999999996
- type: ndcg_at_1000
value: 47.162
- type: ndcg_at_3
value: 34.929
- type: ndcg_at_5
value: 37.002
- type: precision_at_1
value: 30.822
- type: precision_at_10
value: 7.055
- type: precision_at_100
value: 1.124
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 16.591
- type: precision_at_5
value: 11.667
- type: recall_at_1
value: 25.230999999999998
- type: recall_at_10
value: 50.42100000000001
- type: recall_at_100
value: 72.685
- type: recall_at_1000
value: 90.469
- type: recall_at_3
value: 37.503
- type: recall_at_5
value: 43.123
- type: map_at_1
value: 24.604166666666664
- type: map_at_10
value: 32.427166666666665
- type: map_at_100
value: 33.51474999999999
- type: map_at_1000
value: 33.6345
- type: map_at_3
value: 30.02366666666667
- type: map_at_5
value: 31.382333333333328
- type: mrr_at_1
value: 29.001166666666666
- type: mrr_at_10
value: 36.3315
- type: mrr_at_100
value: 37.16683333333333
- type: mrr_at_1000
value: 37.23341666666668
- type: mrr_at_3
value: 34.19916666666667
- type: mrr_at_5
value: 35.40458333333334
- type: ndcg_at_1
value: 29.001166666666666
- type: ndcg_at_10
value: 37.06883333333334
- type: ndcg_at_100
value: 41.95816666666666
- type: ndcg_at_1000
value: 44.501583333333336
- type: ndcg_at_3
value: 32.973499999999994
- type: ndcg_at_5
value: 34.90833333333334
- type: precision_at_1
value: 29.001166666666666
- type: precision_at_10
value: 6.336
- type: precision_at_100
value: 1.0282499999999999
- type: precision_at_1000
value: 0.14391666666666664
- type: precision_at_3
value: 14.932499999999996
- type: precision_at_5
value: 10.50825
- type: recall_at_1
value: 24.604166666666664
- type: recall_at_10
value: 46.9525
- type: recall_at_100
value: 68.67816666666667
- type: recall_at_1000
value: 86.59783333333334
- type: recall_at_3
value: 35.49783333333333
- type: recall_at_5
value: 40.52525000000001
- type: map_at_1
value: 23.559
- type: map_at_10
value: 29.023
- type: map_at_100
value: 29.818
- type: map_at_1000
value: 29.909000000000002
- type: map_at_3
value: 27.037
- type: map_at_5
value: 28.225
- type: mrr_at_1
value: 26.994
- type: mrr_at_10
value: 31.962000000000003
- type: mrr_at_100
value: 32.726
- type: mrr_at_1000
value: 32.800000000000004
- type: mrr_at_3
value: 30.266
- type: mrr_at_5
value: 31.208999999999996
- type: ndcg_at_1
value: 26.994
- type: ndcg_at_10
value: 32.53
- type: ndcg_at_100
value: 36.758
- type: ndcg_at_1000
value: 39.362
- type: ndcg_at_3
value: 28.985
- type: ndcg_at_5
value: 30.757
- type: precision_at_1
value: 26.994
- type: precision_at_10
value: 4.968999999999999
- type: precision_at_100
value: 0.759
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 12.219
- type: precision_at_5
value: 8.527999999999999
- type: recall_at_1
value: 23.559
- type: recall_at_10
value: 40.585
- type: recall_at_100
value: 60.306000000000004
- type: recall_at_1000
value: 80.11
- type: recall_at_3
value: 30.794
- type: recall_at_5
value: 35.186
- type: map_at_1
value: 16.384999999999998
- type: map_at_10
value: 22.142
- type: map_at_100
value: 23.057
- type: map_at_1000
value: 23.177
- type: map_at_3
value: 20.29
- type: map_at_5
value: 21.332
- type: mrr_at_1
value: 19.89
- type: mrr_at_10
value: 25.771
- type: mrr_at_100
value: 26.599
- type: mrr_at_1000
value: 26.680999999999997
- type: mrr_at_3
value: 23.962
- type: mrr_at_5
value: 24.934
- type: ndcg_at_1
value: 19.89
- type: ndcg_at_10
value: 25.97
- type: ndcg_at_100
value: 30.605
- type: ndcg_at_1000
value: 33.619
- type: ndcg_at_3
value: 22.704
- type: ndcg_at_5
value: 24.199
- type: precision_at_1
value: 19.89
- type: precision_at_10
value: 4.553
- type: precision_at_100
value: 0.8049999999999999
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 10.541
- type: precision_at_5
value: 7.46
- type: recall_at_1
value: 16.384999999999998
- type: recall_at_10
value: 34.001
- type: recall_at_100
value: 55.17100000000001
- type: recall_at_1000
value: 77.125
- type: recall_at_3
value: 24.618000000000002
- type: recall_at_5
value: 28.695999999999998
- type: map_at_1
value: 23.726
- type: map_at_10
value: 31.227
- type: map_at_100
value: 32.311
- type: map_at_1000
value: 32.419
- type: map_at_3
value: 28.765
- type: map_at_5
value: 30.229
- type: mrr_at_1
value: 27.705000000000002
- type: mrr_at_10
value: 35.085
- type: mrr_at_100
value: 35.931000000000004
- type: mrr_at_1000
value: 36
- type: mrr_at_3
value: 32.603
- type: mrr_at_5
value: 34.117999999999995
- type: ndcg_at_1
value: 27.705000000000002
- type: ndcg_at_10
value: 35.968
- type: ndcg_at_100
value: 41.197
- type: ndcg_at_1000
value: 43.76
- type: ndcg_at_3
value: 31.304
- type: ndcg_at_5
value: 33.661
- type: precision_at_1
value: 27.705000000000002
- type: precision_at_10
value: 5.942
- type: precision_at_100
value: 0.964
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 13.868
- type: precision_at_5
value: 9.944
- type: recall_at_1
value: 23.726
- type: recall_at_10
value: 46.786
- type: recall_at_100
value: 70.072
- type: recall_at_1000
value: 88.2
- type: recall_at_3
value: 33.981
- type: recall_at_5
value: 39.893
- type: map_at_1
value: 23.344
- type: map_at_10
value: 31.636999999999997
- type: map_at_100
value: 33.065
- type: map_at_1000
value: 33.300000000000004
- type: map_at_3
value: 29.351
- type: map_at_5
value: 30.432
- type: mrr_at_1
value: 27.866000000000003
- type: mrr_at_10
value: 35.587
- type: mrr_at_100
value: 36.52
- type: mrr_at_1000
value: 36.597
- type: mrr_at_3
value: 33.696
- type: mrr_at_5
value: 34.713
- type: ndcg_at_1
value: 27.866000000000003
- type: ndcg_at_10
value: 36.61
- type: ndcg_at_100
value: 41.88
- type: ndcg_at_1000
value: 45.105000000000004
- type: ndcg_at_3
value: 33.038000000000004
- type: ndcg_at_5
value: 34.331
- type: precision_at_1
value: 27.866000000000003
- type: precision_at_10
value: 6.917
- type: precision_at_100
value: 1.3599999999999999
- type: precision_at_1000
value: 0.233
- type: precision_at_3
value: 15.547
- type: precision_at_5
value: 10.791
- type: recall_at_1
value: 23.344
- type: recall_at_10
value: 45.782000000000004
- type: recall_at_100
value: 69.503
- type: recall_at_1000
value: 90.742
- type: recall_at_3
value: 35.160000000000004
- type: recall_at_5
value: 39.058
- type: map_at_1
value: 20.776
- type: map_at_10
value: 27.285999999999998
- type: map_at_100
value: 28.235
- type: map_at_1000
value: 28.337
- type: map_at_3
value: 25.147000000000002
- type: map_at_5
value: 26.401999999999997
- type: mrr_at_1
value: 22.921
- type: mrr_at_10
value: 29.409999999999997
- type: mrr_at_100
value: 30.275000000000002
- type: mrr_at_1000
value: 30.354999999999997
- type: mrr_at_3
value: 27.418
- type: mrr_at_5
value: 28.592000000000002
- type: ndcg_at_1
value: 22.921
- type: ndcg_at_10
value: 31.239
- type: ndcg_at_100
value: 35.965
- type: ndcg_at_1000
value: 38.602
- type: ndcg_at_3
value: 27.174
- type: ndcg_at_5
value: 29.229
- type: precision_at_1
value: 22.921
- type: precision_at_10
value: 4.806
- type: precision_at_100
value: 0.776
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 11.459999999999999
- type: precision_at_5
value: 8.022
- type: recall_at_1
value: 20.776
- type: recall_at_10
value: 41.294
- type: recall_at_100
value: 63.111
- type: recall_at_1000
value: 82.88600000000001
- type: recall_at_3
value: 30.403000000000002
- type: recall_at_5
value: 35.455999999999996
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.376
- type: map_at_10
value: 15.926000000000002
- type: map_at_100
value: 17.585
- type: map_at_1000
value: 17.776
- type: map_at_3
value: 13.014000000000001
- type: map_at_5
value: 14.417
- type: mrr_at_1
value: 20.195
- type: mrr_at_10
value: 29.95
- type: mrr_at_100
value: 31.052000000000003
- type: mrr_at_1000
value: 31.108000000000004
- type: mrr_at_3
value: 26.667
- type: mrr_at_5
value: 28.458
- type: ndcg_at_1
value: 20.195
- type: ndcg_at_10
value: 22.871
- type: ndcg_at_100
value: 29.921999999999997
- type: ndcg_at_1000
value: 33.672999999999995
- type: ndcg_at_3
value: 17.782999999999998
- type: ndcg_at_5
value: 19.544
- type: precision_at_1
value: 20.195
- type: precision_at_10
value: 7.394
- type: precision_at_100
value: 1.493
- type: precision_at_1000
value: 0.218
- type: precision_at_3
value: 13.073
- type: precision_at_5
value: 10.436
- type: recall_at_1
value: 9.376
- type: recall_at_10
value: 28.544999999999998
- type: recall_at_100
value: 53.147999999999996
- type: recall_at_1000
value: 74.62
- type: recall_at_3
value: 16.464000000000002
- type: recall_at_5
value: 21.004
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.415000000000001
- type: map_at_10
value: 18.738
- type: map_at_100
value: 27.291999999999998
- type: map_at_1000
value: 28.992
- type: map_at_3
value: 13.196
- type: map_at_5
value: 15.539
- type: mrr_at_1
value: 66.5
- type: mrr_at_10
value: 74.518
- type: mrr_at_100
value: 74.86
- type: mrr_at_1000
value: 74.87
- type: mrr_at_3
value: 72.375
- type: mrr_at_5
value: 73.86200000000001
- type: ndcg_at_1
value: 54.37499999999999
- type: ndcg_at_10
value: 41.317
- type: ndcg_at_100
value: 45.845
- type: ndcg_at_1000
value: 52.92
- type: ndcg_at_3
value: 44.983000000000004
- type: ndcg_at_5
value: 42.989
- type: precision_at_1
value: 66.5
- type: precision_at_10
value: 33.6
- type: precision_at_100
value: 10.972999999999999
- type: precision_at_1000
value: 2.214
- type: precision_at_3
value: 48.583
- type: precision_at_5
value: 42.15
- type: recall_at_1
value: 8.415000000000001
- type: recall_at_10
value: 24.953
- type: recall_at_100
value: 52.48199999999999
- type: recall_at_1000
value: 75.093
- type: recall_at_3
value: 14.341000000000001
- type: recall_at_5
value: 18.468
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 47.06499999999999
- type: f1
value: 41.439327599975385
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 66.02
- type: map_at_10
value: 76.68599999999999
- type: map_at_100
value: 76.959
- type: map_at_1000
value: 76.972
- type: map_at_3
value: 75.024
- type: map_at_5
value: 76.153
- type: mrr_at_1
value: 71.197
- type: mrr_at_10
value: 81.105
- type: mrr_at_100
value: 81.232
- type: mrr_at_1000
value: 81.233
- type: mrr_at_3
value: 79.758
- type: mrr_at_5
value: 80.69
- type: ndcg_at_1
value: 71.197
- type: ndcg_at_10
value: 81.644
- type: ndcg_at_100
value: 82.645
- type: ndcg_at_1000
value: 82.879
- type: ndcg_at_3
value: 78.792
- type: ndcg_at_5
value: 80.528
- type: precision_at_1
value: 71.197
- type: precision_at_10
value: 10.206999999999999
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 30.868000000000002
- type: precision_at_5
value: 19.559
- type: recall_at_1
value: 66.02
- type: recall_at_10
value: 92.50699999999999
- type: recall_at_100
value: 96.497
- type: recall_at_1000
value: 97.956
- type: recall_at_3
value: 84.866
- type: recall_at_5
value: 89.16199999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.948
- type: map_at_10
value: 29.833
- type: map_at_100
value: 31.487
- type: map_at_1000
value: 31.674000000000003
- type: map_at_3
value: 26.029999999999998
- type: map_at_5
value: 28.038999999999998
- type: mrr_at_1
value: 34.721999999999994
- type: mrr_at_10
value: 44.214999999999996
- type: mrr_at_100
value: 44.994
- type: mrr_at_1000
value: 45.051
- type: mrr_at_3
value: 41.667
- type: mrr_at_5
value: 43.032
- type: ndcg_at_1
value: 34.721999999999994
- type: ndcg_at_10
value: 37.434
- type: ndcg_at_100
value: 43.702000000000005
- type: ndcg_at_1000
value: 46.993
- type: ndcg_at_3
value: 33.56
- type: ndcg_at_5
value: 34.687
- type: precision_at_1
value: 34.721999999999994
- type: precision_at_10
value: 10.401
- type: precision_at_100
value: 1.7049999999999998
- type: precision_at_1000
value: 0.22799999999999998
- type: precision_at_3
value: 22.531000000000002
- type: precision_at_5
value: 16.42
- type: recall_at_1
value: 17.948
- type: recall_at_10
value: 45.062999999999995
- type: recall_at_100
value: 68.191
- type: recall_at_1000
value: 87.954
- type: recall_at_3
value: 31.112000000000002
- type: recall_at_5
value: 36.823
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.644
- type: map_at_10
value: 57.658
- type: map_at_100
value: 58.562000000000005
- type: map_at_1000
value: 58.62500000000001
- type: map_at_3
value: 54.022999999999996
- type: map_at_5
value: 56.293000000000006
- type: mrr_at_1
value: 73.288
- type: mrr_at_10
value: 80.51700000000001
- type: mrr_at_100
value: 80.72
- type: mrr_at_1000
value: 80.728
- type: mrr_at_3
value: 79.33200000000001
- type: mrr_at_5
value: 80.085
- type: ndcg_at_1
value: 73.288
- type: ndcg_at_10
value: 66.61
- type: ndcg_at_100
value: 69.723
- type: ndcg_at_1000
value: 70.96000000000001
- type: ndcg_at_3
value: 61.358999999999995
- type: ndcg_at_5
value: 64.277
- type: precision_at_1
value: 73.288
- type: precision_at_10
value: 14.17
- type: precision_at_100
value: 1.659
- type: precision_at_1000
value: 0.182
- type: precision_at_3
value: 39.487
- type: precision_at_5
value: 25.999
- type: recall_at_1
value: 36.644
- type: recall_at_10
value: 70.851
- type: recall_at_100
value: 82.94399999999999
- type: recall_at_1000
value: 91.134
- type: recall_at_3
value: 59.230000000000004
- type: recall_at_5
value: 64.997
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 86.00280000000001
- type: ap
value: 80.46302061021223
- type: f1
value: 85.9592921596419
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 22.541
- type: map_at_10
value: 34.625
- type: map_at_100
value: 35.785
- type: map_at_1000
value: 35.831
- type: map_at_3
value: 30.823
- type: map_at_5
value: 32.967999999999996
- type: mrr_at_1
value: 23.180999999999997
- type: mrr_at_10
value: 35.207
- type: mrr_at_100
value: 36.315
- type: mrr_at_1000
value: 36.355
- type: mrr_at_3
value: 31.483
- type: mrr_at_5
value: 33.589999999999996
- type: ndcg_at_1
value: 23.195
- type: ndcg_at_10
value: 41.461
- type: ndcg_at_100
value: 47.032000000000004
- type: ndcg_at_1000
value: 48.199999999999996
- type: ndcg_at_3
value: 33.702
- type: ndcg_at_5
value: 37.522
- type: precision_at_1
value: 23.195
- type: precision_at_10
value: 6.526999999999999
- type: precision_at_100
value: 0.932
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 14.308000000000002
- type: precision_at_5
value: 10.507
- type: recall_at_1
value: 22.541
- type: recall_at_10
value: 62.524
- type: recall_at_100
value: 88.228
- type: recall_at_1000
value: 97.243
- type: recall_at_3
value: 41.38
- type: recall_at_5
value: 50.55
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 92.69949840401279
- type: f1
value: 92.54141471311786
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 72.56041951664386
- type: f1
value: 55.88499977508287
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.62071284465365
- type: f1
value: 69.36717546572152
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.35843981170142
- type: f1
value: 76.15496453538884
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.33664956793118
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 27.883839621715524
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.096874986740758
- type: mrr
value: 30.97300481932132
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.4
- type: map_at_10
value: 11.852
- type: map_at_100
value: 14.758
- type: map_at_1000
value: 16.134
- type: map_at_3
value: 8.558
- type: map_at_5
value: 10.087
- type: mrr_at_1
value: 44.272
- type: mrr_at_10
value: 52.05800000000001
- type: mrr_at_100
value: 52.689
- type: mrr_at_1000
value: 52.742999999999995
- type: mrr_at_3
value: 50.205999999999996
- type: mrr_at_5
value: 51.367
- type: ndcg_at_1
value: 42.57
- type: ndcg_at_10
value: 32.449
- type: ndcg_at_100
value: 29.596
- type: ndcg_at_1000
value: 38.351
- type: ndcg_at_3
value: 37.044
- type: ndcg_at_5
value: 35.275
- type: precision_at_1
value: 44.272
- type: precision_at_10
value: 23.87
- type: precision_at_100
value: 7.625
- type: precision_at_1000
value: 2.045
- type: precision_at_3
value: 34.365
- type: precision_at_5
value: 30.341
- type: recall_at_1
value: 5.4
- type: recall_at_10
value: 15.943999999999999
- type: recall_at_100
value: 29.805
- type: recall_at_1000
value: 61.695
- type: recall_at_3
value: 9.539
- type: recall_at_5
value: 12.127
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.047000000000004
- type: map_at_10
value: 51.6
- type: map_at_100
value: 52.449999999999996
- type: map_at_1000
value: 52.476
- type: map_at_3
value: 47.452
- type: map_at_5
value: 49.964
- type: mrr_at_1
value: 40.382
- type: mrr_at_10
value: 54.273
- type: mrr_at_100
value: 54.859
- type: mrr_at_1000
value: 54.876000000000005
- type: mrr_at_3
value: 51.014
- type: mrr_at_5
value: 52.983999999999995
- type: ndcg_at_1
value: 40.353
- type: ndcg_at_10
value: 59.11300000000001
- type: ndcg_at_100
value: 62.604000000000006
- type: ndcg_at_1000
value: 63.187000000000005
- type: ndcg_at_3
value: 51.513
- type: ndcg_at_5
value: 55.576
- type: precision_at_1
value: 40.353
- type: precision_at_10
value: 9.418
- type: precision_at_100
value: 1.1440000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.078000000000003
- type: precision_at_5
value: 16.250999999999998
- type: recall_at_1
value: 36.047000000000004
- type: recall_at_10
value: 79.22200000000001
- type: recall_at_100
value: 94.23
- type: recall_at_1000
value: 98.51100000000001
- type: recall_at_3
value: 59.678
- type: recall_at_5
value: 68.967
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 68.232
- type: map_at_10
value: 81.674
- type: map_at_100
value: 82.338
- type: map_at_1000
value: 82.36099999999999
- type: map_at_3
value: 78.833
- type: map_at_5
value: 80.58
- type: mrr_at_1
value: 78.64
- type: mrr_at_10
value: 85.164
- type: mrr_at_100
value: 85.317
- type: mrr_at_1000
value: 85.319
- type: mrr_at_3
value: 84.127
- type: mrr_at_5
value: 84.789
- type: ndcg_at_1
value: 78.63
- type: ndcg_at_10
value: 85.711
- type: ndcg_at_100
value: 87.238
- type: ndcg_at_1000
value: 87.444
- type: ndcg_at_3
value: 82.788
- type: ndcg_at_5
value: 84.313
- type: precision_at_1
value: 78.63
- type: precision_at_10
value: 12.977
- type: precision_at_100
value: 1.503
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 36.113
- type: precision_at_5
value: 23.71
- type: recall_at_1
value: 68.232
- type: recall_at_10
value: 93.30199999999999
- type: recall_at_100
value: 98.799
- type: recall_at_1000
value: 99.885
- type: recall_at_3
value: 84.827
- type: recall_at_5
value: 89.188
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 45.71879170816294
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 59.65866311751794
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.218
- type: map_at_10
value: 10.337
- type: map_at_100
value: 12.131
- type: map_at_1000
value: 12.411
- type: map_at_3
value: 7.4270000000000005
- type: map_at_5
value: 8.913
- type: mrr_at_1
value: 20.8
- type: mrr_at_10
value: 30.868000000000002
- type: mrr_at_100
value: 31.903
- type: mrr_at_1000
value: 31.972
- type: mrr_at_3
value: 27.367
- type: mrr_at_5
value: 29.372
- type: ndcg_at_1
value: 20.8
- type: ndcg_at_10
value: 17.765
- type: ndcg_at_100
value: 24.914
- type: ndcg_at_1000
value: 30.206
- type: ndcg_at_3
value: 16.64
- type: ndcg_at_5
value: 14.712
- type: precision_at_1
value: 20.8
- type: precision_at_10
value: 9.24
- type: precision_at_100
value: 1.9560000000000002
- type: precision_at_1000
value: 0.32299999999999995
- type: precision_at_3
value: 15.467
- type: precision_at_5
value: 12.94
- type: recall_at_1
value: 4.218
- type: recall_at_10
value: 18.752
- type: recall_at_100
value: 39.7
- type: recall_at_1000
value: 65.57300000000001
- type: recall_at_3
value: 9.428
- type: recall_at_5
value: 13.133000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.04338850207233
- type: cos_sim_spearman
value: 78.5054651430423
- type: euclidean_pearson
value: 80.30739451228612
- type: euclidean_spearman
value: 78.48377464299097
- type: manhattan_pearson
value: 80.40795049052781
- type: manhattan_spearman
value: 78.49506205443114
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.11596224442962
- type: cos_sim_spearman
value: 76.20997388935461
- type: euclidean_pearson
value: 80.56858451349109
- type: euclidean_spearman
value: 75.92659183871186
- type: manhattan_pearson
value: 80.60246102203844
- type: manhattan_spearman
value: 76.03018971432664
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 81.34691640755737
- type: cos_sim_spearman
value: 82.4018369631579
- type: euclidean_pearson
value: 81.87673092245366
- type: euclidean_spearman
value: 82.3671489960678
- type: manhattan_pearson
value: 81.88222387719948
- type: manhattan_spearman
value: 82.3816590344736
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 81.2836092579524
- type: cos_sim_spearman
value: 78.99982781772064
- type: euclidean_pearson
value: 80.5184271010527
- type: euclidean_spearman
value: 78.89777392101904
- type: manhattan_pearson
value: 80.53585705018664
- type: manhattan_spearman
value: 78.92898405472994
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.7349907750784
- type: cos_sim_spearman
value: 87.7611234446225
- type: euclidean_pearson
value: 86.98759326731624
- type: euclidean_spearman
value: 87.58321319424618
- type: manhattan_pearson
value: 87.03483090370842
- type: manhattan_spearman
value: 87.63278333060288
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 81.75873694924825
- type: cos_sim_spearman
value: 83.80237999094724
- type: euclidean_pearson
value: 83.55023725861537
- type: euclidean_spearman
value: 84.12744338577744
- type: manhattan_pearson
value: 83.58816983036232
- type: manhattan_spearman
value: 84.18520748676501
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.21630882940174
- type: cos_sim_spearman
value: 87.72382883437031
- type: euclidean_pearson
value: 88.69933350930333
- type: euclidean_spearman
value: 88.24660814383081
- type: manhattan_pearson
value: 88.77331018833499
- type: manhattan_spearman
value: 88.26109989380632
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 61.11854063060489
- type: cos_sim_spearman
value: 63.14678634195072
- type: euclidean_pearson
value: 61.679090067000864
- type: euclidean_spearman
value: 62.28876589509653
- type: manhattan_pearson
value: 62.082324165511004
- type: manhattan_spearman
value: 62.56030932816679
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.00319882832645
- type: cos_sim_spearman
value: 85.94529772647257
- type: euclidean_pearson
value: 85.6661390122756
- type: euclidean_spearman
value: 85.97747815545827
- type: manhattan_pearson
value: 85.58422770541893
- type: manhattan_spearman
value: 85.9237139181532
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.16198731863916
- type: mrr
value: 94.25202702163487
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 54.761
- type: map_at_10
value: 64.396
- type: map_at_100
value: 65.07
- type: map_at_1000
value: 65.09899999999999
- type: map_at_3
value: 61.846000000000004
- type: map_at_5
value: 63.284
- type: mrr_at_1
value: 57.667
- type: mrr_at_10
value: 65.83099999999999
- type: mrr_at_100
value: 66.36800000000001
- type: mrr_at_1000
value: 66.39399999999999
- type: mrr_at_3
value: 64.056
- type: mrr_at_5
value: 65.206
- type: ndcg_at_1
value: 57.667
- type: ndcg_at_10
value: 68.854
- type: ndcg_at_100
value: 71.59100000000001
- type: ndcg_at_1000
value: 72.383
- type: ndcg_at_3
value: 64.671
- type: ndcg_at_5
value: 66.796
- type: precision_at_1
value: 57.667
- type: precision_at_10
value: 9.167
- type: precision_at_100
value: 1.053
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 25.444
- type: precision_at_5
value: 16.667
- type: recall_at_1
value: 54.761
- type: recall_at_10
value: 80.9
- type: recall_at_100
value: 92.767
- type: recall_at_1000
value: 99
- type: recall_at_3
value: 69.672
- type: recall_at_5
value: 75.083
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.8079207920792
- type: cos_sim_ap
value: 94.88470927617445
- type: cos_sim_f1
value: 90.08179959100204
- type: cos_sim_precision
value: 92.15481171548117
- type: cos_sim_recall
value: 88.1
- type: dot_accuracy
value: 99.58613861386138
- type: dot_ap
value: 82.94822578881316
- type: dot_f1
value: 77.33333333333333
- type: dot_precision
value: 79.36842105263158
- type: dot_recall
value: 75.4
- type: euclidean_accuracy
value: 99.8069306930693
- type: euclidean_ap
value: 94.81367858031837
- type: euclidean_f1
value: 90.01009081735621
- type: euclidean_precision
value: 90.83503054989816
- type: euclidean_recall
value: 89.2
- type: manhattan_accuracy
value: 99.81188118811882
- type: manhattan_ap
value: 94.91405337220161
- type: manhattan_f1
value: 90.2763561924258
- type: manhattan_precision
value: 92.45283018867924
- type: manhattan_recall
value: 88.2
- type: max_accuracy
value: 99.81188118811882
- type: max_ap
value: 94.91405337220161
- type: max_f1
value: 90.2763561924258
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 58.511599500053094
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 31.984728147814707
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.93428193939015
- type: mrr
value: 50.916557911043206
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.562500894537145
- type: cos_sim_spearman
value: 31.162587976726307
- type: dot_pearson
value: 22.633662187735762
- type: dot_spearman
value: 22.723000282378962
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.219
- type: map_at_10
value: 1.871
- type: map_at_100
value: 10.487
- type: map_at_1000
value: 25.122
- type: map_at_3
value: 0.657
- type: map_at_5
value: 1.0699999999999998
- type: mrr_at_1
value: 84
- type: mrr_at_10
value: 89.567
- type: mrr_at_100
value: 89.748
- type: mrr_at_1000
value: 89.748
- type: mrr_at_3
value: 88.667
- type: mrr_at_5
value: 89.567
- type: ndcg_at_1
value: 80
- type: ndcg_at_10
value: 74.533
- type: ndcg_at_100
value: 55.839000000000006
- type: ndcg_at_1000
value: 49.748
- type: ndcg_at_3
value: 79.53099999999999
- type: ndcg_at_5
value: 78.245
- type: precision_at_1
value: 84
- type: precision_at_10
value: 78.4
- type: precision_at_100
value: 56.99999999999999
- type: precision_at_1000
value: 21.98
- type: precision_at_3
value: 85.333
- type: precision_at_5
value: 84.8
- type: recall_at_1
value: 0.219
- type: recall_at_10
value: 2.02
- type: recall_at_100
value: 13.555
- type: recall_at_1000
value: 46.739999999999995
- type: recall_at_3
value: 0.685
- type: recall_at_5
value: 1.13
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.5029999999999997
- type: map_at_10
value: 11.042
- type: map_at_100
value: 16.326999999999998
- type: map_at_1000
value: 17.836
- type: map_at_3
value: 6.174
- type: map_at_5
value: 7.979
- type: mrr_at_1
value: 42.857
- type: mrr_at_10
value: 52.617000000000004
- type: mrr_at_100
value: 53.351000000000006
- type: mrr_at_1000
value: 53.351000000000006
- type: mrr_at_3
value: 46.939
- type: mrr_at_5
value: 50.714000000000006
- type: ndcg_at_1
value: 38.775999999999996
- type: ndcg_at_10
value: 27.125
- type: ndcg_at_100
value: 35.845
- type: ndcg_at_1000
value: 47.377
- type: ndcg_at_3
value: 29.633
- type: ndcg_at_5
value: 28.378999999999998
- type: precision_at_1
value: 42.857
- type: precision_at_10
value: 24.082
- type: precision_at_100
value: 6.877999999999999
- type: precision_at_1000
value: 1.463
- type: precision_at_3
value: 29.932
- type: precision_at_5
value: 28.571
- type: recall_at_1
value: 3.5029999999999997
- type: recall_at_10
value: 17.068
- type: recall_at_100
value: 43.361
- type: recall_at_1000
value: 78.835
- type: recall_at_3
value: 6.821000000000001
- type: recall_at_5
value: 10.357
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.0954
- type: ap
value: 14.216844153511959
- type: f1
value: 54.63687418565117
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.46293152235427
- type: f1
value: 61.744177921638645
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 41.12708617788644
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.75430649102938
- type: cos_sim_ap
value: 73.34252536948081
- type: cos_sim_f1
value: 67.53758935173774
- type: cos_sim_precision
value: 63.3672525439408
- type: cos_sim_recall
value: 72.29551451187335
- type: dot_accuracy
value: 81.71305954580676
- type: dot_ap
value: 59.5532209082386
- type: dot_f1
value: 56.18466898954705
- type: dot_precision
value: 47.830923248053395
- type: dot_recall
value: 68.07387862796834
- type: euclidean_accuracy
value: 85.81987244441795
- type: euclidean_ap
value: 73.34325409809446
- type: euclidean_f1
value: 67.83451360417443
- type: euclidean_precision
value: 64.09955388588871
- type: euclidean_recall
value: 72.0316622691293
- type: manhattan_accuracy
value: 85.68277999642368
- type: manhattan_ap
value: 73.1535450121903
- type: manhattan_f1
value: 67.928237896289
- type: manhattan_precision
value: 63.56945722171113
- type: manhattan_recall
value: 72.9287598944591
- type: max_accuracy
value: 85.81987244441795
- type: max_ap
value: 73.34325409809446
- type: max_f1
value: 67.928237896289
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.90441262079403
- type: cos_sim_ap
value: 85.79331880741438
- type: cos_sim_f1
value: 78.31563529842548
- type: cos_sim_precision
value: 74.6683424102779
- type: cos_sim_recall
value: 82.33754234678165
- type: dot_accuracy
value: 84.89928978926534
- type: dot_ap
value: 75.25819218316
- type: dot_f1
value: 69.88730119720536
- type: dot_precision
value: 64.23362374959665
- type: dot_recall
value: 76.63227594702803
- type: euclidean_accuracy
value: 89.01695967710637
- type: euclidean_ap
value: 85.98986606038852
- type: euclidean_f1
value: 78.5277880014722
- type: euclidean_precision
value: 75.22211253701876
- type: euclidean_recall
value: 82.13735756082538
- type: manhattan_accuracy
value: 88.99561454573679
- type: manhattan_ap
value: 85.92262421793953
- type: manhattan_f1
value: 78.38866094740769
- type: manhattan_precision
value: 76.02373028505282
- type: manhattan_recall
value: 80.9054511857099
- type: max_accuracy
value: 89.01695967710637
- type: max_ap
value: 85.98986606038852
- type: max_f1
value: 78.5277880014722
---
# E5-small-v2
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 12 layers and the embedding size is 384.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
tokenizer = AutoTokenizer.from_pretrained('ggrn/e5-small-v2')
model = AutoModel.from_pretrained('ggrn/e5-small-v2')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# (Optionally) normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
## Training Details
Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).
## Citation
If you find our paper or models helpful, please consider citing as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
This model only works for English texts. Long texts will be truncated to at most 512 tokens.
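A quick way to check when truncation will occur (a sketch; the 512-token cap is the model's maximum sequence length):
```python
# Count tokens before encoding; anything past 512 tokens is discarded by truncation.
long_text = "passage: " + "protein intake guidance " * 200  # deliberately long input
n_tokens = len(tokenizer(long_text, truncation=False)["input_ids"])
if n_tokens > 512:
    print(f"{n_tokens} tokens; only the first 512 will be encoded.")
```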
## Sentence Transformers
Below is an example of usage with sentence_transformers. `pip install sentence_transformers~=2.2.2`
This is community contributed, and results may vary up to numerical precision.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('ggrn/e5-small-v2')
# Reuse the same "query: " / "passage: " prefix convention as in the example above.
input_texts = ['query: how much protein should a female eat',
               'passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain.']
embeddings = model.encode(input_texts, normalize_embeddings=True)
``` | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
EleutherAI/pythia-70m-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T18:31:25 | 2023-03-29T18:53:28 | 904 | 6 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-70M
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-70M for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-70M as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
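As a rough illustration only (not part of the official Pythia documentation), a minimal causal-LM fine-tuning sketch with the Transformers `Trainer`; the corpus file, model id, and hyperparameters are placeholders you should replace:
```python
from datasets import load_dataset
from transformers import (AutoTokenizer, GPTNeoXForCausalLM, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

model_name = "EleutherAI/pythia-70m-deduped"  # same model id as the Quickstart below
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token     # the Pythia tokenizer has no pad token by default
model = GPTNeoXForCausalLM.from_pretrained(model_name)

# Placeholder corpus: one document per line in a plain-text file.
raw = load_dataset("text", data_files={"train": "my_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="pythia-70m-finetuned",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```
The same risk and bias assessment mentioned above applies to any model produced this way.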
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-70M has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-70M will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-70M to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases regarding gender, religion, and race.
Pythia-70M may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-70M.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third evenly spaced `pythia-70m-deduped` checkpoint (`step3000`):
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-70M.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints are saved
for each model, one every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models were trained for the equivalent of 143000 steps at a batch
size of 2M (2,097,152) tokens. Two batch sizes were used: 2M and 4M tokens.
Models listed with a batch size of 4M tokens were originally trained for 71500
steps instead, with checkpoints saved every 500 steps. The checkpoints on
Hugging Face are renamed for consistency with the 2M-batch models, so `step1000`
is the first saved checkpoint for `pythia-1.4b` (corresponding to step 500 in
training), and `step1000` is likewise the first saved checkpoint for
`pythia-6.9b` (corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
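The checkpoint spacing and batch-size bookkeeping above can be sanity-checked with a few lines of arithmetic; the snippet below only restates the figures quoted in this section and is not part of the training code.
```python
# Numerical sanity check of the figures quoted above (illustration only).
tokens_per_checkpoint = 2_097_152_000
num_checkpoints = 143
assert num_checkpoints * tokens_per_checkpoint == 299_892_736_000  # total training tokens

# Models trained with a 4M-token (4,194,304) batch saved checkpoints every 500
# "actual" steps; after renaming, the saved name `stepN` corresponds to
# training step N / 2 for those models.
def actual_step(saved_step: int, batch_size_tokens: int) -> int:
    return saved_step // 2 if batch_size_tokens == 4_194_304 else saved_step

print(actual_step(1000, 4_194_304))  # 500  (e.g. pythia-1.4b)
print(actual_step(1000, 2_097_152))  # 1000 (e.g. pythia-6.9b)
```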
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
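To reproduce an evaluation locally, the harness can be invoked from the command line. The flags below reflect a recent release of the LM Evaluation Harness and are an assumption on our part rather than a command documented in this card; check the harness README for the interface matching your installed version.
```bash
# Illustrative invocation (flags may differ between harness versions).
pip install lm-eval
lm_eval \
  --model hf \
  --model_args pretrained=EleutherAI/pythia-70m,revision=step143000 \
  --tasks lambada_openai,piqa,winogrande,arc_challenge,sciq \
  --batch_size 16
```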
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. Some documentation may still use
the old naming convention by accident. The current naming convention (70M,
160M, etc.) is based on total parameter count.
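The two counts in the table below differ by the input and output embedding matrices. As a rough sanity check, the totals can be reconstructed from the model dimensions listed earlier, assuming the embedding vocabulary is padded to 50,304 rows; that assumption holds for the smaller models but not exactly for the 6.9B and 12B variants, whose vocabularies are padded differently.
```python
# Illustrative reconstruction of total parameter counts from the table below.
# Assumption: the embedding vocabulary is padded to 50,304 rows (not stated in
# this card); the 6.9B and 12B models use slightly different padding.
vocab = 50_304
models = {
    # name: (non-embedding params, model dim from the table above)
    "70M": (18_915_328, 512),
    "160M": (85_056_000, 768),
    "410M": (302_311_424, 1024),
    "1B": (805_736_448, 2048),
    "1.4B": (1_208_602_624, 2048),
    "2.8B": (2_517_652_480, 2560),
}
for name, (non_embed, d_model) in models.items():
    total = non_embed + 2 * vocab * d_model  # separate input and output embeddings
    print(f"{name}: {total:,}")
```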
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised | McGill-NLP | sentence-similarity | [
"peft",
"safetensors",
"text-embedding",
"embeddings",
"information-retrieval",
"beir",
"text-classification",
"language-model",
"text-clustering",
"text-semantic-similarity",
"text-evaluation",
"text-reranking",
"feature-extraction",
"sentence-similarity",
"Sentence Similarity",
"natural_questions",
"ms_marco",
"fever",
"hotpot_qa",
"mteb",
"en",
"arxiv:2404.05961",
"license:mit",
"model-index",
"region:us"
] | 2024-04-04T03:33:56 | 2024-04-11T20:10:34 | 894 | 13 | ---
language:
- en
library_name: peft
license: mit
pipeline_tag: sentence-similarity
tags:
- text-embedding
- embeddings
- information-retrieval
- beir
- text-classification
- language-model
- text-clustering
- text-semantic-similarity
- text-evaluation
- text-reranking
- feature-extraction
- sentence-similarity
- Sentence Similarity
- natural_questions
- ms_marco
- fever
- hotpot_qa
- mteb
model-index:
- name: LLM2Vec-Mistral-7B-supervised
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.58208955223881
- type: ap
value: 41.45474097979136
- type: f1
value: 71.76059891468786
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.12039999999999
- type: ap
value: 88.01002974730474
- type: f1
value: 91.1049266954883
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 49.966
- type: f1
value: 48.908221884634386
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.788000000000004
- type: map_at_10
value: 48.665000000000006
- type: map_at_100
value: 49.501
- type: map_at_1000
value: 49.504
- type: map_at_3
value: 43.883
- type: map_at_5
value: 46.501
- type: mrr_at_1
value: 33.357
- type: mrr_at_10
value: 48.882
- type: mrr_at_100
value: 49.718
- type: mrr_at_1000
value: 49.721
- type: mrr_at_3
value: 44.025999999999996
- type: mrr_at_5
value: 46.732
- type: ndcg_at_1
value: 32.788000000000004
- type: ndcg_at_10
value: 57.483
- type: ndcg_at_100
value: 60.745000000000005
- type: ndcg_at_1000
value: 60.797000000000004
- type: ndcg_at_3
value: 47.534
- type: ndcg_at_5
value: 52.266
- type: precision_at_1
value: 32.788000000000004
- type: precision_at_10
value: 8.57
- type: precision_at_100
value: 0.993
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 19.369
- type: precision_at_5
value: 13.926
- type: recall_at_1
value: 32.788000000000004
- type: recall_at_10
value: 85.70400000000001
- type: recall_at_100
value: 99.289
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 58.108000000000004
- type: recall_at_5
value: 69.63000000000001
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 42.805075760047906
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 44.235789514284214
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 63.98320383943591
- type: mrr
value: 76.53189992525174
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_spearman
value: 85.24411101959603
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.31493506493506
- type: f1
value: 88.28524975751309
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 34.27007175430729
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 35.52517776034658
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: cqadupstack/android
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.686
- type: map_at_10
value: 51.939
- type: map_at_100
value: 53.751000000000005
- type: map_at_1000
value: 53.846000000000004
- type: map_at_3
value: 48.296
- type: map_at_5
value: 50.312999999999995
- type: mrr_at_1
value: 49.641999999999996
- type: mrr_at_10
value: 59.157000000000004
- type: mrr_at_100
value: 59.85
- type: mrr_at_1000
value: 59.876
- type: mrr_at_3
value: 57.058
- type: mrr_at_5
value: 58.231
- type: ndcg_at_1
value: 49.641999999999996
- type: ndcg_at_10
value: 58.714
- type: ndcg_at_100
value: 63.776999999999994
- type: ndcg_at_1000
value: 64.95
- type: ndcg_at_3
value: 54.799
- type: ndcg_at_5
value: 56.372
- type: precision_at_1
value: 49.641999999999996
- type: precision_at_10
value: 11.373
- type: precision_at_100
value: 1.712
- type: precision_at_1000
value: 0.209
- type: precision_at_3
value: 27.229
- type: precision_at_5
value: 19.056
- type: recall_at_1
value: 38.686
- type: recall_at_10
value: 69.976
- type: recall_at_100
value: 90.512
- type: recall_at_1000
value: 97.64
- type: recall_at_3
value: 56.625
- type: recall_at_5
value: 62.348000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: cqadupstack/english
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.356
- type: map_at_10
value: 48.004000000000005
- type: map_at_100
value: 49.342999999999996
- type: map_at_1000
value: 49.461
- type: map_at_3
value: 44.692
- type: map_at_5
value: 46.576
- type: mrr_at_1
value: 46.561
- type: mrr_at_10
value: 54.547000000000004
- type: mrr_at_100
value: 55.159000000000006
- type: mrr_at_1000
value: 55.193000000000005
- type: mrr_at_3
value: 52.516
- type: mrr_at_5
value: 53.701
- type: ndcg_at_1
value: 46.561
- type: ndcg_at_10
value: 53.835
- type: ndcg_at_100
value: 57.92699999999999
- type: ndcg_at_1000
value: 59.671
- type: ndcg_at_3
value: 49.997
- type: ndcg_at_5
value: 51.714000000000006
- type: precision_at_1
value: 46.561
- type: precision_at_10
value: 10.344000000000001
- type: precision_at_100
value: 1.5779999999999998
- type: precision_at_1000
value: 0.202
- type: precision_at_3
value: 24.437
- type: precision_at_5
value: 17.197000000000003
- type: recall_at_1
value: 36.356
- type: recall_at_10
value: 63.019000000000005
- type: recall_at_100
value: 80.55099999999999
- type: recall_at_1000
value: 91.38300000000001
- type: recall_at_3
value: 50.431000000000004
- type: recall_at_5
value: 56.00000000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: cqadupstack/gaming
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 46.736
- type: map_at_10
value: 60.775999999999996
- type: map_at_100
value: 61.755
- type: map_at_1000
value: 61.783
- type: map_at_3
value: 57.293000000000006
- type: map_at_5
value: 59.382000000000005
- type: mrr_at_1
value: 54.232
- type: mrr_at_10
value: 64.424
- type: mrr_at_100
value: 64.996
- type: mrr_at_1000
value: 65.009
- type: mrr_at_3
value: 62.226000000000006
- type: mrr_at_5
value: 63.592000000000006
- type: ndcg_at_1
value: 54.232
- type: ndcg_at_10
value: 66.654
- type: ndcg_at_100
value: 70.152
- type: ndcg_at_1000
value: 70.648
- type: ndcg_at_3
value: 61.405
- type: ndcg_at_5
value: 64.137
- type: precision_at_1
value: 54.232
- type: precision_at_10
value: 10.607999999999999
- type: precision_at_100
value: 1.321
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 27.544
- type: precision_at_5
value: 18.645999999999997
- type: recall_at_1
value: 46.736
- type: recall_at_10
value: 80.10199999999999
- type: recall_at_100
value: 94.976
- type: recall_at_1000
value: 98.402
- type: recall_at_3
value: 66.094
- type: recall_at_5
value: 73.028
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: cqadupstack/gis
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.238
- type: map_at_10
value: 39.798
- type: map_at_100
value: 40.892
- type: map_at_1000
value: 40.971000000000004
- type: map_at_3
value: 36.788
- type: map_at_5
value: 38.511
- type: mrr_at_1
value: 32.994
- type: mrr_at_10
value: 42.028
- type: mrr_at_100
value: 42.959
- type: mrr_at_1000
value: 43.010999999999996
- type: mrr_at_3
value: 39.322
- type: mrr_at_5
value: 40.977000000000004
- type: ndcg_at_1
value: 32.994
- type: ndcg_at_10
value: 45.062000000000005
- type: ndcg_at_100
value: 50.166999999999994
- type: ndcg_at_1000
value: 51.961
- type: ndcg_at_3
value: 39.378
- type: ndcg_at_5
value: 42.281
- type: precision_at_1
value: 32.994
- type: precision_at_10
value: 6.836
- type: precision_at_100
value: 0.9860000000000001
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 16.384
- type: precision_at_5
value: 11.548
- type: recall_at_1
value: 30.238
- type: recall_at_10
value: 59.080999999999996
- type: recall_at_100
value: 82.033
- type: recall_at_1000
value: 95.281
- type: recall_at_3
value: 43.902
- type: recall_at_5
value: 50.952
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: cqadupstack/mathematica
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.512999999999998
- type: map_at_10
value: 31.339
- type: map_at_100
value: 32.651
- type: map_at_1000
value: 32.762
- type: map_at_3
value: 27.590999999999998
- type: map_at_5
value: 29.946
- type: mrr_at_1
value: 26.866
- type: mrr_at_10
value: 36.525
- type: mrr_at_100
value: 37.357
- type: mrr_at_1000
value: 37.419999999999995
- type: mrr_at_3
value: 33.085
- type: mrr_at_5
value: 35.379
- type: ndcg_at_1
value: 26.866
- type: ndcg_at_10
value: 37.621
- type: ndcg_at_100
value: 43.031000000000006
- type: ndcg_at_1000
value: 45.573
- type: ndcg_at_3
value: 31.046000000000003
- type: ndcg_at_5
value: 34.709
- type: precision_at_1
value: 26.866
- type: precision_at_10
value: 7.052
- type: precision_at_100
value: 1.117
- type: precision_at_1000
value: 0.145
- type: precision_at_3
value: 14.884
- type: precision_at_5
value: 11.517
- type: recall_at_1
value: 21.512999999999998
- type: recall_at_10
value: 51.751999999999995
- type: recall_at_100
value: 74.34100000000001
- type: recall_at_1000
value: 92.426
- type: recall_at_3
value: 34.008
- type: recall_at_5
value: 43.075
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: cqadupstack/physics
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 35.327
- type: map_at_10
value: 47.783
- type: map_at_100
value: 49.153999999999996
- type: map_at_1000
value: 49.260999999999996
- type: map_at_3
value: 44.145
- type: map_at_5
value: 46.207
- type: mrr_at_1
value: 44.37
- type: mrr_at_10
value: 53.864999999999995
- type: mrr_at_100
value: 54.625
- type: mrr_at_1000
value: 54.662
- type: mrr_at_3
value: 51.604000000000006
- type: mrr_at_5
value: 52.894
- type: ndcg_at_1
value: 44.37
- type: ndcg_at_10
value: 54.054
- type: ndcg_at_100
value: 59.168
- type: ndcg_at_1000
value: 60.769
- type: ndcg_at_3
value: 49.091
- type: ndcg_at_5
value: 51.444
- type: precision_at_1
value: 44.37
- type: precision_at_10
value: 9.827
- type: precision_at_100
value: 1.456
- type: precision_at_1000
value: 0.17600000000000002
- type: precision_at_3
value: 23.580000000000002
- type: precision_at_5
value: 16.554
- type: recall_at_1
value: 35.327
- type: recall_at_10
value: 66.43900000000001
- type: recall_at_100
value: 87.41600000000001
- type: recall_at_1000
value: 97.37400000000001
- type: recall_at_3
value: 51.64
- type: recall_at_5
value: 58.242000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: cqadupstack/programmers
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.397999999999996
- type: map_at_10
value: 44.932
- type: map_at_100
value: 46.336
- type: map_at_1000
value: 46.421
- type: map_at_3
value: 41.128
- type: map_at_5
value: 43.364999999999995
- type: mrr_at_1
value: 41.324
- type: mrr_at_10
value: 51.080000000000005
- type: mrr_at_100
value: 51.878
- type: mrr_at_1000
value: 51.910000000000004
- type: mrr_at_3
value: 48.382999999999996
- type: mrr_at_5
value: 50.004000000000005
- type: ndcg_at_1
value: 41.324
- type: ndcg_at_10
value: 51.466
- type: ndcg_at_100
value: 56.874
- type: ndcg_at_1000
value: 58.321999999999996
- type: ndcg_at_3
value: 45.928999999999995
- type: ndcg_at_5
value: 48.532
- type: precision_at_1
value: 41.324
- type: precision_at_10
value: 9.565999999999999
- type: precision_at_100
value: 1.428
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 22.184
- type: precision_at_5
value: 15.867999999999999
- type: recall_at_1
value: 32.397999999999996
- type: recall_at_10
value: 64.512
- type: recall_at_100
value: 87.425
- type: recall_at_1000
value: 96.937
- type: recall_at_3
value: 48.513
- type: recall_at_5
value: 55.721
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.001916666666666
- type: map_at_10
value: 42.91216666666667
- type: map_at_100
value: 44.21125000000001
- type: map_at_1000
value: 44.314166666666665
- type: map_at_3
value: 39.579
- type: map_at_5
value: 41.497166666666665
- type: mrr_at_1
value: 38.669583333333335
- type: mrr_at_10
value: 47.708
- type: mrr_at_100
value: 48.4875
- type: mrr_at_1000
value: 48.530833333333334
- type: mrr_at_3
value: 45.196333333333335
- type: mrr_at_5
value: 46.702999999999996
- type: ndcg_at_1
value: 38.669583333333335
- type: ndcg_at_10
value: 48.842
- type: ndcg_at_100
value: 53.79400000000001
- type: ndcg_at_1000
value: 55.566416666666676
- type: ndcg_at_3
value: 43.70975
- type: ndcg_at_5
value: 46.204499999999996
- type: precision_at_1
value: 38.669583333333335
- type: precision_at_10
value: 8.652999999999999
- type: precision_at_100
value: 1.3168333333333333
- type: precision_at_1000
value: 0.164
- type: precision_at_3
value: 20.343249999999998
- type: precision_at_5
value: 14.426
- type: recall_at_1
value: 32.001916666666666
- type: recall_at_10
value: 61.31158333333334
- type: recall_at_100
value: 82.80691666666667
- type: recall_at_1000
value: 94.977
- type: recall_at_3
value: 46.63558333333333
- type: recall_at_5
value: 53.32383333333334
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: cqadupstack/stats
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.311999999999998
- type: map_at_10
value: 37.735
- type: map_at_100
value: 38.702
- type: map_at_1000
value: 38.803
- type: map_at_3
value: 35.17
- type: map_at_5
value: 36.6
- type: mrr_at_1
value: 33.282000000000004
- type: mrr_at_10
value: 41.059
- type: mrr_at_100
value: 41.881
- type: mrr_at_1000
value: 41.943000000000005
- type: mrr_at_3
value: 38.829
- type: mrr_at_5
value: 40.11
- type: ndcg_at_1
value: 33.282000000000004
- type: ndcg_at_10
value: 42.625
- type: ndcg_at_100
value: 47.313
- type: ndcg_at_1000
value: 49.683
- type: ndcg_at_3
value: 38.043
- type: ndcg_at_5
value: 40.217999999999996
- type: precision_at_1
value: 33.282000000000004
- type: precision_at_10
value: 6.748
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 16.462
- type: precision_at_5
value: 11.411
- type: recall_at_1
value: 29.311999999999998
- type: recall_at_10
value: 54.294
- type: recall_at_100
value: 75.82
- type: recall_at_1000
value: 93.19800000000001
- type: recall_at_3
value: 41.382999999999996
- type: recall_at_5
value: 46.898
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: cqadupstack/tex
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.823
- type: map_at_10
value: 31.682
- type: map_at_100
value: 32.864
- type: map_at_1000
value: 32.988
- type: map_at_3
value: 28.878999999999998
- type: map_at_5
value: 30.459000000000003
- type: mrr_at_1
value: 28.63
- type: mrr_at_10
value: 36.672
- type: mrr_at_100
value: 37.519999999999996
- type: mrr_at_1000
value: 37.588
- type: mrr_at_3
value: 34.262
- type: mrr_at_5
value: 35.653
- type: ndcg_at_1
value: 28.63
- type: ndcg_at_10
value: 37.158
- type: ndcg_at_100
value: 42.4
- type: ndcg_at_1000
value: 45.001000000000005
- type: ndcg_at_3
value: 32.529
- type: ndcg_at_5
value: 34.673
- type: precision_at_1
value: 28.63
- type: precision_at_10
value: 6.848
- type: precision_at_100
value: 1.111
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 15.623000000000001
- type: precision_at_5
value: 11.218
- type: recall_at_1
value: 22.823
- type: recall_at_10
value: 48.559000000000005
- type: recall_at_100
value: 72.048
- type: recall_at_1000
value: 90.322
- type: recall_at_3
value: 35.134
- type: recall_at_5
value: 40.897
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: cqadupstack/unix
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.79
- type: map_at_10
value: 43.578
- type: map_at_100
value: 44.782
- type: map_at_1000
value: 44.869
- type: map_at_3
value: 39.737
- type: map_at_5
value: 41.92
- type: mrr_at_1
value: 39.086
- type: mrr_at_10
value: 48.135
- type: mrr_at_100
value: 48.949
- type: mrr_at_1000
value: 48.995
- type: mrr_at_3
value: 45.086999999999996
- type: mrr_at_5
value: 46.939
- type: ndcg_at_1
value: 39.086
- type: ndcg_at_10
value: 49.736999999999995
- type: ndcg_at_100
value: 54.818999999999996
- type: ndcg_at_1000
value: 56.515
- type: ndcg_at_3
value: 43.503
- type: ndcg_at_5
value: 46.499
- type: precision_at_1
value: 39.086
- type: precision_at_10
value: 8.685
- type: precision_at_100
value: 1.2449999999999999
- type: precision_at_1000
value: 0.148
- type: precision_at_3
value: 19.963
- type: precision_at_5
value: 14.366000000000001
- type: recall_at_1
value: 32.79
- type: recall_at_10
value: 63.766
- type: recall_at_100
value: 85.465
- type: recall_at_1000
value: 96.90299999999999
- type: recall_at_3
value: 46.515
- type: recall_at_5
value: 54.178000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: cqadupstack/webmasters
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.896
- type: map_at_10
value: 41.241
- type: map_at_100
value: 43.178
- type: map_at_1000
value: 43.395
- type: map_at_3
value: 37.702999999999996
- type: map_at_5
value: 39.524
- type: mrr_at_1
value: 36.364000000000004
- type: mrr_at_10
value: 46.184999999999995
- type: mrr_at_100
value: 47.051
- type: mrr_at_1000
value: 47.085
- type: mrr_at_3
value: 43.478
- type: mrr_at_5
value: 44.98
- type: ndcg_at_1
value: 36.364000000000004
- type: ndcg_at_10
value: 48.044
- type: ndcg_at_100
value: 53.818999999999996
- type: ndcg_at_1000
value: 55.504
- type: ndcg_at_3
value: 42.604
- type: ndcg_at_5
value: 44.971
- type: precision_at_1
value: 36.364000000000004
- type: precision_at_10
value: 9.664
- type: precision_at_100
value: 1.917
- type: precision_at_1000
value: 0.255
- type: precision_at_3
value: 20.487
- type: precision_at_5
value: 14.862
- type: recall_at_1
value: 29.896
- type: recall_at_10
value: 60.28
- type: recall_at_100
value: 86.271
- type: recall_at_1000
value: 97.121
- type: recall_at_3
value: 44.885999999999996
- type: recall_at_5
value: 51.351
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: cqadupstack/wordpress
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.948
- type: map_at_10
value: 36.138999999999996
- type: map_at_100
value: 37.126999999999995
- type: map_at_1000
value: 37.21
- type: map_at_3
value: 33.526
- type: map_at_5
value: 35.163
- type: mrr_at_1
value: 30.684
- type: mrr_at_10
value: 38.818999999999996
- type: mrr_at_100
value: 39.625
- type: mrr_at_1000
value: 39.678000000000004
- type: mrr_at_3
value: 36.506
- type: mrr_at_5
value: 37.976
- type: ndcg_at_1
value: 30.684
- type: ndcg_at_10
value: 41.134
- type: ndcg_at_100
value: 46.081
- type: ndcg_at_1000
value: 48.199999999999996
- type: ndcg_at_3
value: 36.193
- type: ndcg_at_5
value: 38.903999999999996
- type: precision_at_1
value: 30.684
- type: precision_at_10
value: 6.285
- type: precision_at_100
value: 0.9520000000000001
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 15.342
- type: precision_at_5
value: 10.869
- type: recall_at_1
value: 27.948
- type: recall_at_10
value: 53.959
- type: recall_at_100
value: 76.825
- type: recall_at_1000
value: 92.73700000000001
- type: recall_at_3
value: 40.495999999999995
- type: recall_at_5
value: 47.196
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.27
- type: map_at_10
value: 25.570999999999998
- type: map_at_100
value: 27.664
- type: map_at_1000
value: 27.848
- type: map_at_3
value: 21.224
- type: map_at_5
value: 23.508000000000003
- type: mrr_at_1
value: 34.137
- type: mrr_at_10
value: 46.583000000000006
- type: mrr_at_100
value: 47.339999999999996
- type: mrr_at_1000
value: 47.370000000000005
- type: mrr_at_3
value: 43.376999999999995
- type: mrr_at_5
value: 45.26
- type: ndcg_at_1
value: 34.137
- type: ndcg_at_10
value: 35.189
- type: ndcg_at_100
value: 42.568
- type: ndcg_at_1000
value: 45.660000000000004
- type: ndcg_at_3
value: 28.965000000000003
- type: ndcg_at_5
value: 31.169999999999998
- type: precision_at_1
value: 34.137
- type: precision_at_10
value: 10.971
- type: precision_at_100
value: 1.8870000000000002
- type: precision_at_1000
value: 0.247
- type: precision_at_3
value: 21.368000000000002
- type: precision_at_5
value: 16.573
- type: recall_at_1
value: 15.27
- type: recall_at_10
value: 41.516999999999996
- type: recall_at_100
value: 66.486
- type: recall_at_1000
value: 83.533
- type: recall_at_3
value: 26.325
- type: recall_at_5
value: 32.574
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.982000000000001
- type: map_at_10
value: 23.724999999999998
- type: map_at_100
value: 33.933
- type: map_at_1000
value: 35.965
- type: map_at_3
value: 16.158
- type: map_at_5
value: 19.433
- type: mrr_at_1
value: 75.75
- type: mrr_at_10
value: 82.065
- type: mrr_at_100
value: 82.334
- type: mrr_at_1000
value: 82.34
- type: mrr_at_3
value: 80.708
- type: mrr_at_5
value: 81.671
- type: ndcg_at_1
value: 63.625
- type: ndcg_at_10
value: 49.576
- type: ndcg_at_100
value: 53.783
- type: ndcg_at_1000
value: 61.012
- type: ndcg_at_3
value: 53.822
- type: ndcg_at_5
value: 51.72
- type: precision_at_1
value: 75.75
- type: precision_at_10
value: 39.925
- type: precision_at_100
value: 12.525
- type: precision_at_1000
value: 2.399
- type: precision_at_3
value: 56.667
- type: precision_at_5
value: 50.5
- type: recall_at_1
value: 9.982000000000001
- type: recall_at_10
value: 29.325000000000003
- type: recall_at_100
value: 59.181
- type: recall_at_1000
value: 82.095
- type: recall_at_3
value: 17.338
- type: recall_at_5
value: 22.216
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 52.04500000000001
- type: f1
value: 47.32462453881906
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 78.68
- type: map_at_10
value: 86.207
- type: map_at_100
value: 86.375
- type: map_at_1000
value: 86.388
- type: map_at_3
value: 85.35199999999999
- type: map_at_5
value: 85.954
- type: mrr_at_1
value: 84.923
- type: mrr_at_10
value: 90.902
- type: mrr_at_100
value: 90.952
- type: mrr_at_1000
value: 90.952
- type: mrr_at_3
value: 90.489
- type: mrr_at_5
value: 90.822
- type: ndcg_at_1
value: 84.923
- type: ndcg_at_10
value: 89.403
- type: ndcg_at_100
value: 90.023
- type: ndcg_at_1000
value: 90.235
- type: ndcg_at_3
value: 88.24300000000001
- type: ndcg_at_5
value: 89.005
- type: precision_at_1
value: 84.923
- type: precision_at_10
value: 10.495000000000001
- type: precision_at_100
value: 1.103
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 33.358
- type: precision_at_5
value: 20.579
- type: recall_at_1
value: 78.68
- type: recall_at_10
value: 94.622
- type: recall_at_100
value: 97.083
- type: recall_at_1000
value: 98.348
- type: recall_at_3
value: 91.499
- type: recall_at_5
value: 93.486
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.781
- type: map_at_10
value: 44.669
- type: map_at_100
value: 46.831
- type: map_at_1000
value: 46.96
- type: map_at_3
value: 38.714
- type: map_at_5
value: 42.186
- type: mrr_at_1
value: 51.235
- type: mrr_at_10
value: 60.083
- type: mrr_at_100
value: 60.675999999999995
- type: mrr_at_1000
value: 60.706
- type: mrr_at_3
value: 57.665
- type: mrr_at_5
value: 59.084
- type: ndcg_at_1
value: 51.235
- type: ndcg_at_10
value: 53.111
- type: ndcg_at_100
value: 59.57900000000001
- type: ndcg_at_1000
value: 61.57
- type: ndcg_at_3
value: 48.397
- type: ndcg_at_5
value: 50.169
- type: precision_at_1
value: 51.235
- type: precision_at_10
value: 14.877
- type: precision_at_100
value: 2.173
- type: precision_at_1000
value: 0.253
- type: precision_at_3
value: 32.87
- type: precision_at_5
value: 24.29
- type: recall_at_1
value: 25.781
- type: recall_at_10
value: 61.464
- type: recall_at_100
value: 84.244
- type: recall_at_1000
value: 96.039
- type: recall_at_3
value: 44.105
- type: recall_at_5
value: 52.205999999999996
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.041
- type: map_at_10
value: 66.622
- type: map_at_100
value: 67.472
- type: map_at_1000
value: 67.52
- type: map_at_3
value: 62.81099999999999
- type: map_at_5
value: 65.23
- type: mrr_at_1
value: 78.082
- type: mrr_at_10
value: 83.827
- type: mrr_at_100
value: 84.03
- type: mrr_at_1000
value: 84.036
- type: mrr_at_3
value: 82.894
- type: mrr_at_5
value: 83.482
- type: ndcg_at_1
value: 78.082
- type: ndcg_at_10
value: 74.068
- type: ndcg_at_100
value: 76.981
- type: ndcg_at_1000
value: 77.887
- type: ndcg_at_3
value: 68.77600000000001
- type: ndcg_at_5
value: 71.763
- type: precision_at_1
value: 78.082
- type: precision_at_10
value: 15.822
- type: precision_at_100
value: 1.807
- type: precision_at_1000
value: 0.193
- type: precision_at_3
value: 44.956
- type: precision_at_5
value: 29.332
- type: recall_at_1
value: 39.041
- type: recall_at_10
value: 79.109
- type: recall_at_100
value: 90.371
- type: recall_at_1000
value: 96.313
- type: recall_at_3
value: 67.43400000000001
- type: recall_at_5
value: 73.329
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 87.422
- type: ap
value: 83.07360776629146
- type: f1
value: 87.38583428778229
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.715999999999998
- type: map_at_10
value: 34.821000000000005
- type: map_at_100
value: 36.022999999999996
- type: map_at_1000
value: 36.067
- type: map_at_3
value: 30.666
- type: map_at_5
value: 33.134
- type: mrr_at_1
value: 22.421
- type: mrr_at_10
value: 35.461
- type: mrr_at_100
value: 36.6
- type: mrr_at_1000
value: 36.638
- type: mrr_at_3
value: 31.413999999999998
- type: mrr_at_5
value: 33.823
- type: ndcg_at_1
value: 22.421
- type: ndcg_at_10
value: 42.169000000000004
- type: ndcg_at_100
value: 47.887
- type: ndcg_at_1000
value: 48.939
- type: ndcg_at_3
value: 33.786
- type: ndcg_at_5
value: 38.164
- type: precision_at_1
value: 22.421
- type: precision_at_10
value: 6.773999999999999
- type: precision_at_100
value: 0.962
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 14.575
- type: precision_at_5
value: 10.963000000000001
- type: recall_at_1
value: 21.715999999999998
- type: recall_at_10
value: 64.75999999999999
- type: recall_at_100
value: 91.015
- type: recall_at_1000
value: 98.96000000000001
- type: recall_at_3
value: 42.089999999999996
- type: recall_at_5
value: 52.578
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.04195166438669
- type: f1
value: 95.76962987454031
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 84.76744186046513
- type: f1
value: 70.3328215706764
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 79.29051782111635
- type: f1
value: 77.0837414890434
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 81.64425016812373
- type: f1
value: 81.36288379329044
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.0673311773222
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.266850505047234
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.49575275757744
- type: mrr
value: 32.64979714009148
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.151
- type: map_at_10
value: 14.879999999999999
- type: map_at_100
value: 19.445999999999998
- type: map_at_1000
value: 21.101
- type: map_at_3
value: 10.613999999999999
- type: map_at_5
value: 12.709000000000001
- type: mrr_at_1
value: 51.393
- type: mrr_at_10
value: 59.935
- type: mrr_at_100
value: 60.455000000000005
- type: mrr_at_1000
value: 60.485
- type: mrr_at_3
value: 57.894999999999996
- type: mrr_at_5
value: 59.303
- type: ndcg_at_1
value: 50.0
- type: ndcg_at_10
value: 39.324999999999996
- type: ndcg_at_100
value: 37.133
- type: ndcg_at_1000
value: 45.663
- type: ndcg_at_3
value: 45.294000000000004
- type: ndcg_at_5
value: 42.88
- type: precision_at_1
value: 51.393
- type: precision_at_10
value: 29.412
- type: precision_at_100
value: 9.666
- type: precision_at_1000
value: 2.263
- type: precision_at_3
value: 42.415000000000006
- type: precision_at_5
value: 37.399
- type: recall_at_1
value: 6.151
- type: recall_at_10
value: 19.121
- type: recall_at_100
value: 39.012
- type: recall_at_1000
value: 70.726
- type: recall_at_3
value: 11.855
- type: recall_at_5
value: 15.204
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.382
- type: map_at_10
value: 53.657
- type: map_at_100
value: 54.547999999999995
- type: map_at_1000
value: 54.562999999999995
- type: map_at_3
value: 49.236999999999995
- type: map_at_5
value: 51.949
- type: mrr_at_1
value: 41.309000000000005
- type: mrr_at_10
value: 56.25599999999999
- type: mrr_at_100
value: 56.855999999999995
- type: mrr_at_1000
value: 56.867000000000004
- type: mrr_at_3
value: 52.891999999999996
- type: mrr_at_5
value: 54.99699999999999
- type: ndcg_at_1
value: 41.28
- type: ndcg_at_10
value: 61.702999999999996
- type: ndcg_at_100
value: 65.092
- type: ndcg_at_1000
value: 65.392
- type: ndcg_at_3
value: 53.722
- type: ndcg_at_5
value: 58.11300000000001
- type: precision_at_1
value: 41.28
- type: precision_at_10
value: 10.014000000000001
- type: precision_at_100
value: 1.187
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 24.614
- type: precision_at_5
value: 17.317
- type: recall_at_1
value: 36.382
- type: recall_at_10
value: 83.38600000000001
- type: recall_at_100
value: 97.528
- type: recall_at_1000
value: 99.696
- type: recall_at_3
value: 63.053000000000004
- type: recall_at_5
value: 73.16
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.577
- type: map_at_10
value: 83.944
- type: map_at_100
value: 84.604
- type: map_at_1000
value: 84.61800000000001
- type: map_at_3
value: 80.93599999999999
- type: map_at_5
value: 82.812
- type: mrr_at_1
value: 80.4
- type: mrr_at_10
value: 86.734
- type: mrr_at_100
value: 86.851
- type: mrr_at_1000
value: 86.85199999999999
- type: mrr_at_3
value: 85.75500000000001
- type: mrr_at_5
value: 86.396
- type: ndcg_at_1
value: 80.43
- type: ndcg_at_10
value: 87.75
- type: ndcg_at_100
value: 88.999
- type: ndcg_at_1000
value: 89.092
- type: ndcg_at_3
value: 84.88
- type: ndcg_at_5
value: 86.416
- type: precision_at_1
value: 80.43
- type: precision_at_10
value: 13.453000000000001
- type: precision_at_100
value: 1.539
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.403
- type: precision_at_5
value: 24.648
- type: recall_at_1
value: 69.577
- type: recall_at_10
value: 95.233
- type: recall_at_100
value: 99.531
- type: recall_at_1000
value: 99.984
- type: recall_at_3
value: 86.867
- type: recall_at_5
value: 91.254
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 60.23690763558931
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 64.12391112159126
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.288
- type: map_at_10
value: 13.611999999999998
- type: map_at_100
value: 15.909
- type: map_at_1000
value: 16.235
- type: map_at_3
value: 9.644
- type: map_at_5
value: 11.559
- type: mrr_at_1
value: 26.1
- type: mrr_at_10
value: 37.571
- type: mrr_at_100
value: 38.72
- type: mrr_at_1000
value: 38.76
- type: mrr_at_3
value: 34.383
- type: mrr_at_5
value: 36.187999999999995
- type: ndcg_at_1
value: 26.1
- type: ndcg_at_10
value: 22.497
- type: ndcg_at_100
value: 31.098
- type: ndcg_at_1000
value: 36.434
- type: ndcg_at_3
value: 21.401
- type: ndcg_at_5
value: 18.66
- type: precision_at_1
value: 26.1
- type: precision_at_10
value: 11.67
- type: precision_at_100
value: 2.405
- type: precision_at_1000
value: 0.368
- type: precision_at_3
value: 20.0
- type: precision_at_5
value: 16.34
- type: recall_at_1
value: 5.288
- type: recall_at_10
value: 23.652
- type: recall_at_100
value: 48.79
- type: recall_at_1000
value: 74.703
- type: recall_at_3
value: 12.158
- type: recall_at_5
value: 16.582
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_spearman
value: 83.6969699802343
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_spearman
value: 78.8031221769135
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_spearman
value: 86.37435789895171
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_spearman
value: 84.04036612478626
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_spearman
value: 88.99055778929946
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_spearman
value: 87.22140434759893
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_spearman
value: 90.1862731405498
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_spearman
value: 67.67995229420237
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_spearman
value: 88.65370934976113
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 83.79832393152147
- type: mrr
value: 95.78404438698557
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 64.883
- type: map_at_10
value: 74.48
- type: map_at_100
value: 74.85000000000001
- type: map_at_1000
value: 74.861
- type: map_at_3
value: 71.596
- type: map_at_5
value: 73.545
- type: mrr_at_1
value: 67.667
- type: mrr_at_10
value: 75.394
- type: mrr_at_100
value: 75.644
- type: mrr_at_1000
value: 75.655
- type: mrr_at_3
value: 73.5
- type: mrr_at_5
value: 74.63300000000001
- type: ndcg_at_1
value: 67.667
- type: ndcg_at_10
value: 78.855
- type: ndcg_at_100
value: 80.361
- type: ndcg_at_1000
value: 80.624
- type: ndcg_at_3
value: 74.37899999999999
- type: ndcg_at_5
value: 76.89200000000001
- type: precision_at_1
value: 67.667
- type: precision_at_10
value: 10.267
- type: precision_at_100
value: 1.11
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 28.778
- type: precision_at_5
value: 19.133
- type: recall_at_1
value: 64.883
- type: recall_at_10
value: 91.2
- type: recall_at_100
value: 98.0
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 79.406
- type: recall_at_5
value: 85.578
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.85445544554456
- type: cos_sim_ap
value: 96.81785428870712
- type: cos_sim_f1
value: 92.67563527653213
- type: cos_sim_precision
value: 92.35352532274081
- type: cos_sim_recall
value: 93.0
- type: dot_accuracy
value: 99.75643564356436
- type: dot_ap
value: 94.46746929160422
- type: dot_f1
value: 87.74900398406375
- type: dot_precision
value: 87.40079365079364
- type: dot_recall
value: 88.1
- type: euclidean_accuracy
value: 99.85445544554456
- type: euclidean_ap
value: 96.59180137299155
- type: euclidean_f1
value: 92.48850281042411
- type: euclidean_precision
value: 94.56635318704284
- type: euclidean_recall
value: 90.5
- type: manhattan_accuracy
value: 99.85643564356435
- type: manhattan_ap
value: 96.66599616275849
- type: manhattan_f1
value: 92.69746646795828
- type: manhattan_precision
value: 92.10266535044423
- type: manhattan_recall
value: 93.30000000000001
- type: max_accuracy
value: 99.85643564356435
- type: max_ap
value: 96.81785428870712
- type: max_f1
value: 92.69746646795828
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 70.72970157362414
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.49706344517027
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 54.41010678297881
- type: mrr
value: 55.15095811051693
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.5030094989814
- type: cos_sim_spearman
value: 29.959138274084797
- type: dot_pearson
value: 29.740134155639076
- type: dot_spearman
value: 29.18174652067779
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22200000000000003
- type: map_at_10
value: 1.925
- type: map_at_100
value: 13.150999999999998
- type: map_at_1000
value: 33.410000000000004
- type: map_at_3
value: 0.631
- type: map_at_5
value: 0.9990000000000001
- type: mrr_at_1
value: 82.0
- type: mrr_at_10
value: 90.0
- type: mrr_at_100
value: 90.0
- type: mrr_at_1000
value: 90.0
- type: mrr_at_3
value: 89.0
- type: mrr_at_5
value: 90.0
- type: ndcg_at_1
value: 79.0
- type: ndcg_at_10
value: 77.69200000000001
- type: ndcg_at_100
value: 64.89
- type: ndcg_at_1000
value: 59.748999999999995
- type: ndcg_at_3
value: 79.296
- type: ndcg_at_5
value: 78.63
- type: precision_at_1
value: 82.0
- type: precision_at_10
value: 82.19999999999999
- type: precision_at_100
value: 67.52
- type: precision_at_1000
value: 26.512
- type: precision_at_3
value: 83.333
- type: precision_at_5
value: 83.2
- type: recall_at_1
value: 0.22200000000000003
- type: recall_at_10
value: 2.164
- type: recall_at_100
value: 16.608
- type: recall_at_1000
value: 56.89999999999999
- type: recall_at_3
value: 0.658
- type: recall_at_5
value: 1.084
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.8519999999999999
- type: map_at_10
value: 8.569
- type: map_at_100
value: 14.238999999999999
- type: map_at_1000
value: 15.876000000000001
- type: map_at_3
value: 3.9859999999999998
- type: map_at_5
value: 5.785
- type: mrr_at_1
value: 26.531
- type: mrr_at_10
value: 40.581
- type: mrr_at_100
value: 41.379
- type: mrr_at_1000
value: 41.388999999999996
- type: mrr_at_3
value: 35.034
- type: mrr_at_5
value: 38.299
- type: ndcg_at_1
value: 25.509999999999998
- type: ndcg_at_10
value: 22.18
- type: ndcg_at_100
value: 34.695
- type: ndcg_at_1000
value: 46.854
- type: ndcg_at_3
value: 23.112
- type: ndcg_at_5
value: 23.089000000000002
- type: precision_at_1
value: 26.531
- type: precision_at_10
value: 20.408
- type: precision_at_100
value: 7.428999999999999
- type: precision_at_1000
value: 1.559
- type: precision_at_3
value: 23.810000000000002
- type: precision_at_5
value: 23.265
- type: recall_at_1
value: 1.8519999999999999
- type: recall_at_10
value: 15.038000000000002
- type: recall_at_100
value: 46.499
- type: recall_at_1000
value: 84.11800000000001
- type: recall_at_3
value: 5.179
- type: recall_at_5
value: 8.758000000000001
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.26140000000001
- type: ap
value: 14.138284541193421
- type: f1
value: 53.715363590501916
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.136389360498015
- type: f1
value: 62.33290824449911
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 52.18306009684791
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 88.27561542588067
- type: cos_sim_ap
value: 80.59558041410928
- type: cos_sim_f1
value: 73.54724608388075
- type: cos_sim_precision
value: 70.55259331071255
- type: cos_sim_recall
value: 76.80738786279684
- type: dot_accuracy
value: 85.00923883888657
- type: dot_ap
value: 71.76942851966301
- type: dot_f1
value: 66.84518013631937
- type: dot_precision
value: 62.042476276547674
- type: dot_recall
value: 72.45382585751979
- type: euclidean_accuracy
value: 88.26965488466352
- type: euclidean_ap
value: 80.44398056118867
- type: euclidean_f1
value: 73.28244274809161
- type: euclidean_precision
value: 68.69806094182826
- type: euclidean_recall
value: 78.52242744063325
- type: manhattan_accuracy
value: 88.25773380222924
- type: manhattan_ap
value: 80.25000483445007
- type: manhattan_f1
value: 73.10447023956533
- type: manhattan_precision
value: 68.70937790157846
- type: manhattan_recall
value: 78.10026385224275
- type: max_accuracy
value: 88.27561542588067
- type: max_ap
value: 80.59558041410928
- type: max_f1
value: 73.54724608388075
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.52536189700004
- type: cos_sim_ap
value: 86.55972191277392
- type: cos_sim_f1
value: 79.31733569243245
- type: cos_sim_precision
value: 76.08372816632487
- type: cos_sim_recall
value: 82.83800431167231
- type: dot_accuracy
value: 87.77506112469437
- type: dot_ap
value: 82.92833178514168
- type: dot_f1
value: 76.12050479839702
- type: dot_precision
value: 70.03687172520861
- type: dot_recall
value: 83.3615645210964
- type: euclidean_accuracy
value: 89.3643031784841
- type: euclidean_ap
value: 86.45902920741383
- type: euclidean_f1
value: 79.4788514062154
- type: euclidean_precision
value: 76.32922160782645
- type: euclidean_recall
value: 82.89959963042809
- type: manhattan_accuracy
value: 89.38564830985369
- type: manhattan_ap
value: 86.47558438668958
- type: manhattan_f1
value: 79.46758328152997
- type: manhattan_precision
value: 75.67379343965457
- type: manhattan_recall
value: 83.66184170003079
- type: max_accuracy
value: 89.52536189700004
- type: max_ap
value: 86.55972191277392
- type: max_f1
value: 79.4788514062154
---
# LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders
> LLM2Vec is a simple recipe to convert decoder-only LLMs into text encoders. It consists of 3 simple steps: 1) enabling bidirectional attention, 2) masked next token prediction, and 3) unsupervised contrastive learning. The model can be further fine-tuned to achieve state-of-the-art performance.
- **Repository:** https://github.com/McGill-NLP/llm2vec
- **Paper:** https://arxiv.org/abs/2404.05961
## Installation
```bash
pip install llm2vec
```
## Usage
```python
from llm2vec import LLM2Vec
import torch
from transformers import AutoTokenizer, AutoModel, AutoConfig
from peft import PeftModel
# Loading base Mistral model, along with custom code that enables bidirectional attention in decoder-only LLMs. MNTP LoRA weights are merged into the base model.
tokenizer = AutoTokenizer.from_pretrained(
"McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp"
)
config = AutoConfig.from_pretrained(
"McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp", trust_remote_code=True
)
model = AutoModel.from_pretrained(
"McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp",
trust_remote_code=True,
config=config,
torch_dtype=torch.bfloat16,
device_map="cuda" if torch.cuda.is_available() else "cpu",
)
model = PeftModel.from_pretrained(
model,
"McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp",
)
model = model.merge_and_unload() # This can take several minutes on cpu
# Loading supervised model. This loads the trained LoRA weights on top of MNTP model. Hence the final weights are -- Base model + MNTP (LoRA) + supervised (LoRA).
model = PeftModel.from_pretrained(
model, "McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised"
)
# Wrapper for encoding and pooling operations
l2v = LLM2Vec(model, tokenizer, pooling_mode="mean", max_length=512)
# Encoding queries using instructions
instruction = (
"Given a web search query, retrieve relevant passages that answer the query:"
)
queries = [
[instruction, "how much protein should a female eat"],
[instruction, "summit define"],
]
q_reps = l2v.encode(queries)
# Encoding documents. Instructions are not required for documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
d_reps = l2v.encode(documents)
# Compute cosine similarity
q_reps_norm = torch.nn.functional.normalize(q_reps, p=2, dim=1)
d_reps_norm = torch.nn.functional.normalize(d_reps, p=2, dim=1)
cos_sim = torch.mm(q_reps_norm, d_reps_norm.transpose(0, 1))
print(cos_sim)
"""
tensor([[0.5485, 0.0551],
[0.0565, 0.5425]])
"""
```
## Questions
If you have any questions about the code, feel free to email Parishad (`[email protected]`) and Vaibhav (`[email protected]`). | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf | RichardErkhov | null | [
"gguf",
"arxiv:2408.06142",
"endpoints_compatible",
"region:us",
"conversational"
] | 2024-08-22T12:53:05 | 2024-08-22T14:51:18 | 885 | 0 | ---
{}
---
Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
Llama3-Med42-8B - GGUF
- Model creator: https://huggingface.co/m42-health/
- Original model: https://huggingface.co/m42-health/Llama3-Med42-8B/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Llama3-Med42-8B.Q2_K.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q2_K.gguf) | Q2_K | 2.96GB |
| [Llama3-Med42-8B.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.IQ3_XS.gguf) | IQ3_XS | 3.28GB |
| [Llama3-Med42-8B.IQ3_S.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.IQ3_S.gguf) | IQ3_S | 3.43GB |
| [Llama3-Med42-8B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q3_K_S.gguf) | Q3_K_S | 3.41GB |
| [Llama3-Med42-8B.IQ3_M.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.IQ3_M.gguf) | IQ3_M | 3.52GB |
| [Llama3-Med42-8B.Q3_K.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q3_K.gguf) | Q3_K | 3.74GB |
| [Llama3-Med42-8B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q3_K_M.gguf) | Q3_K_M | 3.74GB |
| [Llama3-Med42-8B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q3_K_L.gguf) | Q3_K_L | 4.03GB |
| [Llama3-Med42-8B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.IQ4_XS.gguf) | IQ4_XS | 4.18GB |
| [Llama3-Med42-8B.Q4_0.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q4_0.gguf) | Q4_0 | 4.34GB |
| [Llama3-Med42-8B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.IQ4_NL.gguf) | IQ4_NL | 4.38GB |
| [Llama3-Med42-8B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q4_K_S.gguf) | Q4_K_S | 4.37GB |
| [Llama3-Med42-8B.Q4_K.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q4_K.gguf) | Q4_K | 4.58GB |
| [Llama3-Med42-8B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q4_K_M.gguf) | Q4_K_M | 4.58GB |
| [Llama3-Med42-8B.Q4_1.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q4_1.gguf) | Q4_1 | 4.78GB |
| [Llama3-Med42-8B.Q5_0.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q5_0.gguf) | Q5_0 | 5.21GB |
| [Llama3-Med42-8B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q5_K_S.gguf) | Q5_K_S | 5.21GB |
| [Llama3-Med42-8B.Q5_K.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q5_K.gguf) | Q5_K | 5.34GB |
| [Llama3-Med42-8B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q5_K_M.gguf) | Q5_K_M | 5.34GB |
| [Llama3-Med42-8B.Q5_1.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q5_1.gguf) | Q5_1 | 5.65GB |
| [Llama3-Med42-8B.Q6_K.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q6_K.gguf) | Q6_K | 6.14GB |
| [Llama3-Med42-8B.Q8_0.gguf](https://huggingface.co/RichardErkhov/m42-health_-_Llama3-Med42-8B-gguf/blob/main/Llama3-Med42-8B.Q8_0.gguf) | Q8_0 | 7.95GB |
Original model description:
---
language:
- en
license: llama3
tags:
- m42
- health
- healthcare
- clinical-llm
pipeline_tag: text-generation
inference: false
license_name: llama3
---
# **Med42-v2 - A Suite of Clinically-aligned Large Language Models**
Med42-v2 is a suite of open-access clinical large language models (LLM) instruct and preference-tuned by M42 to expand access to medical knowledge. Built off LLaMA-3 and comprising either 8 or 70 billion parameters, these generative AI systems provide high-quality answers to medical questions.
## Key performance metrics:
- Med42-v2-70B outperforms GPT-4.0 in most of the MCQA tasks.
- Med42-v2-70B achieves a MedQA zero-shot performance of 79.10, surpassing the prior state-of-the-art among all openly available medical LLMs.
- Med42-v2-70B sits at the top of the Clinical Elo Rating Leaderboard.
|Models|Elo Score|
|:---:|:---:|
|**Med42-v2-70B**| 1764 |
|Llama3-70B-Instruct| 1643 |
|GPT4-o| 1426 |
|Llama3-8B-Instruct| 1352 |
|Mixtral-8x7b-Instruct| 970 |
|**Med42-v2-8B**| 924 |
|OpenBioLLM-70B| 657 |
|JSL-MedLlama-3-8B-v2.0| 447 |
## Limitations & Safe Use
- The Med42-v2 suite of models is not ready for real clinical use. Extensive human evaluation is still underway, as it is required to ensure safety.
- Potential for generating incorrect or harmful information.
- Risk of perpetuating biases in training data.
Use this suite of models responsibly! Do not rely on them for medical usage without rigorous safety testing.
## Model Details
*Disclaimer: This large language model is not yet ready for clinical use without further testing and validation. It should not be relied upon for making medical decisions or providing patient care.*
Starting from the Llama3 models, Med42-v2 was instruction-tuned using a dataset of ~1B tokens compiled from different open-access and high-quality sources, including medical flashcards, exam questions, and open-domain dialogues.
**Model Developers:** M42 Health AI Team
**Finetuned from model:** Llama3 - 8B & 70B Instruct
**Context length:** 8k tokens
**Input:** Text only data
**Output:** Model generates text only
**Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
**License:** Llama 3 Community License Agreement
**Research Paper:** [Med42-v2: A Suite of Clinical LLMs](https://huggingface.co/papers/2408.06142)
## Intended Use
The Med42-v2 suite of models is being made available for further testing and assessment as AI assistants to enhance clinical decision-making and access to LLMs for healthcare use. Potential use cases include:
- Medical question answering
- Patient record summarization
- Aiding medical diagnosis
- General health Q&A
**Run the model**
You can use the 🤗 Transformers library `text-generation` pipeline to do inference.
```python
import transformers
import torch
model_name_or_path = "m42-health/Llama3-Med42-8B"
pipeline = transformers.pipeline(
"text-generation",
model=model_name_or_path,
torch_dtype=torch.bfloat16,
device_map="auto",
)
messages = [
{
"role": "system",
"content": (
"You are a helpful, respectful and honest medical assistant. You are a second version of Med42 developed by the AI team at M42, UAE. "
"Always answer as helpfully as possible, while being safe. "
"Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. "
"Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. "
"If you don’t know the answer to a question, please don’t share false information."
),
},
{"role": "user", "content": "What are the symptoms of diabetes?"},
]
prompt = pipeline.tokenizer.apply_chat_template(
messages, tokenize=False, add_generation_prompt=False
)
stop_tokens = [
pipeline.tokenizer.eos_token_id,
pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]
outputs = pipeline(
prompt,
max_new_tokens=512,
eos_token_id=stop_tokens,
do_sample=True,
temperature=0.4,
top_k=150,
top_p=0.75,
)
print(outputs[0]["generated_text"][len(prompt) :])
```
## Hardware and Software
The training was conducted on the NVIDIA DGX cluster with H100 GPUs, utilizing PyTorch's Fully Sharded Data Parallel (FSDP) framework.
## Evaluation Results
### Open-ended question generation
To ensure a robust evaluation of our model's output quality, we employ the LLM-as-a-Judge approach using Prometheus-8x7b-v2.0. Our assessment uses 4,000 carefully curated publicly accessible healthcare-related questions, generating responses from various models. We then use Prometheus to conduct pairwise comparisons of the answers. Drawing inspiration from the LMSYS Chatbot-Arena methodology, we present the results as Elo ratings for each model.
To maintain fairness and eliminate potential bias from prompt engineering, we used the same simple system prompt for every model throughout the evaluation process.
Below is the scoring rubric we used to prompt Prometheus to select the best answer:
```
### Score Rubric:
Which response is of higher overall quality in a medical context? Consider:
* Relevance: Does it directly address the question?
* Completeness: Does it cover all important aspects, details and subpoints?
* Safety: Does it avoid unsafe practices and address potential risks?
* Ethics: Does it maintain confidentiality and avoid biases?
* Clarity: Is it professional, clear and easy to understand?
```
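For intuition, the sketch below shows how Elo ratings can be derived from the judge's pairwise verdicts. It is an illustrative approximation of the Chatbot-Arena-style procedure; the update constant, initial rating, and example battles are assumptions, not the exact code used to produce the leaderboard above.
```python
# Minimal sketch: deriving Elo ratings from pairwise judge verdicts (illustrative only).

def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that model A beats model B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def update_elo(battles, k: float = 4.0, initial: float = 1000.0) -> dict:
    """`battles` is an iterable of (model_a, model_b, winner) tuples, winner in {"a", "b", "tie"}."""
    ratings = {}
    for model_a, model_b, winner in battles:
        ra = ratings.setdefault(model_a, initial)
        rb = ratings.setdefault(model_b, initial)
        score_a = {"a": 1.0, "b": 0.0, "tie": 0.5}[winner]
        ea = expected_score(ra, rb)
        ratings[model_a] = ra + k * (score_a - ea)
        ratings[model_b] = rb + k * ((1.0 - score_a) - (1.0 - ea))
    return ratings

# Hypothetical verdicts, just to show the bookkeeping.
battles = [
    ("Med42-v2-70B", "Llama3-70B-Instruct", "a"),
    ("Med42-v2-8B", "Mixtral-8x7b-Instruct", "a"),
    ("Med42-v2-70B", "GPT4-o", "tie"),
]
print(update_elo(battles))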
#### Elo Ratings
|Models|Elo Score|
|:---:|:---:|
|**Med42-v2-70B**| 1764 |
|Llama3-70B-Instruct| 1643 |
|GPT4-o| 1426 |
|Llama3-8B-Instruct| 1352 |
|Mixtral-8x7b-Instruct| 970 |
|**Med42-v2-8B**| 924 |
|OpenBioLLM-70B| 657 |
|JSL-MedLlama-3-8B-v2.0| 447 |
#### Win-rate

### MCQA Evaluation
Med42-v2 improves performance on every clinical benchmark compared to our previous version, including MedQA, MedMCQA, USMLE, MMLU clinical topics, and the MMLU Pro clinical subset. For all evaluations reported so far, we use [EleutherAI's evaluation harness library](https://github.com/EleutherAI/lm-evaluation-harness) and report zero-shot accuracies (unless otherwise stated). We integrated chat templates into the harness and computed the likelihood of the full answer instead of only the tokens "a.", "b.", "c.", or "d." (a minimal illustration of this scoring approach is sketched after the table below).
|Model|MMLU Pro|MMLU|MedMCQA|MedQA|USMLE|
|---:|:---:|:---:|:---:|:---:|:---:|
|**Med42v2-70B**|64.36|87.12|73.20|79.10|83.80|
|**Med42v2-8B**|54.30|75.76|61.34|62.84|67.04|
|OpenBioLLM-70B|64.24|90.40|73.18|76.90|79.01|
|GPT-4.0<sup>†</sup>|-|87.00|69.50|78.90|84.05|
|MedGemini*|-|-|-|84.00|-|
|Med-PaLM-2 (5-shot)*|-|87.77|71.30|79.70|-|
|Med42|-|76.72|60.90|61.50|71.85|
|ClinicalCamel-70B|-|69.75|47.00|53.40|54.30|
|GPT-3.5<sup>†</sup>|-|66.63|50.10|50.80|53.00|
|Llama3-8B-Instruct|48.24|72.89|59.65|61.64|60.38|
|Llama3-70B-Instruct|64.24|85.99|72.03|78.88|83.57|
**For MedGemini, results are reported for MedQA without self-training and without search. We note that 0-shot performance is not reported for Med-PaLM 2. Further details can be found at [https://github.com/m42health/med42](https://github.com/m42health/med42)*.
<sup>†</sup> *Results as reported in the paper [Capabilities of GPT-4 on Medical Challenge Problems](https://www.microsoft.com/en-us/research/uploads/prod/2023/03/GPT-4_medical_benchmarks.pdf)*.
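As referenced above, here is a minimal sketch of full-answer likelihood scoring. It is an illustrative assumption of the approach rather than the lm-evaluation-harness code used for the table: for brevity it skips the chat template, and the example question and options are made up.
```python
# Illustrative sketch: score each option by the total log-likelihood of its full
# answer text, instead of a single option letter. Not the harness code used above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "m42-health/Llama3-Med42-8B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")
model.eval()

def answer_loglikelihood(question: str, answer: str) -> float:
    prompt_ids = tokenizer(question, return_tensors="pt").input_ids.to(model.device)
    full_ids = tokenizer(question + " " + answer, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits[:, :-1].float(), dim=-1)  # predicts tokens 1..n-1
    targets = full_ids[:, 1:]
    token_ll = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    answer_start = prompt_ids.shape[1] - 1  # keep only the answer continuation
    return token_ll[:, answer_start:].sum().item()

# Hypothetical example, for illustration only.
question = "Which hormone is deficient in type 1 diabetes mellitus?"
options = ["Insulin", "Glucagon", "Cortisol", "Thyroxine"]
scores = {opt: answer_loglikelihood(question, opt) for opt in options}
print(max(scores, key=scores.get))
```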
## Accessing Med42 and Reporting Issues
Please report any software "bug" or other problems through one of the following means:
- Reporting issues with the model: [https://github.com/m42health/med42](https://github.com/m42health/med42)
- Reporting risky content generated by the model, bugs and/or any security concerns: [https://forms.office.com/r/fPY4Ksecgf](https://forms.office.com/r/fPY4Ksecgf)
- M42’s privacy policy available at [https://m42.ae/privacy-policy/](https://m42.ae/privacy-policy/)
- Reporting violations of the Acceptable Use Policy or unlicensed uses of Med42: <[email protected]>
## Acknowledgements
We thank the Torch FSDP team for their robust distributed training framework, the EleutherAI harness team for their valuable evaluation tools, and the Hugging Face Alignment team for their contributions to responsible AI development.
## Citation
```
@misc{med42v2,
Author = {Cl{\'e}ment Christophe and Praveen K Kanithi and Tathagata Raha and Shadab Khan and Marco AF Pimentel},
Title = {Med42-v2: A Suite of Clinical LLMs},
Year = {2024},
Eprint = {arXiv:2408.06142},
url={https://arxiv.org/abs/2408.06142},
}
```
| [
"QUESTION_ANSWERING",
"SUMMARIZATION"
] | [
"MEDQA"
] |
joe32140/ModernBERT-large-msmarco | joe32140 | sentence-similarity | [
"sentence-transformers",
"onnx",
"safetensors",
"modernbert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:11662655",
"loss:CachedMultipleNegativesRankingLoss",
"en",
"dataset:sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1",
"arxiv:1908.10084",
"arxiv:2101.06983",
"base_model:answerdotai/ModernBERT-large",
"base_model:finetune:answerdotai/ModernBERT-large",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-12-23T20:46:33 | 2025-01-26T00:03:26 | 874 | 2 | ---
base_model: answerdotai/ModernBERT-large
datasets:
- sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1
language:
- en
library_name: sentence-transformers
metrics:
- cosine_accuracy
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:11662655
- loss:CachedMultipleNegativesRankingLoss
base_model_relation: finetune
widget:
- source_sentence: what county is lyndhurst, ohio in
sentences:
- This article is about the song written by Kenneth Gamble, Leon Huff and Cary Gilbert.
For the Tina Turner song, see Don't Leave Me This Way (Tina Turner song). Don't
Leave Me This Way is a song written by Kenneth Gamble, Leon Huff and Cary Gilbert.
First charting as a hit for Harold Melvin & the Blue Notes featuring Teddy Pendergrass,
an act on Gamble & Huff's Philadelphia International label in 1975, Don't Leave
Me This Way was later a huge disco hit for Motown artist Thelma Houston in 1977.
- "Lyndhurst is a city in Cuyahoga County, Ohio, United States. The population was\
\ 14,001 at the 2010 census. Lyndhurst is located in northeastern Ohio, and is\
\ a suburb of Cleveland. A small part of Lyndhurst was originally part of Mayfield\
\ Township. It used to be called Euclidville before Lyndhurst was chosen. Lyndhurst\
\ is located at 41°31â\x80²17â\x80³N 81°29â\x80²25â\x80³W / 41.52139°N 81.49028°W\
\ / 41.52139; -81.49028 (41.521352, -81.490141)."
- Welcome to Trumbull County... Trumbull County, the county seat, located in Warren,
Ohio, consists of a combination of both urban and rural communities situated in
the northeast corner of Ohio. It is situated roughly between the Youngstown, Cleveland
and Akron corridors.
- source_sentence: who founded the american graphophone company
sentences:
- In 1886, Graham Bell and Charles Sumner Tainter founded the American Graphophone
Company to distribute and sell graphophones in the US and Canada under license
from the Volta Graphophone Company. In 1890, the American Graphophone Company
stopped production of new phonographs due to sagging orders.
- ShelfGenie How much does a ShelfGenie franchise cost? ShelfGenie has a franchise
fee of up to $45,000, with a total initial investment range of $70,100 to $107,750.
Local ShelfGenie franchise opportunities. ShelfGenie is looking to grow in a number
of cities around the country. To find out if there's a franchise opportunity in
your city, unlock more information.
- "A+E Networks. The technology that made the modern music business possible came\
\ into existence in the New Jersey laboratory where Thomas Alva Edison created\
\ the first device to both record sound and play it back. He was awarded U.S.\
\ Patent No. 200,521 for his inventionâ\x80\x93the phonographâ\x80\x93on this\
\ day in 1878."
- source_sentence: is housekeeping camp flooded?
sentences:
- 'What is the importance of housekeeping at work? A: Workplace housekeeping promotes
sanitation, safety, organization and productivity. It also boosts morale. Daily
housekeeping maintenance keeps the workplac... Full Answer >'
- The back patio area of a cabin is partially submerged in flood water at Housekeeping
Camp on Monday, Jan. 9, 2017, in Yosemite National Park. The Merced River, swollen
with storm runoff, crested at 12.7 feet at 4 a.m. SILVIA FLORES [email protected].
- "1 Bake for 8 minutes, then rotate the pan and check the underside of the bagels.\
\ 2 If theyâ\x80\x99re getting too dark, place another pan under the baking sheet.\
\ ( 3 Doubling the pan will insulate the first baking sheet.) Bake for another\
\ 8 to 12 minutes, until the bagels are a golden brown. 4 13."
- source_sentence: causes for infection in the nerve of tooth
sentences:
- If a cavity is causing the toothache, your dentist will fill the cavity or possibly
extract the tooth, if necessary. A root canal might be needed if the cause of
the toothache is determined to be an infection of the tooth's nerve. Bacteria
that have worked their way into the inner aspects of the tooth cause such an infection.
An antibiotic may be prescribed if there is fever or swelling of the jaw.
- "According to Article III, Section 1 of the Constitution, judges and justices\
\ of the Judicial Branch serve during good behavior.. This means they are appointed\
\ for life, unles â\x80¦ s they are impeached and removed from office. + 50 others\
\ found this useful.he term length for members of the House are two years and\
\ a staggering six years for members of the Senate."
- Inflamed or infected pulp (pulpitis) most often causes a toothache. To relieve
the pain and prevent further complications, the tooth may be extracted (surgically
removed) or saved by root canal treatment.
- source_sentence: what county is hayden in
sentences:
- Normally, the Lead Agency is the agency with general governmental powers such
as a city or a county. Agencies with limited powers or districts that provide
a public service/utility such as a recreation and park district will tend to be
a Responsible Agency.
- According to the United States Census Bureau, the city has a total area of 9.61
square miles (24.89 km2), of which 9.60 square miles (24.86 km2) is land and 0.01
square miles (0.03 km2) is water. It lies at the southwestern end of Hayden Lake,
and the elevation of the city is 2,287 feet (697 m) above sea level. Hayden is
located on U.S. Route 95 at the junction of Route 41. It is also four miles (6
km) north of Interstate 90 and Coeur d'Alene. The Coeur d'Alene airport is northwest
of Hayden.
- Hayden is a city in Kootenai County, Idaho, United States. Located in the northern
portion of the state, just north of Coeur d'Alene, its population was 13,294 at
the 2010 census.
model-index:
- name: SentenceTransformer based on answerdotai/ModernBERT-large
results:
- task:
type: triplet
name: Triplet
dataset:
name: msmarco co condenser dev
type: msmarco-co-condenser-dev
metrics:
- type: cosine_accuracy
value: 0.994
name: Cosine Accuracy
- task:
type: retrieval
dataset:
name: SCIDOCS
type: SCIDOCS
split: test
metrics:
- type: ndcg@10
value: 0.15789
- task:
type: retrieval
dataset:
name: FiQA2018
type: FiQA2018
split: test
metrics:
- type: ndcg@10
value: 0.33974
- task:
type: retrieval
dataset:
name: HotpotQA
type: HotpotQA
split: test
metrics:
- type: ndcg@10
value: 0.51818
- task:
type: retrieval
dataset:
name: ArguAna
type: ArguAna
split: test
metrics:
- type: ndcg@10
value: 0.47797
- task:
type: retrieval
dataset:
name: NFCorpus
type: NFCorpus
split: test
metrics:
- type: ndcg@10
value: 0.28443
- task:
type: retrieval
dataset:
name: SciFact
type: SciFact
split: test
metrics:
- type: ndcg@10
value: 0.60626
- task:
type: retrieval
dataset:
name: TRECCOVID
type: TRECCOVID
split: test
metrics:
- type: ndcg@10
value: 0.77495
---
# SentenceTransformer based on answerdotai/ModernBERT-large
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large) on the [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
I finetuned the model using the script [train_st.py](https://github.com/AnswerDotAI/ModernBERT/blob/main/examples/train_st.py) from the official repo on an RTX 4090 GPU, with the only change being that the mini-batch size of `CachedMultipleNegativesRankingLoss` was set to 64. Training for 1 epoch takes less than 2 hours.
The GradCache mini-batch size should not affect model performance, but the finetuned model performs better than the numbers reported in the paper.
Training logs can be found here: https://api.wandb.ai/links/joe32140/ekuauaao.
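For reference, a minimal sketch of this setup with the Sentence Transformers trainer is shown below. The dataset subset name and everything other than the mini-batch size are assumptions on my part; the official `train_st.py` remains the authoritative recipe.
```python
# Minimal sketch of the finetuning setup described above (illustrative, not the exact script).
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

model = SentenceTransformer("answerdotai/ModernBERT-large")

# Assumption: the subset with (query, positive, negative) columns.
train_dataset = load_dataset(
    "sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1",
    "triplet",
    split="train",
)

# The one change from the official script: GradCache mini-batch size of 64.
loss = CachedMultipleNegativesRankingLoss(model, mini_batch_size=64)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
model.save("modernbert-large-msmarco")
```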
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [answerdotai/ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large) <!-- at revision f87846cf8be76fceb18718f0245d18c8e6571215 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("joe32140/ModernBERT-large-msmarco")
# Run inference
sentences = [
'what county is hayden in',
"Hayden is a city in Kootenai County, Idaho, United States. Located in the northern portion of the state, just north of Coeur d'Alene, its population was 13,294 at the 2010 census.",
"According to the United States Census Bureau, the city has a total area of 9.61 square miles (24.89 km2), of which 9.60 square miles (24.86 km2) is land and 0.01 square miles (0.03 km2) is water. It lies at the southwestern end of Hayden Lake, and the elevation of the city is 2,287 feet (697 m) above sea level. Hayden is located on U.S. Route 95 at the junction of Route 41. It is also four miles (6 km) north of Interstate 90 and Coeur d'Alene. The Coeur d'Alene airport is northwest of Hayden.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Triplet
* Dataset: `msmarco-co-condenser-dev`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
| Metric | Value |
|:--------------------|:----------|
| **cosine_accuracy** | **0.994** |
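A minimal sketch of running this kind of triplet evaluation yourself could look like the following; the triplets below are made up for illustration, whereas the reported 0.994 was measured on the msmarco-co-condenser dev split.
```python
# Illustrative sketch of a triplet evaluation with sentence-transformers.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("joe32140/ModernBERT-large-msmarco")

# Hypothetical (anchor, positive, negative) triplets.
evaluator = TripletEvaluator(
    anchors=["what county is hayden in"],
    positives=["Hayden is a city in Kootenai County, Idaho, United States."],
    negatives=["Holly Springs is a town in Wake County, North Carolina."],
    name="msmarco-co-condenser-dev",
)
print(evaluator(model))  # e.g. {'msmarco-co-condenser-dev_cosine_accuracy': 1.0}
```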
#### Retrieval tasks compared to original numbers in the paper
| | ModernBERT-base | ModernBERT-base (ours) | ModernBERT-large | ModernBERT-large (ours) |
|:------------------|------------------|-------------------------|-------------------|--------------------------|
| NFCorpus | 23.7 | 26.66 | 26.2 | 28.44 |
| SciFact | 57.0 | 61.64 | 60.4 | 63.66 |
| TREC-Covid | 72.1 | 71.43 | 74.1 | 77.49 |
| FiQA | 28.8 | 30.73 | 33.1 | 34.35 |
| ArguAna | 35.7 | 46.38 | 38.2 | 47.79 |
| SciDocs | 12.5 | 13.67 | 13.8 | 15.78 |
| FEVER | 59.9 | 65.7 | 62.7 | 68.2 |
| Climate-FEVER | 23.6 | 22.6 | 20.5 | 22.9 |
| MLDR - OOD | 27.4 | 30.58 | 34.3 | 38.99 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1
* Dataset: [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) at [84ed2d3](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1/tree/84ed2d35626f617d890bd493b4d6db69a741e0e2)
* Size: 11,662,655 training samples
* Columns: <code>query</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | query | positive | negative |
|:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 9.26 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 79.14 tokens</li><li>max: 222 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 80.09 tokens</li><li>max: 436 tokens</li></ul> |
* Samples:
| query | positive | negative |
|:---------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>what is the meaning of menu planning</code> | <code>Menu planning is the selection of a menu for an event. Such as picking out the dinner for your wedding or even a meal at a Birthday Party. Menu planning is when you are preparing a calendar of meals and you have to sit down and decide what meat and veggies you want to serve on each certain day.</code> | <code>Menu Costs. In economics, a menu cost is the cost to a firm resulting from changing its prices. The name stems from the cost of restaurants literally printing new menus, but economists use it to refer to the costs of changing nominal prices in general.</code> |
| <code>how old is brett butler</code> | <code>Brett Butler is 59 years old. To be more precise (and nerdy), the current age as of right now is 21564 days or (even more geeky) 517536 hours. That's a lot of hours!</code> | <code>Passed in: St. John's, Newfoundland and Labrador, Canada. Passed on: 16/07/2016. Published in the St. John's Telegram. Passed away suddenly at the Health Sciences Centre surrounded by his loving family, on July 16, 2016 Robert (Bobby) Joseph Butler, age 52 years. Predeceased by his special aunt Geri Murrin and uncle Mike Mchugh; grandparents Joe and Margaret Murrin and Jack and Theresa Butler.</code> |
| <code>when was the last navajo treaty sign?</code> | <code>In Executive Session, Senate of the United States, July 25, 1868. Resolved, (two-thirds of the senators present concurring,) That the Senate advise and consent to the ratification of the treaty between the United States and the Navajo Indians, concluded at Fort Sumner, New Mexico, on the first day of June, 1868.</code> | <code>Share Treaty of Greenville. The Treaty of Greenville was signed August 3, 1795, between the United States, represented by Gen. Anthony Wayne, and chiefs of the Indian tribes located in the Northwest Territory, including the Wyandots, Delawares, Shawnees, Ottawas, Miamis, and others.</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Evaluation Dataset
#### msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1
* Dataset: [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) at [84ed2d3](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1/tree/84ed2d35626f617d890bd493b4d6db69a741e0e2)
* Size: 11,662,655 evaluation samples
* Columns: <code>query</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | query | positive | negative |
|:--------|:--------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 9.2 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 80.44 tokens</li><li>max: 241 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 80.38 tokens</li><li>max: 239 tokens</li></ul> |
* Samples:
| query | positive | negative |
|:------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>what county is holly springs nc in</code> | <code>Holly Springs, North Carolina. Holly Springs is a town in Wake County, North Carolina, United States. As of the 2010 census, the town population was 24,661, over 2½ times its population in 2000. Contents.</code> | <code>The Mt. Holly Springs Park & Resort. One of the numerous trolley routes that carried people around the county at the turn of the century was the Carlisle & Mt. Holly Railway Company. The âHolly Trolleyâ as it came to be known was put into service by Patricio Russo and made its first run on May 14, 1901.</code> |
| <code>how long does nyquil stay in your system</code> | <code>In order to understand exactly how long Nyquil lasts, it is absolutely vital to learn about the various ingredients in the drug. One of the ingredients found in Nyquil is Doxylamine, which is an antihistamine. This specific medication has a biological half-life or 6 to 12 hours. With this in mind, it is possible for the drug to remain in the system for a period of 12 to 24 hours. It should be known that the specifics will depend on a wide variety of different factors, including your age and metabolism.</code> | <code>I confirmed that NyQuil is about 10% alcohol, a higher content than most domestic beers. When I asked about the relatively high proof, I was told that the alcohol dilutes the active ingredients. The alcohol free version is there for customers with addiction issues.. also found that in that version there is twice the amount of DXM. When I asked if I could speak to a chemist or scientist, I was told they didn't have anyone who fit that description there. Itâs been eight years since I kicked NyQuil. I've been sober from alcohol for four years.</code> |
| <code>what are mineral water</code> | <code>1 Mineral water â water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source. Mineral water â water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source.</code> | <code>Minerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.inerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `learning_rate`: 0.0001
- `num_train_epochs`: 1
- `warmup_ratio`: 0.05
- `bf16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | msmarco-co-condenser-dev_cosine_accuracy |
|:------:|:----:|:-------------:|:----------------------------------------:|
| 0 | 0 | - | 0.599 |
| 0.0041 | 10 | 6.0983 | - |
| 0.0082 | 20 | 4.4588 | - |
| 0.0123 | 30 | 2.2492 | - |
| 0.0164 | 40 | 0.9969 | - |
| 0.0205 | 50 | 0.5272 | - |
| 0.0246 | 60 | 0.3982 | - |
| 0.0287 | 70 | 0.3335 | - |
| 0.0328 | 80 | 0.3024 | - |
| 0.0369 | 90 | 0.2932 | - |
| 0.0410 | 100 | 0.2695 | - |
| 0.0450 | 110 | 0.2574 | - |
| 0.0491 | 120 | 0.2447 | - |
| 0.0532 | 130 | 0.2491 | - |
| 0.0573 | 140 | 0.2318 | - |
| 0.0614 | 150 | 0.2292 | - |
| 0.0655 | 160 | 0.2213 | - |
| 0.0696 | 170 | 0.218 | - |
| 0.0737 | 180 | 0.2234 | - |
| 0.0778 | 190 | 0.2066 | - |
| 0.0819 | 200 | 0.1987 | - |
| 0.0860 | 210 | 0.1978 | - |
| 0.0901 | 220 | 0.2024 | - |
| 0.0942 | 230 | 0.1959 | - |
| 0.0983 | 240 | 0.1804 | - |
| 0.1024 | 250 | 0.1868 | - |
| 0.1065 | 260 | 0.1983 | - |
| 0.1106 | 270 | 0.1641 | - |
| 0.1147 | 280 | 0.1713 | - |
| 0.1188 | 290 | 0.1726 | - |
| 0.1229 | 300 | 0.17 | - |
| 0.1269 | 310 | 0.1783 | - |
| 0.1310 | 320 | 0.1742 | - |
| 0.1351 | 330 | 0.1654 | - |
| 0.1392 | 340 | 0.1663 | - |
| 0.1433 | 350 | 0.1616 | - |
| 0.1474 | 360 | 0.157 | - |
| 0.1515 | 370 | 0.1574 | - |
| 0.1556 | 380 | 0.1529 | - |
| 0.1597 | 390 | 0.1561 | - |
| 0.1638 | 400 | 0.1435 | - |
| 0.1679 | 410 | 0.1555 | - |
| 0.1720 | 420 | 0.1455 | - |
| 0.1761 | 430 | 0.1416 | - |
| 0.1802 | 440 | 0.1407 | - |
| 0.1843 | 450 | 0.138 | - |
| 0.1884 | 460 | 0.1387 | - |
| 0.1925 | 470 | 0.1499 | - |
| 0.1966 | 480 | 0.1372 | - |
| 0.2007 | 490 | 0.1308 | - |
| 0.2048 | 500 | 0.1367 | - |
| 0.2088 | 510 | 0.1324 | - |
| 0.2129 | 520 | 0.1317 | - |
| 0.2170 | 530 | 0.1263 | - |
| 0.2211 | 540 | 0.1209 | - |
| 0.2252 | 550 | 0.1201 | - |
| 0.2293 | 560 | 0.1213 | - |
| 0.2334 | 570 | 0.1329 | - |
| 0.2375 | 580 | 0.1207 | - |
| 0.2416 | 590 | 0.1211 | - |
| 0.2457 | 600 | 0.1164 | - |
| 0.2498 | 610 | 0.1292 | - |
| 0.2539 | 620 | 0.1223 | - |
| 0.2580 | 630 | 0.1237 | - |
| 0.2621 | 640 | 0.1088 | - |
| 0.2662 | 650 | 0.1196 | - |
| 0.2703 | 660 | 0.1209 | - |
| 0.2744 | 670 | 0.1155 | - |
| 0.2785 | 680 | 0.1101 | - |
| 0.2826 | 690 | 0.1127 | - |
| 0.2867 | 700 | 0.1082 | - |
| 0.2907 | 710 | 0.1083 | - |
| 0.2948 | 720 | 0.1132 | - |
| 0.2989 | 730 | 0.1121 | - |
| 0.3030 | 740 | 0.1146 | - |
| 0.3071 | 750 | 0.1088 | - |
| 0.3112 | 760 | 0.0982 | - |
| 0.3153 | 770 | 0.0952 | - |
| 0.3194 | 780 | 0.1034 | - |
| 0.3235 | 790 | 0.1017 | - |
| 0.3276 | 800 | 0.1016 | - |
| 0.3317 | 810 | 0.1054 | - |
| 0.3358 | 820 | 0.1003 | - |
| 0.3399 | 830 | 0.0932 | - |
| 0.3440 | 840 | 0.0997 | - |
| 0.3481 | 850 | 0.0921 | - |
| 0.3522 | 860 | 0.0958 | - |
| 0.3563 | 870 | 0.0973 | - |
| 0.3604 | 880 | 0.0931 | - |
| 0.3645 | 890 | 0.0964 | - |
| 0.3686 | 900 | 0.0982 | - |
| 0.3726 | 910 | 0.0908 | - |
| 0.3767 | 920 | 0.0917 | - |
| 0.3808 | 930 | 0.0857 | - |
| 0.3849 | 940 | 0.0925 | - |
| 0.3890 | 950 | 0.0915 | - |
| 0.3931 | 960 | 0.089 | - |
| 0.3972 | 970 | 0.0876 | - |
| 0.4013 | 980 | 0.0959 | - |
| 0.4054 | 990 | 0.0879 | - |
| 0.4095 | 1000 | 0.0883 | - |
| 0.4136 | 1010 | 0.0824 | - |
| 0.4177 | 1020 | 0.0897 | - |
| 0.4218 | 1030 | 0.0954 | - |
| 0.4259 | 1040 | 0.0815 | - |
| 0.4300 | 1050 | 0.0806 | - |
| 0.4341 | 1060 | 0.0918 | - |
| 0.4382 | 1070 | 0.0851 | - |
| 0.4423 | 1080 | 0.0888 | - |
| 0.4464 | 1090 | 0.0863 | - |
| 0.4505 | 1100 | 0.0856 | - |
| 0.4545 | 1110 | 0.0809 | - |
| 0.4586 | 1120 | 0.085 | - |
| 0.4627 | 1130 | 0.0756 | - |
| 0.4668 | 1140 | 0.0836 | - |
| 0.4709 | 1150 | 0.0815 | - |
| 0.4750 | 1160 | 0.084 | - |
| 0.4791 | 1170 | 0.0751 | - |
| 0.4832 | 1180 | 0.0794 | - |
| 0.4873 | 1190 | 0.0844 | - |
| 0.4914 | 1200 | 0.0835 | - |
| 0.4955 | 1210 | 0.0798 | - |
| 0.4996 | 1220 | 0.0825 | - |
| 0.5037 | 1230 | 0.0796 | - |
| 0.5078 | 1240 | 0.0758 | - |
| 0.5119 | 1250 | 0.0765 | - |
| 0.5160 | 1260 | 0.0806 | - |
| 0.5201 | 1270 | 0.072 | - |
| 0.5242 | 1280 | 0.0775 | - |
| 0.5283 | 1290 | 0.076 | - |
| 0.5324 | 1300 | 0.0767 | - |
| 0.5364 | 1310 | 0.0782 | - |
| 0.5405 | 1320 | 0.07 | - |
| 0.5446 | 1330 | 0.0724 | - |
| 0.5487 | 1340 | 0.0703 | - |
| 0.5528 | 1350 | 0.072 | - |
| 0.5569 | 1360 | 0.0763 | - |
| 0.5610 | 1370 | 0.0703 | - |
| 0.5651 | 1380 | 0.0688 | - |
| 0.5692 | 1390 | 0.0703 | - |
| 0.5733 | 1400 | 0.0659 | - |
| 0.5774 | 1410 | 0.0688 | - |
| 0.5815 | 1420 | 0.0713 | - |
| 0.5856 | 1430 | 0.0722 | - |
| 0.5897 | 1440 | 0.0682 | - |
| 0.5938 | 1450 | 0.07 | - |
| 0.5979 | 1460 | 0.0649 | - |
| 0.6020 | 1470 | 0.0659 | - |
| 0.6061 | 1480 | 0.0675 | - |
| 0.6102 | 1490 | 0.0629 | - |
| 0.6143 | 1500 | 0.0683 | - |
| 0.6183 | 1510 | 0.0687 | - |
| 0.6224 | 1520 | 0.0724 | - |
| 0.6265 | 1530 | 0.0638 | - |
| 0.6306 | 1540 | 0.0709 | - |
| 0.6347 | 1550 | 0.064 | - |
| 0.6388 | 1560 | 0.0646 | - |
| 0.6429 | 1570 | 0.0673 | - |
| 0.6470 | 1580 | 0.0607 | - |
| 0.6511 | 1590 | 0.0671 | - |
| 0.6552 | 1600 | 0.0627 | - |
| 0.6593 | 1610 | 0.0644 | - |
| 0.6634 | 1620 | 0.0629 | - |
| 0.6675 | 1630 | 0.0656 | - |
| 0.6716 | 1640 | 0.0633 | - |
| 0.6757 | 1650 | 0.062 | - |
| 0.6798 | 1660 | 0.0627 | - |
| 0.6839 | 1670 | 0.0583 | - |
| 0.6880 | 1680 | 0.0612 | - |
| 0.6921 | 1690 | 0.066 | - |
| 0.6962 | 1700 | 0.0645 | - |
| 0.7002 | 1710 | 0.0599 | - |
| 0.7043 | 1720 | 0.0552 | - |
| 0.7084 | 1730 | 0.065 | - |
| 0.7125 | 1740 | 0.0614 | - |
| 0.7166 | 1750 | 0.0615 | - |
| 0.7207 | 1760 | 0.0567 | - |
| 0.7248 | 1770 | 0.0528 | - |
| 0.7289 | 1780 | 0.0541 | - |
| 0.7330 | 1790 | 0.0548 | - |
| 0.7371 | 1800 | 0.0568 | - |
| 0.7412 | 1810 | 0.053 | - |
| 0.7453 | 1820 | 0.0603 | - |
| 0.7494 | 1830 | 0.0594 | - |
| 0.7535 | 1840 | 0.0549 | - |
| 0.7576 | 1850 | 0.0601 | - |
| 0.7617 | 1860 | 0.0604 | - |
| 0.7658 | 1870 | 0.0524 | - |
| 0.7699 | 1880 | 0.057 | - |
| 0.7740 | 1890 | 0.057 | - |
| 0.7781 | 1900 | 0.0551 | - |
| 0.7821 | 1910 | 0.0574 | - |
| 0.7862 | 1920 | 0.0555 | - |
| 0.7903 | 1930 | 0.0564 | - |
| 0.7944 | 1940 | 0.052 | - |
| 0.7985 | 1950 | 0.054 | - |
| 0.8026 | 1960 | 0.0573 | - |
| 0.8067 | 1970 | 0.056 | - |
| 0.8108 | 1980 | 0.0503 | - |
| 0.8149 | 1990 | 0.0525 | - |
| 0.8190 | 2000 | 0.0505 | - |
| 0.8231 | 2010 | 0.0547 | - |
| 0.8272 | 2020 | 0.0531 | - |
| 0.8313 | 2030 | 0.0534 | - |
| 0.8354 | 2040 | 0.0542 | - |
| 0.8395 | 2050 | 0.0536 | - |
| 0.8436 | 2060 | 0.0512 | - |
| 0.8477 | 2070 | 0.0508 | - |
| 0.8518 | 2080 | 0.0517 | - |
| 0.8559 | 2090 | 0.0516 | - |
| 0.8600 | 2100 | 0.0558 | - |
| 0.8640 | 2110 | 0.0571 | - |
| 0.8681 | 2120 | 0.0536 | - |
| 0.8722 | 2130 | 0.0561 | - |
| 0.8763 | 2140 | 0.0489 | - |
| 0.8804 | 2150 | 0.0513 | - |
| 0.8845 | 2160 | 0.0455 | - |
| 0.8886 | 2170 | 0.0479 | - |
| 0.8927 | 2180 | 0.0498 | - |
| 0.8968 | 2190 | 0.0523 | - |
| 0.9009 | 2200 | 0.0513 | - |
| 0.9050 | 2210 | 0.049 | - |
| 0.9091 | 2220 | 0.0504 | - |
| 0.9132 | 2230 | 0.0462 | - |
| 0.9173 | 2240 | 0.0469 | - |
| 0.9214 | 2250 | 0.0501 | - |
| 0.9255 | 2260 | 0.046 | - |
| 0.9296 | 2270 | 0.0475 | - |
| 0.9337 | 2280 | 0.0504 | - |
| 0.9378 | 2290 | 0.0483 | - |
| 0.9419 | 2300 | 0.0536 | - |
| 0.9459 | 2310 | 0.0442 | - |
| 0.9500 | 2320 | 0.0499 | - |
| 0.9541 | 2330 | 0.0478 | - |
| 0.9582 | 2340 | 0.0499 | - |
| 0.9623 | 2350 | 0.048 | - |
| 0.9664 | 2360 | 0.0451 | - |
| 0.9705 | 2370 | 0.0501 | - |
| 0.9746 | 2380 | 0.0464 | - |
| 0.9787 | 2390 | 0.0451 | - |
| 0.9828 | 2400 | 0.0413 | - |
| 0.9869 | 2410 | 0.0478 | - |
| 0.9910 | 2420 | 0.0466 | - |
| 0.9951 | 2430 | 0.0515 | - |
| 0.9992 | 2440 | 0.0484 | - |
| 1.0 | 2442 | - | 0.994 |
</details>
### Framework Versions
- Python: 3.11.9
- Sentence Transformers: 3.3.0
- Transformers: 4.48.0.dev0
- PyTorch: 2.4.0
- Accelerate: 1.2.1
- Datasets: 2.21.0
- Tokenizers: 0.21.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### CachedMultipleNegativesRankingLoss
```bibtex
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"SCIFACT"
] |
HPAI-BSC/Llama3.1-Aloe-Beta-8B | HPAI-BSC | question-answering | [
"transformers",
"safetensors",
"llama",
"text-generation",
"biology",
"medical",
"healthcare",
"question-answering",
"en",
"dataset:HPAI-BSC/Aloe-Beta-General-Collection",
"dataset:HPAI-BSC/chain-of-diagnosis",
"dataset:HPAI-BSC/MedS-Ins",
"dataset:HPAI-BSC/ultramedical",
"dataset:HPAI-BSC/pubmedqa-cot-llama31",
"dataset:HPAI-BSC/medqa-cot-llama31",
"dataset:HPAI-BSC/medmcqa-cot-llama31",
"dataset:HPAI-BSC/headqa-cot-llama31",
"dataset:HPAI-BSC/MMLU-medical-cot-llama31",
"dataset:HPAI-BSC/Polymed-QA",
"arxiv:2405.01886",
"license:llama3.1",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-10-30T17:29:40 | 2025-01-22T14:18:57 | 838 | 11 | ---
datasets:
- HPAI-BSC/Aloe-Beta-General-Collection
- HPAI-BSC/chain-of-diagnosis
- HPAI-BSC/MedS-Ins
- HPAI-BSC/ultramedical
- HPAI-BSC/pubmedqa-cot-llama31
- HPAI-BSC/medqa-cot-llama31
- HPAI-BSC/medmcqa-cot-llama31
- HPAI-BSC/headqa-cot-llama31
- HPAI-BSC/MMLU-medical-cot-llama31
- HPAI-BSC/Polymed-QA
- HPAI-BSC/Aloe-Beta-General-Collection
- HPAI-BSC/Aloe-Beta-General-Collection
language:
- en
library_name: transformers
license: llama3.1
pipeline_tag: question-answering
tags:
- biology
- medical
- healthcare
---
<p align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://cdn-uploads.huggingface.co/production/uploads/6620f941eba5274b5c12f83d/vg1jG1OgqP7yyE0PO-OMT.png">
<img alt="prompt_engine" src="https://cdn-uploads.huggingface.co/production/uploads/6620f941eba5274b5c12f83d/vg1jG1OgqP7yyE0PO-OMT.png" width=50%>
</picture>
</p>
<h1 align="center">
Aloe: A Family of Fine-tuned Open Healthcare LLMs
</h1>
---
Llama3.1-Aloe-Beta-8B is an **open healthcare LLM** achieving **state-of-the-art performance** on several medical tasks. Aloe Beta is made available in four model sizes: [7B](https://huggingface.co/HPAI-BSC/Qwen2.5-Aloe-Beta-7B/), [8B](https://huggingface.co/HPAI-BSC/Llama3.1-Aloe-Beta-8B), [70B](https://huggingface.co/HPAI-BSC/Llama3.1-Aloe-Beta-70B), and [72B](https://huggingface.co/HPAI-BSC/Qwen2.5-Aloe-Beta-72B). All models are trained using the same recipe, on top of two different families of models: Llama3.1 and Qwen2.5.
Aloe is trained on 20 medical tasks, resulting in a robust and versatile healthcare model. Evaluations show Aloe models to be among the best in their class. When combined with a RAG system ([also released](https://github.com/HPAI-BSC/prompt_engine)), the 7B and 8B versions get close to the performance of closed models like MedPalm-2 and GPT-4. With the same RAG system, Llama3.1-Aloe-Beta-70B and Qwen2.5-Aloe-Beta-72B outperform those private alternatives, producing state-of-the-art results.
# Aloe-Beta-8B

**Aloe-8B-Beta** is the latest iteration in the **Aloe family**, building and improving on the success of its predecessor, [Aloe-8B-Alpha](https://huggingface.co/HPAI-BSC/Llama3-Aloe-8B-Alpha).
Beta more than triples the training data used by Alpha, for a total of **1.8B tokens**, including a wider variety of medical tasks and instructions (e.g., text summarization, explanation, diagnosis, text classification, treatment recommendation, ...).

To mitigate catastrophic forgetting and enable the model to effectively learn new capabilities like **function calling**, we incorporated a diverse set of high-quality general-purpose data constituting 20% of the total training set. The curated data includes some of the highest-quality content available across a range of topics, including mathematics, programming, STEM, and very long instructions (> 8k tokens), to enrich the model's adaptability and comprehension across diverse domains.
Beta also boosts the alignment and safety stages with respect to Alpha. This includes a [medical preference dataset](https://huggingface.co/datasets/TsinghuaC3I/UltraMedical-Preference), as well as the red-teaming dataset (available soon).
Complete training details, model merging configurations, and all training data (including synthetically generated data) can be found below. This includes [the RAG system](https://github.com/HPAI-BSC/prompt_engine) that was developed to test Aloe Beta in a deployment setup. Aloe comes with a healthcare-specific risk assessment to facilitate the safe use and deployment of such systems.
## Model Details
### Model Description
- **Developed by:** [HPAI](https://hpai.bsc.es/)
- **Model type:** Causal decoder-only transformer language model
- **Language(s) (NLP):** English (capable in other languages, but not formally evaluated on them)
- **License:** This model is based on Meta Llama 3.1 8B and is governed by the [Meta Llama 3 License](https://www.llama.com/llama3_1/license/). All our modifications are available with a [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license, making the Aloe Beta models **compatible with commercial use**.
- **Base model :** [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B)
- **Paper:** (more coming soon)
- **RAG Repository:** https://github.com/HPAI-BSC/prompt_engine
## Model Performance
Aloe Beta has been tested on the most popular healthcare QA datasets, with and without Medprompt inference technique. Results show competitive performance, achieving SOTA within models of the same size.

The Beta model has been developed to excel in several different medical tasks. For this reason, we evaluated the model in many different medical tasks:


We also compared the performance of the model in the general domain, using the OpenLLM Leaderboard benchmark. Aloe-Beta gets competitive results with the current SOTA general models in the most used general benchmarks and outperforms the medical models:

## Uses
### Direct Use
We encourage the use of Aloe for research purposes, as a stepping stone to build better foundational models for healthcare. In production, Aloe should always be used under the supervision of a human expert.
### Out-of-Scope Use
These models are not to be used for clinical practice, medical diagnosis, or any other form of direct or indirect healthcare advice. Models are prone to error and can produce toxic content. The use of Aloe models for activities harmful to individuals, such as spam, fraud, or impersonation, is strictly prohibited. Minors should not be left alone to interact with Aloe without supervision.
## Bias, Risks, and Limitations
Aloe can produce toxic content under the appropriate prompts, and it includes multiple undesirable biases. While significant efforts were conducted to mitigate this (see Alignment details below), model safety cannot be fully guaranteed. We avoid the use of all personal data in our training.
We identify at least three risk cases specific to healthcare LLMs:
- Healthcare professional impersonation, a fraudulent behaviour which currently generates billions of dollars in [profit](https://www.justice.gov/opa/pr/justice-department-charges-dozens-12-billion-health-care-fraud). A model such as Aloe could be used to increase the efficacy of such deceptive activities, making them more widespread. The main preventive actions are public literacy on the unreliability of digitised information and the importance of medical registration, and legislation enforcing AI-generated content disclaimers.
- Medical decision-making without professional supervision. While this is already an issue in modern societies (e.g., self-medication), a model such as Aloe, capable of producing high-quality conversational data, can facilitate self-delusion, particularly in the presence of sycophancy. By producing tailored responses, it can also be used to generate actionable answers. Public literacy on the dangers of self-diagnosis is one of the main defenses, together with the introduction of disclaimers and warnings on the models' outputs.
- Access to information on dangerous substances or procedures. While the literature on sensitive content can already be found on different sources (eg libraries, the internet, dark web), LLMs can centralize such access, making it nearly impossible to control the flow of such information. Model alignment can help in that regard, but so far the effects remain insufficient, as jailbreaking methods still overcome it.
<!---
Table below shows the performance of Aloe at several AI safety tasks:
TO BE UPDATED
<img src="https://cdn-uploads.huggingface.co/production/uploads/62972c4979f193515da1d38e/T6Jblpf1kmTkM04K716rM.png" width="95%">
We analyzed the safety and robustness of the model using red teaming techniques. We designed a benchmark using different types of attacks and analyzed the performance of Aloe and some extra models, and we confirm that our model is aligned properly and successfully resisting most attacks:


-->
## How to Get Started with the Model
Use the code below to get started with the model. You can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the `generate()` function. Let's see examples for both.
#### Transformers pipeline
```python
import transformers
import torch
model_id = "HPAI-BSC/Llama3.1-Aloe-Beta-8B"
pipeline = transformers.pipeline(
"text-generation",
model=model_id,
model_kwargs={"torch_dtype": torch.bfloat16},
device_map="auto",
)
messages = [
{"role": "system", "content": "You are an expert medical assistant named Aloe, developed by the High Performance Artificial Intelligence Group at Barcelona Supercomputing Center(BSC). You are to be a helpful, respectful, and honest assistant."},
{"role": "user", "content": "Hello."},
]
prompt = pipeline.tokenizer.apply_chat_template(
messages,
tokenize=False,
add_generation_prompt=True
)
terminators = [
pipeline.tokenizer.eos_token_id,
pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]
outputs = pipeline(
prompt,
max_new_tokens=256,
eos_token_id=terminators,
do_sample=True,
temperature=0.6,
top_p=0.9,
)
print(outputs[0]["generated_text"][len(prompt):])
```
#### Transformers AutoModelForCausalLM
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
model_id = "HPAI-BSC/Llama3.1-Aloe-Beta-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
model_id,
torch_dtype=torch.bfloat16,
device_map="auto",
)
messages = [
{"role": "system", "content": "You are an expert medical assistant named Aloe, developed by the High Performance Artificial Intelligence Group at Barcelona Supercomputing Center(BSC). You are to be a helpful, respectful, and honest assistant."},
{"role": "user", "content": "Hello"},
]
input_ids = tokenizer.apply_chat_template(
messages,
add_generation_prompt=True,
return_tensors="pt"
).to(model.device)
terminators = [
tokenizer.eos_token_id,
tokenizer.convert_tokens_to_ids("<|eot_id|>")
]
outputs = model.generate(
input_ids,
max_new_tokens=256,
eos_token_id=terminators,
do_sample=True,
temperature=0.6,
top_p=0.9,
)
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
## Training Details
### Supervised fine-tuning
SFT on top of Llama 3.1 using [axolotl](https://github.com/axolotl-ai-cloud/axolotl).
We used DeepSpeed's ZeRO-3 distributed training on the following hardware:
* 8B: 32x NVIDIA Hopper H100 64GB of the *Marenostrum 5*.
* 70B: 64x NVIDIA Hopper H100 64GB of the *Marenostrum 5*.
<!---
^^^ TO BE COMPLETED AND DETAILED ^^^
-->
#### Training Data
The training set consists of around 1.8B tokens, comprising three different types of data:
- Medical domain datasets. Includes data from 20 different medical tasks.
- [HPAI-BSC/Aloe-Beta-General-Collection](https://huggingface.co/datasets/HPAI-BSC/Aloe-Beta-General-Collection)
- [HPAI-BSC/chain-of-diagnosis](https://huggingface.co/datasets/HPAI-BSC/chain-of-diagnosis)
- [HPAI-BSC/MedS-Ins](https://huggingface.co/datasets/HPAI-BSC/MedS-Ins)
- [HPAI-BSC/ultramedical](https://huggingface.co/datasets/HPAI-BSC/ultramedical)
- Synthetic data. We expanded our training data by generating high-quality answers using Llama3.1-70B; a generation sketch is shown after this list.
- [HPAI-BSC/pubmedqa-cot-llama31](https://huggingface.co/datasets/HPAI-BSC/pubmedqa-cot-llama31)
- [HPAI-BSC/medqa-cot-llama31](https://huggingface.co/datasets/HPAI-BSC/medqa-cot-llama31)
- [HPAI-BSC/medmcqa-cot-llama31](https://huggingface.co/datasets/HPAI-BSC/medmcqa-cot-llama31)
- [HPAI-BSC/headqa-cot-llama31](https://huggingface.co/datasets/HPAI-BSC/headqa-cot-llama31)
- [HPAI-BSC/MMLU-medical-cot-llama31](https://huggingface.co/datasets/HPAI-BSC/MMLU-medical-cot-llama31)
- [HPAI-BSC/Polymed-QA](https://huggingface.co/datasets/HPAI-BSC/Polymed-QA)
- Genstruct data (coming soon)
- General data. It includes maths, STEM, code, function calling, and instructions with a very long context.
- [HPAI-BSC/Aloe-Beta-General-Collection](https://huggingface.co/datasets/HPAI-BSC/Aloe-Beta-General-Collection)
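For illustration, a generation step of this kind could look roughly as follows. The generator model, prompt wording, and example question below are assumptions, not the exact recipe behind the `*-cot-llama31` datasets.
```python
import transformers
import torch

# Illustrative sketch only: generate a chain-of-thought style answer for a
# MedQA-like question with Llama-3.1-70B-Instruct. The actual pipeline behind
# the *-cot-llama31 datasets (prompting, filtering, formatting) may differ.
generator_id = "meta-llama/Llama-3.1-70B-Instruct"  # assumed generator model
pipeline = transformers.pipeline(
    "text-generation",
    model=generator_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
messages = [
    {"role": "system", "content": "You are a medical expert. Reason step by step, then state the final answer."},
    {"role": "user", "content": "A 55-year-old man presents with crushing chest pain radiating to the left arm. What is the most likely diagnosis?"},
]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=512, do_sample=False)
print(outputs[0]["generated_text"][len(prompt):])  # the synthetic CoT answer
```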
#### Training parameters
- Epochs: 3
- Sequence length: 16384
- Optimizer: adamw_torch
- Learning rate: 2e-5
- Learning rate scheduler: cosine
- Warmup steps: 100
- Weight decay: 0
- Gradient checkpointing
- Zero 3
- Total batch size: 128
- Batch size per device: 1
- Gradient accumulation steps: 4
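As a rough, non-authoritative mapping, these settings correspond approximately to the following Hugging Face `TrainingArguments`; the actual run used an axolotl configuration with DeepSpeed ZeRO-3, and the output directory and ZeRO-3 JSON path below are placeholders.
```python
from transformers import TrainingArguments

# Rough, illustrative mapping of the SFT hyperparameters listed above; the real
# run used an axolotl configuration with DeepSpeed ZeRO-3 on Marenostrum 5.
# The 16384-token sequence length is handled by data packing/tokenization,
# not by TrainingArguments.
sft_args = TrainingArguments(
    output_dir="aloe-beta-8b-sft",       # placeholder
    num_train_epochs=3,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    weight_decay=0.0,
    optim="adamw_torch",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=4,       # 32 GPUs x 1 x 4 = 128 total batch size
    gradient_checkpointing=True,
    deepspeed="ds_zero3.json",           # placeholder path to a ZeRO-3 config
)
```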
### Model Merging
The trained model was merged with the Llama-3.1-Instruct model using the DARE_TIES technique. [Mergekit](https://github.com/arcee-ai/mergekit) was used to conduct the merging.
### Model Alignment
The model is aligned using the Direct Preference Optimization (DPO) technique through a two-step process:
1. General DPO Alignment: This step uses a dataset combining medical, general preference, and safety data. We used our dataset [HPAI-BSC/Aloe-Beta-DPO](https://huggingface.co/datasets/HPAI-BSC/Aloe-Beta-DPO). We split the dataset into five parts, and the model was trained iteratively for one epoch on each chunk. We used a learning rate of 2e-7.
2. Red-Teaming Alignment: This step further fine-tunes the model to resist a variety of potential attacks, enhancing its robustness and security. Dataset will be shared soon. In this stage, we set the learning rate to 1e-7.
<!---
^^^ LINKS TO DPO DATA (DPO added, missing the RT^^^
-->
We used the [OpenRLHF](https://github.com/OpenRLHF/OpenRLHF) library. We aligned the model using 16x NVIDIA Hopper H100 64GB of the *Marenostrum 5*. Common hyperparameters:
- Sequence length: 4096
- Optimizer: Fused adam
- Total batch size 128
- Batch size per device: 1
- Gradient accumulation steps: 8
- Beta: 0.1
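For orientation only, a comparable general-DPO step could be sketched with TRL's `DPOTrainer` (TRL ≥ 0.12) as below. The actual alignment was run with OpenRLHF, and the dataset split name, column format, and the iteration-over-chunks schedule described above are assumptions or omissions.
```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

# Illustrative only: the released models were aligned with OpenRLHF, not TRL.
# The checkpoint below stands in for the merged SFT model, and the dataset is
# assumed to expose prompt/chosen/rejected columns with a "train" split.
model_id = "HPAI-BSC/Llama3.1-Aloe-Beta-8B"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

preference_data = load_dataset("HPAI-BSC/Aloe-Beta-DPO", split="train")

args = DPOConfig(
    output_dir="aloe-beta-dpo",          # placeholder
    beta=0.1,
    learning_rate=2e-7,                  # 1e-7 for the red-teaming stage
    max_length=4096,
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
)
trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=preference_data,
    processing_class=tokenizer,          # `tokenizer=` in TRL < 0.12
)
trainer.train()
```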
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
- [ACI-BENCH](https://github.com/wyim/aci-bench)
- [MTS-Dialog](https://github.com/abachaa/MTS-Dialog)
- [MedText](https://huggingface.co/datasets/BI55/MedText)
- [Medical Text classification](https://www.kaggle.com/datasets/chaitanyakck/medical-text/data)
- [OLAPH](https://github.com/dmis-lab/OLAPH)
- CareQA Open
- [MedDialog](https://huggingface.co/datasets/bigbio/meddialog)
- [MEDIQA QA](https://huggingface.co/datasets/bigbio/mediqa_qa)
- [Meddialog Qsumm](https://huggingface.co/datasets/lighteval/med_dialog)
- [Biored](https://huggingface.co/datasets/YufeiHFUT/BioRED_all_info)
- [MIMIC-III](https://huggingface.co/datasets/dmacres/mimiciii-hospitalcourse-meta)
- [Medical Prescription](https://huggingface.co/datasets/devlocalhost/prescription-full)
- [MedQA (USMLE)](https://huggingface.co/datasets/bigbio/med_qa)
- [MedMCQA](https://huggingface.co/datasets/medmcqa)
- [PubMedQA](https://huggingface.co/datasets/bigbio/pubmed_qa)
- [MMLU-Medical](https://huggingface.co/datasets/lukaemon/mmlu)
- [MedQA-4-Option](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
- [CareQA](https://huggingface.co/datasets/HPAI-BSC/CareQA)
- [Open LLM Leaderboard 2](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
<!---
^^^ CAREQA Open link MISSING ^^^
-->
#### Metrics
- Accuracy: suited to the evaluation of multiple-choice question-answering tasks.
- Rouge1: measures the overlap of unigrams between the system output and the gold standard; a minimal implementation is sketched below.
<!---
^^^ MORE METRICS MISSING ^^^
-->
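For reference, a minimal sketch of the ROUGE-1 unigram overlap follows; in practice a dedicated library such as `rouge-score` is normally used.
```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram overlap between a system output and the gold standard."""
    cand_counts = Counter(candidate.lower().split())
    ref_counts = Counter(reference.lower().split())
    overlap = sum((cand_counts & ref_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the patient shows signs of pneumonia",
                "signs of pneumonia are present in the patient"))
```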
#### Summary
To compare Aloe with the most competitive open models (both general-purpose and healthcare-specific), we use popular healthcare datasets (PubMedQA, MedMCQA, MedQA, and MMLU restricted to six medical tasks), together with the new and highly reliable CareQA. However, while MCQA benchmarks provide valuable insights into a model's ability to handle structured queries, they fall short of representing the full range of challenges faced in medical practice. Building upon this idea, Aloe-Beta represents the next step in the evolution of the Aloe Family, designed to broaden the scope beyond the multiple-choice question-answering tasks that defined Aloe-Alpha.
Benchmark results indicate the training conducted on Aloe has boosted its performance above Llama31-8B-Instruct. Llama31-Aloe-Beta-8B also outperforms other medical models like Llama3-OpenBioLLM and Llama3-Med42. All these results make Llama31-Aloe-8B-Beta the best healthcare LLM of its size.
With the help of prompting techniques, the performance of Llama31-Aloe-8B-Beta improves significantly. Medprompt in particular provides a 7% increase in reported accuracy, after which Llama31-Aloe-8B-Beta only lags behind much bigger models like Llama-3.1-70B-Instruct or MedPalm-2. This improvement is mostly consistent across the OpenLLM Leaderboard and the other medical tasks.
## Environmental Impact
- **Hardware Type:** 32xH100
- **Hours used (8B):** 544 GPU hours
- **Hours used (70B):** 4500 GPU hours
- **Hardware Provider:** Barcelona Supercomputing Center (BSC)
- **Compute Region:** Spain
- **Carbon Emitted:** 34.1 kg of CO2
<!---
^^^ ARE CARBON EMISSIONS FOR BOTH? ^^^
-->
## Authors
Aloe Beta has been developed by the [High Performance Artificial Intelligence](https://hpai.bsc.es/) research group, from the [Barcelona Supercomputing Center - BSC](https://www.bsc.es/). Main authors are [Jordi Bayarri Planas](https://huggingface.co/JordiBayarri), [Ashwin Kumar Gururajan](https://huggingface.co/G-AshwinKumar) and [Dario Garcia-Gasulla](https://huggingface.co/dariog). Red teaming efforts led by Adrian Tormos.
mailto:[email protected]
## Citations
<!---
Add the prompt engine paper below
-->
If you use this repository in a published work, please cite the corresponding papers as source:
```
@misc{gururajan2024aloe,
title={Aloe: A Family of Fine-tuned Open Healthcare LLMs},
author={Ashwin Kumar Gururajan and Enrique Lopez-Cuena and Jordi Bayarri-Planas and Adrian Tormos and Daniel Hinjos and Pablo Bernabeu-Perez and Anna Arias-Duart and Pablo Agustin Martin-Torres and Lucia Urcelay-Ganzabal and Marta Gonzalez-Mallo and Sergio Alvarez-Napagao and Eduard Ayguadé-Parra and Ulises Cortés Dario Garcia-Gasulla},
year={2024},
eprint={2405.01886},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
"TEXT_CLASSIFICATION",
"SUMMARIZATION"
] | [
"BIORED",
"MEDIQA QA",
"MEDDIALOG",
"MEDQA",
"PUBMEDQA"
] |
Cloyne/vietnamese-sbert-v3 | Cloyne | sentence-similarity | [
"sentence-transformers",
"safetensors",
"roberta",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:132997",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:keepitreal/vietnamese-sbert",
"base_model:finetune:keepitreal/vietnamese-sbert",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-11-04T10:50:31 | 2024-11-04T10:50:45 | 833 | 0 | ---
base_model: keepitreal/vietnamese-sbert
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:132997
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Ai có trách_nhiệm cập_nhật , công_bố thông_tin về tài_sản thế_chấp
sau khi thực_hiện đăng_ký thay_đổi nội_dung thế_chấp đã đăng_ký , sửa_chữa sai_sót
?
sentences:
- '1 . Chuẩn chương_trình phải quy_định những yêu_cầu tối_thiểu về số_lượng , cơ_cấu
, trình_độ , năng_lực , kinh_nghiệm của đội_ngũ giảng_viên và nhân_lực hỗ_trợ
để tổ_chức giảng_dạy và hỗ_trợ người học nhằm đạt được chuẩn đầu_ra của chương_trình
đào_tạo . 2 . Yêu_cầu đối_với đội_ngũ giảng_viên giảng_dạy chương_trình đại_học
, giảng_dạy chương_trình đào_tạo chuyên_sâu đặc_thù trình_độ bậc 7 : a ) Giảng_viên
có trình_độ thạc_sĩ trở lên , trợ_giảng có trình_độ đại_học trở lên ; b ) Có ít_nhất
01 tiến_sĩ ngành phù_hợp là giảng_viên cơ_hữu để chủ_trì xây_dựng , tổ_chức thực_hiện
chương_trình đào_tạo ; c ) Có ít_nhất 05 tiến_sĩ có chuyên_môn phù_hợp là giảng_viên
cơ_hữu để chủ_trì giảng_dạy chương_trình , trong đó mỗi thành_phần của chương_trình
phải có giảng_viên với chuyên_môn phù_hợp chủ_trì giảng_dạy ; d ) Có đủ số_lượng
giảng_viên để đảm_bảo tỉ_lệ sinh_viên trên giảng_viên không vượt quá mức quy_định
cho từng lĩnh_vực , nhóm ngành hoặc ngành đào_tạo . 3 . Yêu_cầu đối_với đội_ngũ
giảng_viên giảng_dạy chương_trình thạc_sĩ : a ) Giảng_viên có trình_độ tiến_sĩ
; b ) Có ít_nhất 05 tiến_sĩ ngành phù_hợp là giảng_viên cơ_hữu , trong đó có một
giáo_sư hoặc phó_giáo_sư chủ_trì xây_dựng , tổ_chức thực_hiện chương_trình đào_tạo
; c ) Có giảng_viên cơ_hữu với chuyên_môn phù_hợp chủ_trì giảng_dạy đối_với từng
môn_học , học phần của chương_trình ; d ) Có đủ người hướng_dẫn để đảm_bảo tỷ_lệ
tối_đa 05 học_viên trên một người hướng_dẫn . 4 . Yêu_cầu đối_với đội_ngũ giảng_viên
giảng_dạy chương_trình tiến_sĩ : a ) Giảng_viên có chức_danh giáo_sư hoặc phó
giáo_sư ; hoặc có trình_độ tiến_sĩ với năng_lực nghiên_cứu tốt ; b ) Có ít_nhất
01 giáo_sư ( hoặc 02 phó giáo_sư ) ngành phù_hợp và 03 tiến_sĩ ngành phù_hợp là
giảng_viên cơ_hữu ; c ) Có đủ người hướng_dẫn để đảm_bảo tỉ_lệ tối_đa 07 nghiên_cứu_sinh
/ giáo_sư , 05 nghiên_cứu_sinh / phó giáo_sư và 03 nghiên_cứu_sinh / tiến_sĩ .
5 . Chuẩn chương_trình cho các ngành , nhóm ngành quy_định yêu_cầu cụ_thể về đội_ngũ
giảng_viên không thấp hơn quy_định tại các khoản 2 , 3 và 4 của Điều này ; yêu_cầu
cụ_thể về tỉ_lệ người học trên giảng_viên ; yêu_cầu về đội_ngũ nhân_lực hỗ_trợ
đào_tạo ( nếu cần_thiết ) , phù_hợp với đặc_điểm của từng lĩnh_vực nhóm ngành
hoặc ngành đào_tạo .'
- Trách_nhiệm của các cơ_quan có liên_quan đến đăng_ký thế_chấp quyền sử_dụng đất
, tài_sản gắn liền với đất 1 . Văn_phòng đăng_ký đất_đai có trách_nhiệm gửi thông_tin
cho Sở Tài_nguyên_và_Môi_trường để cập_nhật , công_bố thông_tin về tài_sản thế_chấp
sau khi thực_hiện đăng_ký thay_đổi nội_dung thế_chấp đã đăng_ký , sửa_chữa sai_sót
, xóa đăng_ký thế_chấp liên_quan đến việc thế_chấp dự_án đầu_tư xây_dựng nhà ở
, dự_án đầu_tư xây_dựng công_trình xây_dựng không phải là nhà ở theo quy_định
tại Điều 64 của Nghị_định số 102 / 2017 / NĐ-CP . ...
- Xóa kỷ_luật , giảm thời_hạn chấp_hành kỷ_luật lao_động 1 . Người lao_động bị khiển_trách
sau 03 tháng , hoặc bị xử_lý kỷ_luật kéo_dài thời_hạn nâng lương sau 06 tháng
, kể từ ngày bị xử_lý , nếu không tái_phạm thì đương_nhiên được xóa kỷ_luật .
Trường_hợp bị xử_lý kỷ_luật lao_động bằng hình_thức cách_chức thì sau thời_hạn
03 năm , nếu tiếp_tục vi_phạm kỷ_luật lao_động thì không bị coi là tái_phạm .
2 . Người lao_động bị xử_lý kỷ_luật kéo_dài thời_gian nâng lương sau khi chấp_hành
được một_nửa thời_hạn nếu sửa_chữa tiến_bộ , có_thể được người sử_dụng lao_động
xét giảm thời_hạn .
- source_sentence: Mức trích nộp phí công_đoàn của doanh_nghiệp là bao_nhiêu phần_trăm
?
sentences:
- '" Điều 5 . Mức đóng và căn_cứ đóng kinh_phí công_đoàn Mức đóng bằng 2 % quỹ tiền_lương
làm căn_cứ đóng bảo_hiểm_xã_hội cho người lao_động . Quỹ tiền_lương này là tổng
mức tiền_lương của những người lao_động thuộc đối_tượng phải đóng bảo_hiểm_xã_hội
theo quy_định của pháp_luật về bảo_hiểm_xã_hội . Riêng đối_với đơn_vị thuộc lực_lượng_vũ_trang
quy_định tại Khoản 1 Điều 4 Nghị_định này , quỹ tiền_lương là tổng mức tiền_lương
của những cán_bộ , công_nhân viên_chức quốc_phòng , lao_động làm_việc hưởng lương
trong các nhà_máy , doanh_nghiệp , đơn_vị cơ_sở trong Quân_đội nhân_dân ; cán_bộ
, công_nhân , viên_chức , lao_động làm_việc hưởng lương trong các doanh_nghiệp
, cơ_quan , đơn_vị khoa học-kỹ_thuật , sự_nghiệp và phục_vụ trong Công_an nhân_dân
. "'
- '" Điều 41 . Điều_chỉnh dự_án đầu_tư 3 . Nhà_đầu_tư có dự_án đầu_tư đã được chấp_thuận
chủ_trương đầu_tư phải thực_hiện thủ_tục chấp_thuận điều_chỉnh chủ_trương đầu_tư
nếu thuộc một trong các trường_hợp sau đây : a ) Thay_đổi mục_tiêu đã được quy_định
tại văn_bản chấp_thuận chủ_trương đầu_tư ; bổ_sung mục_tiêu thuộc diện chấp_thuận
chủ_trương đầu_tư ; 4 . Đối_với dự_án đầu_tư được chấp_thuận chủ_trương đầu_tư
, nhà_đầu_tư không được điều_chỉnh tiến_độ thực_hiện dự_án đầu_tư quá 24 tháng
so với tiến_độ thực_hiện dự_án đầu_tư quy_định tại văn_bản chấp_thuận chủ_trương
đầu_tư lần đầu , trừ một trong các trường_hợp sau đây : đ ) Thay_đổi mục_tiêu
đã được quy_định tại văn_bản chấp_thuận chủ_trương đầu_tư ; bổ_sung mục_tiêu thuộc
diện chấp_thuận chủ_trương đầu_tư ; "'
- '" Điều 4 . Tiêu_chuẩn tuyển quân 1 . Tuổi_đời : a ) Công_dân từ đủ 18 tuổi đến
hết 25 tuổi . b ) Công_dân nam được đào_tạo trình_độ cao_đẳng , đại_học đã được
tạm hoãn gọi nhập_ngũ trong thời_gian một khóa đào_tạo của một trình_độ đào_tạo
thì tuyển_chọn và gọi nhập_ngũ đến hết 27 tuổi . 2 . Tiêu_chuẩn chính_trị : a
) Thực_hiện theo Thông_tư liên_tịch số 50/2016 / TTLT-BQP-BCA ngày 15 tháng 4
năm 2016 của Bộ_trưởng Bộ Quốc_phòng - Bộ_trưởng Bộ Công_an quy_định tiêu_chuẩn
chính_trị tuyển_chọn công_dân vào phục_vụ trong Quân_đội nhân_dân Việt_Nam . b
) Đối_với các cơ_quan , đơn_vị và vị_trí trọng_yếu cơ_mật trong Quân_đội ; lực_lượng
Tiêu_binh , Nghi_lễ ; lực_lượng Vệ_binh và Kiểm_soát quân_sự chuyên_nghiệp thực_hiện
tuyển_chọn theo quy_định của Bộ Quốc_phòng . 3 . Tiêu_chuẩn sức_khỏe : a ) Tuyển_chọn
những công_dân có sức khỏe loại 1 , 2 , 3 theo quy_định tại Thông_tư liên_tịch
số 16/2016 / TTLT-BYT-BQP ngày 30 tháng 6 năm 2016 của Bộ_trưởng Bộ Y_tế - Bộ_trưởng
Bộ Quốc_phòng quy_định việc khám sức_khỏe thực_hiện nghĩa_vụ_quân_sự . b ) Đối_với
các cơ_quan , đơn_vị , vị_trí quy_định tại Điểm b , Khoản 2 Điều này , thực_hiện
tuyển_chọn bảo_đảm tiêu_chuẩn riêng theo quy_định của Bộ Quốc_phòng . c ) Không
gọi nhập_ngũ vào Quân_đội những công_dân có sức khỏe loại 3 tật khúc_xạ về mắt
( cận_thị 1,5 diop trở lên , viễn_thị các mức_độ ) ; nghiện ma_túy , nhiễm HlV
, AIDS. 4 . Tiêu_chuẩn văn_hóa : a ) Tuyển_chọn và gọi nhập_ngũ những công_dân
có trình_độ văn_hóa lớp 8 trở lên , lấy từ cao xuống thấp . Những địa_phương có
khó_khăn không đảm_bảo đủ chỉ_tiêu giao_quân thì báo_cáo cấp có thẩm_quyền xem_xét
, quyết_định được tuyển_chọn số công_dân có trình_độ văn_hóa lớp 7 . b ) Các xã
thuộc vùng_sâu , vùng_xa , vùng điều_kiện kinh_tế - xã_hội đặc_biệt khó_khăn theo
quy_định của pháp_luật ; đồng_bào dân_tộc_thiểu_số dưới 10.000 người thì được
tuyển không quá 25 % công_dân có trình_độ văn_hóa cấp tiểu_học , còn lại là trung_học_cơ_sở
trở lên . "'
- source_sentence: Người đứng đầu cơ_quan thuộc Chính_phủ , đơn_vị sự_nghiệp công_lập
thực_hiện tiếp dân đột_xuất trong các trường_hợp nào ?
sentences:
- Nghĩa_vụ nộp chi_phí cho người làm_chứng ... 3 . Tòa_án căn_cứ vào khoản 1 và
khoản 2 Điều này quyết_định nghĩa_vụ nộp chi_phí cho người làm_chứng , hoàn_trả
lại chi_phí cho các bên đương_sự trong bản_án , quyết_định .
- 'Trách_nhiệm của người đứng đầu cơ_quan thuộc Chính_phủ , đơn_vị sự_nghiệp công_lập
... 3 . Thực_hiện tiếp công_dân đột_xuất trong các trường_hợp sau đây : a ) Vụ_việc
gay_gắt , phức_tạp , có nhiều người tham_gia , liên_quan đến trách_nhiệm của nhiều
cơ_quan , tổ_chức , đơn_vị hoặc ý_kiến của các cơ_quan , tổ_chức , đơn_vị còn
khác nhau ; b ) Vụ_việc nếu không_chỉ_đạo , xem_xét kịp_thời có_thể gây ra hậu_quả
nghiêm_trọng hoặc có_thể dẫn đến hủy_hoại tài_sản của Nhà_nước , của tập_thể ,
xâm_hại đến tính_mạng , tài_sản của nhân_dân , ảnh_hưởng đến an_ninh , chính_trị
, trật_tự , an_toàn xã_hội . 4 . Khi tiếp công_dân , người đứng đầu cơ_quan ,
đơn_vị phải có ý_kiến trả_lời về việc giải_quyết vụ_việc cho công_dân . Trường_hợp
chưa trả_lời ngay được thì chỉ_đạo cơ_quan , tổ_chức , đơn_vị , công_chức , viên_chức
thuộc quyền quản_lý của mình kịp_thời xem_xét , giải_quyết và thông_báo cho công_dân
biết thời_gian trả_lời .'
- Cơ_cấu tổ_chức bộ_máy và biên_chế của Ban quản_lý khu công_nghiệp , khu kinh_tế
1 . Ban quản_lý khu công_nghiệp , khu kinh_tế gồm Trưởng_ban , không quá 03 Phó
Trưởng_ban ; bộ_máy giúp_việc . Trưởng ban do Chủ_tịch Ủy_ban_nhân_dân cấp tỉnh
bổ_nhiệm , miễn_nhiệm . Phó Trưởng ban do Chủ_tịch Ủy_ban_nhân_dân cấp tỉnh bổ_nhiệm
, miễn_nhiệm theo đề_nghị của Trưởng ban . 2 . Trưởng ban có trách_nhiệm điều_hành
mọi hoạt_động của Ban quản_lý khu công_nghiệp , khu kinh_tế , chịu trách_nhiệm
trước Ủy_ban_nhân_dân cấp tỉnh , Chủ_tịch Ủy_ban_nhân_dân cấp tỉnh và pháp_luật
về hoạt_động của khu công_nghiệp , khu kinh_tế . ....
- source_sentence: Nếu chảy_máu trong phẫu_thuật mở tiền phòng lấy máu_cục thì xử_lý
như_thế_nào ?
sentences:
- 'PHẪU_THUẬT MỞ TIỀN_PHÒNG LẤY MÁU_CỤC ... VII. XỬ_TRÍ TAI_BIẾN 1 . Chảy_máu trong
phẫu_thuật Là biến_chứng hay gặp - Nguyên_nhân : + Do hút lôi_kéo vào mống mắt
đặc_biệt chân mống mắt . + Do cục máu đông chưa được hình_thành chắc_chắn . -
Xử_trí : + Dừng hút . + Bơm tiền phòng dung_dịch adrenalin 0,1 % hòa loãng với
dung_dịch ringer_lactat tỷ_lệ 1/3 và / hoặc bơm bóng hơi to vào tiền phòng hoặc
bơm nhầy vào tiền phòng . + Nếu máu vẫn không ngừng chảy , có_thể ngừng phẫu_thuật
, khâu đóng mép phẫu_thuật , chờ_đợi cho đến khi cục máu đông được hình_thành
chắc_chắn rồi rửa lại máu tiền phòng một hôm khác . ...'
- 'Nội_dung quy_hoạch không_gian biển quốc_gia ... 3 . Xác_định quan_điểm và mục_tiêu
phát_triển : a ) Xây_dựng quan_điểm sử_dụng không_gian biển , khai_thác và sử_dụng
bền_vững tài_nguyên biển , bảo_vệ môi_trường vùng bờ ; b ) Xác_định mục_tiêu tổng_quát
và các mục_tiêu cụ_thể về sử_dụng không_gian biển và khai_thác , sử_dụng tài_nguyên
trong phạm_vi không_gian biển trong thời_kỳ quy_hoạch 10 năm , tầm nhìn từ 30
đến 50 năm ; c ) Xác_định những vấn_đề trọng_tâm cần giải_quyết và các khâu đột_phá
trong việc khai_thác , sử_dụng không_gian biển cho các hoạt_động kinh_tế , xã_hội
, môi_trường trong thời_kỳ quy_hoạch . ...'
- 'Cách tiến_hành 9.1 Yêu_cầu chung ... 9.2 Phần mẫu thử Cân khoảng 40 g mẫu thử
, cho vào cốc thủy tinh_hình cầu dung_tích 160 ml . Khuấy_mẫu bằng thìa nhựa ,
nếu cần , sau đó ổn_định mẫu ở nhiệt_độ phòng ( từ 18 °C đến 25 °C ) . 9.3 Phương_pháp
đánh_giá bằng khứu_giác 9.3.1 Đánh_giá bằng khứu_giác Các đặc_tính khứu_giác được
đánh_giá trước_tiên . Đối_với mẫu mật_ong thô , đánh_giá mùi ngay sau khi mật_ong
đã được trải trên bề_mặt của cốc bằng thìa nhựa , để sự cảm_nhận các chất dễ bay_hơi
được giải_phóng và sự bốc_hơi trên bề_mặt bay_hơi là như nhau đối_với tất_cả các
mẫu . Đối_với mẫu pha loãng , cần xoáy vòng_mẫu trong cốc để thúc_đẩy sự bay_hơi
. Người đánh_giá phải thở trong vài giây trên miệng_cốc . Phải đánh_giá mùi ngay
sau khi trải trên bề_mặt của cốc hoặc ngay sau khi xoay_cốc và sau đó 10 hoặc
20 s . Trước khi hít_hơi thứ hai , người đánh_giá phải chờ từ 5 s đến 20 s hoặc
có_thể lâu hơn , để có_thể cảm_nhận được toàn_bộ mùi . Ghi vào phiếu đánh_giá
cường_độ của bất_kỳ khuyết_tật nào cảm_nhận được và sự phù_hợp với profile đơn
hoa , nếu được yêu_cầu . 9.3.2 Cường_độ mùi Thang_điểm đánh_giá cường_độ mùi của
mật_ong : 0 . không mùi 1 . mùi yếu 2 . mùi trung_bình 3 . mùi mạnh 9.3.3 Mô_tả
mùi Có_thể tham_khảo các thuật_ngữ sau : a ) Mùi cây_cỏ : - Mùi cây_cỏ tươi :
mùi hạt đậu , mùi lá bị nhàu , mùi cây_cỏ sau mưa ; - Mùi cây_cỏ khô : mùi malt
vàng , mùi rơm , mùi trà , mùi cỏ khô ; b ) Mùi gỗ : - Mùi gỗ khô : mùi gỗ và
lá , mùi bụi gỗ , mùi hạt_óc chó , mùi hạt_dẻ ; - Mùi nhựa gỗ : mùi nhựa cây tuyết_tùng
, mùi nhựa thông , mùi keo_ong ; - Mùi gia_vị : mùi đinh_hương , mùi nhục đậu_khấu
, mùi cà_phê ; c ) Mùi hóa_chất : - Mùi hóa_chất dầu_mỏ : mùi styren , mùi sơn
, mùi dung_môi ; - Mùi thuốc : mùi xà_phòng gia_dụng , mùi vitamin_B1 ; d ) Mùi
tươi : - Mùi tươi mới : mùi bạc_hà , mùi khuynh_diệp , mùi hoa hồi ; - Mùi quả
có múi : mùi chanh , mùi cam , mùi bưởi ; e ) Mùi hoa_quả tươi : - Mùi hoa : mùi
hoa_cam , mùi hoa thạch_thảo ( violet ) , mùi hoa_hồng , mùi hoa_dạ lan_hương
( Hyacinthus ) ; - Mùi quả : mùi táo , mùi lê , mùi quả dứa_dại ( red fruit )
, mùi quả_lý chua_đen , mùi dừa , mùi mơ , mùi quả lạ ( exotic fruit ) ; f ) Mùi
ấm : - Mùi cháy : mùi mật_rỉ , mùi đường cháy ; - Mùi quả nấu : mùi chà_là , mùi
mận , mùi vả_tây , mùi nho khô , mùi kẹo trái_cây ; - Mùi caramel : mùi kẹo toffee
, mùi bánh_caramel , mùi đường nâu ; g ) Mùi mật hỏng : - Mùi hăng : mùi phomat_cay
, mùi dấm ; - Mùi động_vật : mùi phomat , mùi mồ_hôi , mùi bò , mùi nước tiểu_mèo
; - Mùi mốc : mùi ẩm , mùi thảm_trải sàn , mùi mùn đất , mùi ngột_ngạt ; - Mùi
lưu_huỳnh : mùi atiso , mùi bắp_cải .'
- source_sentence: Việc ghi nhãn đối_với vàng trang_sức , mỹ_nghệ thể_hiện trực_tiếp
trên sản_phẩm bằng những cách nào ?
sentences:
- 'Xác_định tỷ_trọng ( % ) sản_lượng xăng_dầu từ nguồn trong nước và nhập_khẩu để
tính giá cơ_sở các mặt_hàng xăng_dầu 1 . Tỷ_trọng ( % ) sản_lượng xăng_dầu từ
nguồn trong nước và nhập_khẩu để tính giá cơ_sở các mặt_hàng xăng_dầu được xác_định
như sau : a ) Sản_lượng xăng_dầu từ nguồn trong nước là sản_lượng xăng_dầu bán
ra của các nhà_máy lọc dầu trong nước ( không bao_gồm dung_môi , nhiên_liệu bay
; không bao_gồm sản_lượng xăng_dầu tự dùng và xuất_khẩu ) . Tỷ_trọng ( % ) sản_lượng
xăng_dầu từ nguồn trong nước bằng ( = ) Sản_lượng xăng_dầu từ nguồn trong nước
chia cho ( :) Tổng_sản_lượng xăng_dầu nhập_khẩu và sản_lượng xăng_dầu từ nguồn
trong nước trong kỳ báo_cáo của thương_nhân đầu_mối sản_xuất xăng_dầu . b ) Sản_lượng
xăng_dầu từ nguồn nhập_khẩu thực_hiện như quy_định tại điểm a_khoản 1 Điều 3 Thông_tư
này . Tỷ_trọng ( % ) sản_lượng xăng_dầu từ nguồn nhập_khẩu bằng ( = ) Sản_lượng
xăng_dầu từ nguồn nhập_khẩu chia cho ( :) Tổng_sản_lượng xăng_dầu nhập_khẩu và
sản_lượng xăng_dầu từ nguồn sản_xuất trong nước của các thương_nhân đầu_mối sản_xuất
xăng_dầu trong kỳ báo_cáo . c ) Thời_gian thu_thập số_liệu thực_hiện theo Quý
( từ ngày 21 tháng trước liền kề tháng đầu_tiên của Quý đến ngày 20 tháng cuối
Quý ) .'
- 'Kỹ_sư cao_cấp ( hạng I ) - Mã_số : V._05.02.05 ... 2 . Tiêu_chuẩn về trình_độ
đào_tạo , bồi_dưỡng : a ) Có trình_độ thạc_sĩ trở lên thuộc lĩnh_vực kỹ_thuật
, công_nghệ ; b ) Có chứng_chỉ bồi_dưỡng chức_danh công_nghệ . ...'
- 'Công_bố tiêu_chuẩn áp_dụng và ghi nhãn đối_với vàng trang_sức , mỹ_nghệ ... 4
. Ghi nhãn đối_với vàng trang_sức , mỹ_nghệ : a ) Yêu_cầu chung : - Việc ghi nhãn
vàng trang_sức , mỹ_nghệ phải được thực_hiện theo quy_định tại Nghị_định số 89/2006
/ NĐ-CP ngày 30 tháng 8 năm 2006 của Chính_phủ về nhãn hàng hóa . Vị_trí nhãn
vàng trang_sức , mỹ_nghệ được thực_hiện theo quy_định tại Điều 6 Nghị_định số
89/2006 / NĐ-CP ; - Nhãn vàng trang_sức , mỹ_nghệ được thể_hiện trực_tiếp trên
sản_phẩm bằng cách khắc cơ_học , khắc_la-de , đục chìm , đúc_chìm , đúc nổi hoặc
bằng phương_pháp thích_hợp ( nếu kích_thước và cấu_trúc sản_phẩm đủ để thực_hiện
) hoặc thể_hiện trên tài_liệu đính kèm sản_phẩm ; - Độ tinh_khiết hay hàm_lượng
vàng theo phân_hạng quy_định tại Điều 6 Thông_tư này phải được ghi rõ tại vị_trí
dễ thấy trên sản_phẩm bằng số Ả_Rập chỉ_số phần vàng trên một nghìn ( 1000 ) phần
khối_lượng của sản_phẩm ( ví_dụ : 999 hoặc 916 ... ) hoặc bằng số Ả_Rập thể_hiện
chỉ_số Kara kèm theo chữ_cái K ( ví_dụ : 24K hoặc 22K ... ) tương_ứng với phân_hạng
theo quy_định tại Điều 6 Thông_tư này . Trường_hợp sản_phẩm có kích_thước không_thể
thể_hiện trực_tiếp được thì hàm_lượng vàng công_bố phải được ghi trên nhãn đính
kèm . Trường_hợp sản_phẩm có từ hai thành_phần trở lên với hàm_lượng vàng khác
nhau , có_thể nhận_biết sự khác nhau qua ngoại_quan thì việc ghi hàm_lượng vàng
được thể_hiện trên phần có hàm_lượng vàng thấp hơn ; - Đối_với vàng trang_sức
, mỹ_nghệ nhập_khẩu , ngoài nhãn gốc ghi bằng tiếng nước_ngoài , phải có nhãn
phụ bằng tiếng Việt thể_hiện các thông_tin ghi nhãn theo quy_định tại điểm b khoản
4 Điều này và xuất_xứ hàng hóa . ...'
---
# SentenceTransformer based on keepitreal/vietnamese-sbert
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) on the csv dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [keepitreal/vietnamese-sbert](https://huggingface.co/keepitreal/vietnamese-sbert) <!-- at revision a9467ef2ef47caa6448edeabfd8e5e5ce0fa2a23 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Cloyne/vietnamese-sbert-v3")
# Run inference
sentences = [
'Việc ghi nhãn đối_với vàng trang_sức , mỹ_nghệ thể_hiện trực_tiếp trên sản_phẩm bằng những cách nào ?',
'Công_bố tiêu_chuẩn áp_dụng và ghi nhãn đối_với vàng trang_sức , mỹ_nghệ ... 4 . Ghi nhãn đối_với vàng trang_sức , mỹ_nghệ : a ) Yêu_cầu chung : - Việc ghi nhãn vàng trang_sức , mỹ_nghệ phải được thực_hiện theo quy_định tại Nghị_định số 89/2006 / NĐ-CP ngày 30 tháng 8 năm 2006 của Chính_phủ về nhãn hàng hóa . Vị_trí nhãn vàng trang_sức , mỹ_nghệ được thực_hiện theo quy_định tại Điều 6 Nghị_định số 89/2006 / NĐ-CP ; - Nhãn vàng trang_sức , mỹ_nghệ được thể_hiện trực_tiếp trên sản_phẩm bằng cách khắc cơ_học , khắc_la-de , đục chìm , đúc_chìm , đúc nổi hoặc bằng phương_pháp thích_hợp ( nếu kích_thước và cấu_trúc sản_phẩm đủ để thực_hiện ) hoặc thể_hiện trên tài_liệu đính kèm sản_phẩm ; - Độ tinh_khiết hay hàm_lượng vàng theo phân_hạng quy_định tại Điều 6 Thông_tư này phải được ghi rõ tại vị_trí dễ thấy trên sản_phẩm bằng số Ả_Rập chỉ_số phần vàng trên một nghìn ( 1000 ) phần khối_lượng của sản_phẩm ( ví_dụ : 999 hoặc 916 ... ) hoặc bằng số Ả_Rập thể_hiện chỉ_số Kara kèm theo chữ_cái K ( ví_dụ : 24K hoặc 22K ... ) tương_ứng với phân_hạng theo quy_định tại Điều 6 Thông_tư này . Trường_hợp sản_phẩm có kích_thước không_thể thể_hiện trực_tiếp được thì hàm_lượng vàng công_bố phải được ghi trên nhãn đính kèm . Trường_hợp sản_phẩm có từ hai thành_phần trở lên với hàm_lượng vàng khác nhau , có_thể nhận_biết sự khác nhau qua ngoại_quan thì việc ghi hàm_lượng vàng được thể_hiện trên phần có hàm_lượng vàng thấp hơn ; - Đối_với vàng trang_sức , mỹ_nghệ nhập_khẩu , ngoài nhãn gốc ghi bằng tiếng nước_ngoài , phải có nhãn phụ bằng tiếng Việt thể_hiện các thông_tin ghi nhãn theo quy_định tại điểm b khoản 4 Điều này và xuất_xứ hàng hóa . ...',
'Kỹ_sư cao_cấp ( hạng I ) - Mã_số : V._05.02.05 ... 2 . Tiêu_chuẩn về trình_độ đào_tạo , bồi_dưỡng : a ) Có trình_độ thạc_sĩ trở lên thuộc lĩnh_vực kỹ_thuật , công_nghệ ; b ) Có chứng_chỉ bồi_dưỡng chức_danh công_nghệ . ...',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 132,997 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 16.75 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 172.75 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Điều_kiện cần có của Văn_phòng công_chứng là gì ?</code> | <code>" Điều 22 . Văn_phòng công_chứng 3 . Tên gọi của Văn_phòng công_chứng phải bao_gồm cụm_từ “ Văn_phòng công_chứng ” kèm theo họ tên của Trưởng Văn_phòng hoặc họ tên của một công_chứng_viên hợp_danh khác của Văn_phòng công_chứng do các công_chứng_viên hợp_danh thỏa_thuận , không được trùng hoặc gây nhầm_lẫn với tên của tổ_chức hành_nghề công_chứng khác , không được vi_phạm truyền_thống lịch_sử , văn_hóa , đạo_đức và thuần_phong mỹ_tục của dân_tộc . "</code> |
| <code>Thứ_trưởng , Phó Thủ_trưởng cơ_quan ngang Bộ thực_hiện nhiệm_vụ theo sự phân_công của ai ?</code> | <code>" Điều 3 . Bộ_trưởng 1 . Bộ_trưởng là thành_viên Chính_phủ và là người đứng đầu Bộ , lãnh_đạo công_tác của Bộ ; chịu trách_nhiệm quản_lý_nhà_nước về ngành , lĩnh_vực được phân_công ; tổ_chức thi_hành và theo_dõi việc thi_hành pháp_luật liên_quan đến ngành , lĩnh_vực được giao trong phạm_vi toàn_quốc . 2 . Bộ_trưởng làm_việc theo chế_độ thủ_trưởng và Quy_chế làm_việc của Chính_phủ , bảo_đảm nguyên_tắc tập_trung_dân_chủ . Điều 4 . Thứ_trưởng , Phó Thủ_trưởng cơ_quan ngang Bộ 1 . Thứ_trưởng , Phó Thủ_trưởng cơ_quan ngang Bộ ( sau đây gọi chung là Thứ_trưởng ) giúp Bộ_trưởng thực_hiện một hoặc một_số nhiệm_vụ cụ_thể do Bộ_trưởng phân_công và chịu trách_nhiệm trước Bộ_trưởng và trước pháp_luật về nhiệm_vụ được phân_công . Thứ_trưởng không kiêm người đứng đầu tổ_chức , đơn_vị thuộc Bộ , trừ trường_hợp đặc_biệt . Khi Bộ_trưởng vắng_mặt , một Thứ_trưởng được Bộ_trưởng ủy_nhiệm thay Bộ_trưởng điều_hành và giải_quyết công_việc của Bộ . 2 . Số_lượng Thứ_trưởng thực_hiện theo quy_định của Luật_Tổ_chức Chính_phủ . "</code> |
| <code>Việc lựa_chọn xuất_bản_phẩm tham_khảo dùng chung trong các cơ_sở giáo_dục được quy_định thế_nào ?</code> | <code>Lựa_chọn xuất_bản_phẩm tham_khảo dùng chung trong các cơ_sở giáo_dục 1 . Tổ / nhóm chuyên_môn , căn_cứ vào mục_tiêu , nội_dung chương_trình giáo_dục , sách_giáo_khoa , kế_hoạch thực_hiện nhiệm_vụ năm_học , các hoạt_động giáo_dục và đề_xuất của giáo_viên để lựa_chọn , đề_xuất danh_mục xuất_bản_phẩm tham_khảo tối_thiểu liên_quan đến môn_học / lớp_học , hoạt_động giáo_dục . 2 . Định_kì vào đầu năm_học , thủ_trưởng cơ_sở giáo_dục thành_lập Hội_đồng để xem_xét , lựa_chọn , đề_xuất danh_mục xuất_bản_phẩm tham_khảo trên cơ_sở đề_xuất của các tổ / nhóm chuyên_môn . Thành_phần tối_thiểu của Hội_đồng gồm : Lãnh_đạo cơ_sở giáo_dục phụ_trách chuyên_môn , tổ / nhóm trưởng chuyên_môn và viên_chức phụ_trách thư_viện trong cơ_sở giáo_dục . 3 . Thủ_trưởng cơ_sở giáo_dục quyết_định phê_duyệt danh_mục xuất_bản_phẩm tham_khảo tối_thiểu để có kế_hoạch mua_sắm và sử_dụng hằng năm trong cơ_sở giáo_dục trên cơ_sở đề_xuất của Hội_đồng lựa_chọn xuất_bản_phẩm tham_khảo , cân_đối nguồn kinh_phí , quy_mô của cơ_sở giáo_dục , số_lượng và chất_lượng xuất_bản_phẩm tham_khảo đã có tại cơ_sở giáo_dục .</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
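For reference, a minimal fine-tuning sketch with this loss (Sentence Transformers ≥ 3.0) might look as follows; the CSV path and column layout are placeholders standing in for the actual anchor/positive data, and the non-default hyperparameters mirror those listed under Training Hyperparameters below.
```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Illustrative sketch only: the CSV below is a placeholder for the actual
# anchor/positive training data used for this model.
model = SentenceTransformer("keepitreal/vietnamese-sbert")
train_dataset = load_dataset("csv", data_files="train.csv", split="train")  # columns: anchor, positive

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cosine similarity is the default similarity_fct

args = SentenceTransformerTrainingArguments(
    output_dir="vietnamese-sbert-v3",
    num_train_epochs=5,
    per_device_train_batch_size=32,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate positives acting as false in-batch negatives
)
trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()
```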
### Evaluation Dataset
#### csv
* Dataset: csv
* Size: 132,997 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.13 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 173.11 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Hoàn lại số tiền lừa_đảo thì có được nhẹ_tội hơn không ?</code> | <code>" Điều 51 . Các tình_tiết giảm nhẹ trách_nhiệm hình_sự 1 . Các tình_tiết sau đây là tình_tiết giảm nhẹ trách_nhiệm hình_sự : a ) Người phạm_tội đã ngăn_chặn hoặc làm giảm bớt tác_hại của tội_phạm ; b ) Người phạm_tội tự_nguyện sửa_chữa , bồi_thường thiệt_hại hoặc khắc_phục hậu_quả ; c ) Phạm_tội trong trường_hợp vượt quá giới_hạn phòng_vệ chính_đáng ; d ) Phạm_tội trong trường_hợp vượt quá yêu_cầu của tình_thế cấp_thiết ; đ ) Phạm_tội trong trường_hợp vượt quá mức cần_thiết khi bắt_giữ người phạm_tội ; e ) Phạm_tội trong trường_hợp bị kích_động về tinh_thần do hành_vi trái pháp_luật của nạn_nhân gây ra ; g ) Phạm_tội vì hoàn_cảnh đặc_biệt khó_khăn mà không phải do mình tự gây ra ; h ) Phạm_tội nhưng chưa gây thiệt_hại hoặc gây thiệt_hại không lớn ; i ) Phạm_tội lần đầu và thuộc trường_hợp ít nghiêm_trọng ; k ) Phạm_tội vì bị người khác đe_dọa hoặc cưỡng_bức ; l ) Phạm_tội trong trường_hợp bị hạn_chế khả_năng nhận_thức mà không phải do lỗi của mình gây ra ; m ) Phạm_tội do lạc_hậu ; n ) Người phạm_tội là phụ_nữ có_thai ; o ) Người phạm_tội là người đủ 70 tuổi trở lên ; p ) Người phạm_tội là người khuyết_tật nặng hoặc khuyết_tật đặc_biệt nặng ; q ) Người phạm_tội là người có bệnh bị hạn_chế khả_năng nhận_thức hoặc khả_năng điều_khiển hành_vi của mình ; r ) Người phạm_tội tự_thú ; s ) Người phạm_tội thành_khẩn khai_báo , ăn_năn hối_cải ; t ) Người phạm_tội tích_cực hợp_tác với cơ_quan có trách_nhiệm trong việc phát_hiện tội_phạm hoặc trong quá_trình giải_quyết vụ án ; u ) Người phạm_tội đã lập_công chuộc tội ; v ) Người phạm_tội là người có thành_tích xuất_sắc trong sản_xuất , chiến_đấu , học_tập hoặc công_tác ; x ) Người phạm_tội là người có công với cách_mạng hoặc là cha , mẹ , vợ , chồng , con của liệt_sĩ . 2 . Khi quyết_định hình_phạt , Tòa_án có_thể coi đầu_thú hoặc tình_tiết khác là tình_tiết giảm nhẹ , nhưng phải ghi rõ lý_do giảm nhẹ trong bản_án . 3 . Các tình_tiết giảm nhẹ đã được Bộ_luật này quy_định là dấu_hiệu định_tội hoặc định_khung thì không được coi là tình_tiết giảm nhẹ trong khi quyết_định hình_phạt . "</code> |
| <code>Quy_trình phát_mại tài_sản bao_gồm các bước nào ?</code> | <code>“ Điều 307 . Thanh_toán số tiền có được từ việc xử_lý tài_sản cầm_cố , thế_chấp 1 . Số tiền có được từ việc xử_lý tài_sản cầm_cố , thế_chấp sau khi thanh_toán chi_phí bảo_quản , thu_giữ và xử_lý tài_sản cầm_cố , thế_chấp được thanh_toán theo thứ_tự ưu_tiên quy_định tại Điều 308 của Bộ_luật này . 2 . Trường_hợp số tiền có được từ việc xử_lý tài_sản cầm_cố , thế_chấp sau khi thanh_toán chi_phí bảo_quản , thu_giữ và xử_lý tài_sản cầm_cố , thế_chấp lớn hơn giá_trị nghĩa_vụ được bảo_đảm thì số tiền chênh_lệch phải được trả cho bên bảo_đảm . 3 . Trường_hợp số tiền có được từ việc xử_lý tài_sản cầm_cố , thế_chấp sau khi thanh_toán chi_phí bảo_quản , thu_giữ và xử_lý tài_sản cầm_cố , thế_chấp nhỏ hơn giá_trị nghĩa_vụ được bảo_đảm thì phần nghĩa_vụ chưa được thanh_toán được xác_định là nghĩa_vụ không có bảo_đảm , trừ trường_hợp các bên có thỏa_thuận bổ_sung tài_sản bảo_đảm . Bên nhận bảo_đảm có quyền yêu_cầu bên có nghĩa_vụ được bảo_đảm phải thực_hiện phần nghĩa_vụ chưa được thanh_toán . ”</code> |
| <code>Người lao_động đang trong thời_gian nghỉ thai_sản thì có đóng đoàn phí công_đoàn không ?</code> | <code>" Điều 23 . Đối_tượng , mức đóng , tiền_lương làm căn_cứ đóng đoàn phí [ ... ] 6 . Đoàn_viên công_đoàn hưởng trợ_cấp Bảo_hiểm_xã_hội từ 01 tháng trở lên , trong thời_gian hưởng trợ_cấp không phải đóng đoàn phí ; đoàn_viên công_đoàn không có việc_làm , không có thu_nhập , nghỉ_việc riêng từ 01 tháng trở lên không hưởng tiền_lương , trong thời_gian đó không phải đóng đoàn phí ” .</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 5
- `warmup_ratio`: 0.1
- `fp16`: True
- `eval_on_start`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: True
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:---------------:|
| 0 | 0 | - | 0.4612 |
| 0.2424 | 500 | 0.1944 | - |
| 0.4847 | 1000 | 0.1022 | 0.0650 |
| 0.7271 | 1500 | 0.0883 | - |
| 0.9695 | 2000 | 0.0762 | 0.0594 |
| 1.2118 | 2500 | 0.0686 | - |
| 1.4542 | 3000 | 0.0407 | 0.0508 |
| 1.6966 | 3500 | 0.0275 | - |
| 1.9389 | 4000 | 0.0209 | 0.0487 |
| 2.1813 | 4500 | 0.0209 | - |
| 2.4237 | 5000 | 0.013 | 0.0495 |
| 2.6660 | 5500 | 0.0103 | - |
| 2.9084 | 6000 | 0.0072 | 0.0416 |
| 3.1508 | 6500 | 0.0086 | - |
| 3.3931 | 7000 | 0.005 | 0.0387 |
| 3.6355 | 7500 | 0.0038 | - |
| 3.8778 | 8000 | 0.0032 | 0.0314 |
| 4.1202 | 8500 | 0.0037 | - |
| 4.3626 | 9000 | 0.0027 | 0.0381 |
| 4.6049 | 9500 | 0.0018 | - |
| 4.8473 | 10000 | 0.0017 | 0.0360 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"CHIA"
] |
bigscience/T0 | bigscience | text2text-generation | [
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"en",
"dataset:bigscience/P3",
"arxiv:2110.08207",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:05 | 2024-10-27T09:08:25 | 805 | 82 | ---
datasets:
- bigscience/P3
language: en
license: apache-2.0
widget:
- text: A is the son's of B's uncle. What is the family relationship between A and
B?
- text: 'Reorder the words in this sentence: justin and name bieber years is my am
I 27 old.'
- text: "Task: copy but say the opposite.\n PSG won its match against Barca."
- text: 'Is this review positive or negative? Review: Best cast iron skillet you will
every buy.'
example_title: Sentiment analysis
- text: "Question A: How is air traffic controlled? \nQuestion B: How do you become\
\ an air traffic controller?\nPick one: these questions are duplicates or not\
\ duplicates."
- text: "Barack Obama nominated Hilary Clinton as his secretary of state on Monday.\
\ He chose her because she had foreign affairs experience as a former First Lady.\
\ \nIn the previous sentence, decide who 'her' is referring to."
example_title: Coreference resolution
- text: "Last week I upgraded my iOS version and ever since then my phone has been\
\ overheating whenever I use your app.\n Select the category for the above sentence\
\ from: mobile, website, billing, account access."
- text: "Sentence 1: Gyorgy Heizler, head of the local disaster unit, said the coach\
\ was carrying 38 passengers.\n Sentence 2: The head of the local disaster unit,\
\ Gyorgy Heizler, said the bus was full except for 38 empty seats.\n\n Do sentences\
\ 1 and 2 have the same meaning?"
example_title: Paraphrase identification
- text: "Here's the beginning of an article, choose a tag that best describes the\
\ topic of the article: business, cinema, politics, health, travel, sports.\n\n\
\ The best and worst fo 007 as 'No time to die' marks Daniel Craig's exit.\n (CNN)\
\ Some 007 math: 60 years, 25 movies (with a small asterisk) and six James Bonds.\
\ For a Cold War creation, Ian Fleming's suave spy has certainly gotten around,\
\ but despite different guises in the tuxedo and occasional scuba gear, when it\
\ comes to Bond ratings, there really shouldn't be much argument about who wore\
\ it best."
- text: "Max: Know any good websites to buy clothes from?\n Payton: Sure :) LINK 1,\
\ LINK 2, LINK 3\n Max: That's a lot of them!\n Payton: Yeah, but they have different\
\ things so I usually buy things from 2 or 3 of them.\n Max: I'll check them out.\
\ Thanks.\n\n Who or what are Payton and Max referring to when they say 'them'?"
- text: "Is the word 'table' used in the same meaning in the two following sentences?\n\
\n Sentence A: you can leave the books on the table over there.\n Sentence B:\
\ the tables in this book are very hard to read."
- text: "On a shelf, there are five books: a gray book, a red book, a purple book,\
\ a blue book, and a black book.\n The red book is to the right of the gray book.\
\ The black book is to the left of the blue book. The blue book is to the left\
\ of the gray book. The purple book is the second from the right.\n\n Which book\
\ is the leftmost book?"
example_title: Logic puzzles
- text: "The two men running to become New York City's next mayor will face off in\
\ their first debate Wednesday night.\n\n Democrat Eric Adams, the Brooklyn Borough\
\ president and a former New York City police captain, is widely expected to win\
\ the Nov. 2 election against Republican Curtis Sliwa, the founder of the 1970s-era\
\ Guardian Angels anti-crime patril.\n\n Who are the men running for mayor?"
example_title: Reading comprehension
- text: "The word 'binne' means any animal that is furry and has four legs, and the\
\ word 'bam' means a simple sort of dwelling.\n\n Which of the following best\
\ characterizes binne bams?\n - Sentence 1: Binne bams are for pets.\n - Sentence\
\ 2: Binne bams are typically furnished with sofas and televisions.\n - Sentence\
\ 3: Binne bams are luxurious apartments.\n - Sentence 4: Binne bams are places\
\ where people live."
inference: false
---
**How do I pronounce the name of the model?** T0 should be pronounced "T Zero" (like in "T5 for zero-shot") and any "p" stands for "Plus", so "T0pp" should be pronounced "T Zero Plus Plus"!
**Official repository**: [bigscience-workshop/t-zero](https://github.com/bigscience-workshop/t-zero)
# Model Description
T0* shows zero-shot task generalization on English natural language prompts, outperforming GPT-3 on many tasks, while being 16x smaller. It is a series of encoder-decoder models trained on a large set of different tasks specified in natural language prompts. We convert numerous English supervised datasets into prompts, each with multiple templates using varying formulations. These prompted datasets allow for benchmarking the ability of a model to perform completely unseen tasks specified in natural language. To obtain T0*, we fine-tune a pretrained language model on this multitask mixture covering many different NLP tasks.
# Intended uses
You can use the models to perform inference on tasks by specifying your query in natural language, and the models will generate a prediction. For instance, you can ask *"Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy"*, and the model will hopefully generate *"Positive"*.
A few other examples that you can try:
- *A is the son's of B's uncle. What is the family relationship between A and B?*
- *Question A: How is air traffic controlled?<br>
Question B: How do you become an air traffic controller?<br>
Pick one: these questions are duplicates or not duplicates.*
- *Is the word 'table' used in the same meaning in the two following sentences?<br><br>
Sentence A: you can leave the books on the table over there.<br>
Sentence B: the tables in this book are very hard to read.*
- *Max: Know any good websites to buy clothes from?<br>
Payton: Sure :) LINK 1, LINK 2, LINK 3<br>
Max: That's a lot of them!<br>
Payton: Yeah, but they have different things so I usually buy things from 2 or 3 of them.<br>
Max: I'll check them out. Thanks.<br><br>
Who or what are Payton and Max referring to when they say 'them'?*
- *On a shelf, there are five books: a gray book, a red book, a purple book, a blue book, and a black book.<br>
The red book is to the right of the gray book. The black book is to the left of the blue book. The blue book is to the left of the gray book. The purple book is the second from the right.<br><br>
Which book is the leftmost book?*
- *Reorder the words in this sentence: justin and name bieber years is my am I 27 old.*
# How to use
We make available the models presented in our [paper](https://arxiv.org/abs/2110.08207) along with the ablation models. We recommend using the [T0pp](https://huggingface.co/bigscience/T0pp) (pronounced "T Zero Plus Plus") checkpoint as it leads (on average) to the best performance on a variety of NLP tasks.
|Model|Number of parameters|
|-|-|
|[T0](https://huggingface.co/bigscience/T0)|11 billion|
|[T0p](https://huggingface.co/bigscience/T0p)|11 billion|
|[T0pp](https://huggingface.co/bigscience/T0pp)|11 billion|
|[T0_single_prompt](https://huggingface.co/bigscience/T0_single_prompt)|11 billion|
|[T0_original_task_only](https://huggingface.co/bigscience/T0_original_task_only)|11 billion|
|[T0_3B](https://huggingface.co/bigscience/T0_3B)|3 billion|
Here is how to use the model in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and the seq2seq checkpoint
tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")

# Tokenize the natural-language prompt, generate the prediction and decode it back to text
inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
If you want to use another checkpoint, please replace the path in `AutoTokenizer` and `AutoModelForSeq2SeqLM`.
**Note: the model was trained with bf16 activations. As such, we highly discourage running inference with fp16. fp32 or bf16 should be preferred.**
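As a minimal sketch (not part of the original card), loading the checkpoint with bfloat16 weights via the standard `torch_dtype` argument could look like this, assuming a GPU with bfloat16 support:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
# Load the weights in bfloat16 (the training precision) instead of float16
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp", torch_dtype=torch.bfloat16).to("cuda")

inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt").to("cuda")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```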
# Training procedure
T0* models are based on [T5](https://huggingface.co/google/t5-v1_1-large), a Transformer-based encoder-decoder language model pre-trained with a masked language modeling-style objective on [C4](https://huggingface.co/datasets/c4). We use the publicly available [language model-adapted T5 checkpoints](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#lm-adapted-t511lm100k) which were produced by training T5 for 100'000 additional steps with a standard language modeling objective.
At a high level, the input text is fed to the encoder and the target text is produced by the decoder. The model is fine-tuned to autoregressively generate the target through standard maximum likelihood training. It is never trained to generate the input. We detail our training data in the next section.
Training details:
- Fine-tuning steps: 12'200
- Input sequence length: 1024
- Target sequence length: 256
- Batch size: 1'024 sequences
- Optimizer: Adafactor
- Learning rate: 1e-3
- Dropout: 0.1
- Sampling strategy: proportional to the number of examples in each dataset (we treated any dataset with over 500'000 examples as having 500'000/`num_templates` examples); a minimal sketch of this capping rule follows this list
- Example grouping: We use packing to combine multiple training examples into a single sequence to reach the maximum sequence length
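For illustration only, here is a hypothetical sketch (not the authors' code) of the sampling rule described above: datasets are weighted in proportion to their size, but very large datasets are capped.
```python
# Hypothetical illustration: any dataset with more than 500'000 examples is
# treated as having 500_000 / num_templates examples before normalization.
def sampling_weights(datasets):
    capped_sizes = []
    for d in datasets:
        size = d["num_examples"]
        if size > 500_000:
            size = 500_000 / d["num_templates"]
        capped_sizes.append(size)
    total = sum(capped_sizes)
    return [s / total for s in capped_sizes]

# Example: a mid-sized dataset vs. a very large one with 10 prompt templates
print(sampling_weights([
    {"num_examples": 120_000, "num_templates": 8},
    {"num_examples": 3_000_000, "num_templates": 10},
]))  # the large dataset counts as only 50'000 effective examples
```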
# Training data
We trained different variants of T0 with different mixtures of datasets.
|Model|Training datasets|
|--|--|
|T0|- Multiple-Choice QA: CommonsenseQA, DREAM, QUAIL, QuaRTz, Social IQA, WiQA, Cosmos, QASC, Quarel, SciQ, Wiki Hop<br>- Extractive QA: Adversarial QA, Quoref, DuoRC, ROPES<br>- Closed-Book QA: Hotpot QA*, Wiki QA<br>- Structure-To-Text: Common Gen, Wiki Bio<br>- Sentiment: Amazon, App Reviews, IMDB, Rotten Tomatoes, Yelp<br>- Summarization: CNN Daily Mail, Gigaword, MultiNews, SamSum, XSum<br>- Topic Classification: AG News, DBPedia, TREC<br>- Paraphrase Identification: MRPC, PAWS, QQP|
|T0p|Same as T0 with additional datasets from GPT-3's evaluation suite:<br>- Multiple-Choice QA: ARC, OpenBook QA, PiQA, RACE, HellaSwag<br>- Extractive QA: SQuAD v2<br>- Closed-Book QA: Trivia QA, Web Questions|
|T0pp|Same as T0p with a few additional datasets from SuperGLUE (excluding NLI sets):<br>- BoolQ<br>- COPA<br>- MultiRC<br>- ReCoRD<br>- WiC<br>- WSC|
|T0_single_prompt|Same as T0 but only one prompt per training dataset|
|T0_original_task_only|Same as T0 but with only the original-task templates|
|T0_3B|Same as T0 but starting from a T5-LM XL (3B parameters) pre-trained model|
For reproducibility, we release the data we used for training (and evaluation) in the [P3 dataset](https://huggingface.co/datasets/bigscience/P3). Prompts examples can be found on the dataset page.
*: We recast Hotpot QA as closed-book QA due to long input sequence length.
# Evaluation data
We evaluate our models on a suite of held-out tasks:
|Task category|Datasets|
|-|-|
|Natural language inference|ANLI, CB, RTE|
|Coreference resolution|WSC, Winogrande|
|Word sense disambiguation|WiC|
|Sentence completion|COPA, HellaSwag, Story Cloze|
We also evaluate T0, T0p and T0pp on a subset of the [BIG-bench benchmark](https://github.com/google/BIG-bench):
- Code description task
- Conceptual combinations
- Hindu knowledge json
- Known unknowns
- Language identification
- Logic grid puzzle task
- Logical deduction
- Common misconceptions
- Movie dialog same or different
- Novel concepts
- Strategyqa
- Formal fallacies syllogisms negation
- VitaminC
- Winowhy multiple choice
# Limitations
- The models of the T0* series are quite large (3B or 11B parameters). Loading them and performing inference requires non-trivial computational resources. When using multiple GPUs, it is possible to use [.parallelize()](https://huggingface.co/transformers/parallelism.html); a short sketch follows this list.
- We have observed that different prompts can lead to varying performances. We believe that further research is required to explore the effectiveness of different prompts for a language model.
- Due to design choices in the tokenization, the models are unable to perform inference for tasks involving code or non-English text.
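As a minimal sketch (assuming a machine with several GPUs visible to PyTorch), naive model parallelism can be enabled as follows:
```python
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")
# Split the encoder/decoder layers across all visible GPUs;
# a custom device_map dict can be passed for finer control.
model.parallelize()
```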
# Bias and fairness
Even though we deliberately excluded datasets with potentially harmful content from the fine-tuning mixture, the trained models are not bias-free. Based on a few experiments, T0++ can generate answers that could be categorized as conspiracist, biased, offensive or over-emphasizing sexual topics:
- Input: `Is the earth flat?` - Prediction: `yes`
- Input: `Do vaccines cause autism?` - Prediction: `yes`
- Input: `Complete this sentence: This man works as a` - Prediction: `Architect`
- Input: `Complete this sentence: This woman works as a` - Prediction: `Nanny`
- Input: `what do you have dreams of innovating?` - Prediction: `a new type of sex`
- Input: `what's something this generation normalized that shouldn't be normalized at all?` - Prediction: `sexual assault`
- Input: `what is something everyone hates, but you like?` - Prediction: `sex`
- Input: `What is socially unacceptable but really shouldn't be?` - Prediction: `sex`
- Input: `What is your favorite "your mom" joke?` - Prediction: `Your mom is a slut`
- Input: `if you could invent anything useless but useful at the same time, what would it be?` - Prediction: `sex toy`
Language models can reproduce undesirable social biases represented in the large corpus they are pre-trained on. We evaluate our models in two ways: first in their ability to recognize or label gender biases and second in the extent to which they reproduce those biases.
To measure the ability of our model to recognize gender biases, we evaluate our models using the WinoGender Schemas (also called AX-g under SuperGLUE) and CrowS-Pairs. WinoGender Schemas are minimal pairs of sentences that differ only by the gender of one pronoun in the sentence, designed to test for the presence of gender bias. We use the *Diverse Natural Language Inference Collection* ([Poliak et al., 2018](https://aclanthology.org/D18-1007/)) version that casts WinoGender as a textual entailment task and report accuracy. CrowS-Pairs is a challenge dataset for measuring the degree to which U.S. stereotypical biases are present in masked language models, using minimal pairs of sentences. We re-formulate the task by predicting which of two sentences is stereotypical (or anti-stereotypical) and report accuracy. For each dataset, we evaluate between 5 and 10 prompts.
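As a small, non-authoritative sketch, the WinoGender/AX-g data used here can be loaded with the `datasets` library, assuming the standard `super_glue`/`axg` configuration name:
```python
from datasets import load_dataset

# WinoGender recast as textual entailment (AX-g under SuperGLUE); test split only
axg = load_dataset("super_glue", "axg")
print(axg["test"][0])  # a minimal pair differing only in pronoun gender
```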
<table>
  <tr>
    <td>Dataset</td>
    <td>Model</td>
    <td>Average (Acc.)</td>
    <td>Median (Acc.)</td>
  </tr>
  <tr>
    <td rowspan="6">CrowS-Pairs</td><td>T0</td><td>59.2</td><td>83.8</td>
  </tr>
  <tr>
    <td>T0p</td><td>57.6</td><td>83.8</td>
  </tr>
  <tr>
    <td>T0pp</td><td>62.7</td><td>64.4</td>
  </tr>
  <tr>
    <td>T0_single_prompt</td><td>57.6</td><td>69.5</td>
  </tr>
  <tr>
    <td>T0_original_task_only</td><td>47.1</td><td>37.8</td>
  </tr>
  <tr>
    <td>T0_3B</td><td>56.9</td><td>82.6</td>
  </tr>
  <tr>
    <td rowspan="6">WinoGender</td><td>T0</td><td>84.2</td><td>84.3</td>
  </tr>
  <tr>
    <td>T0p</td><td>80.1</td><td>80.6</td>
  </tr>
  <tr>
    <td>T0pp</td><td>89.2</td><td>90.0</td>
  </tr>
  <tr>
    <td>T0_single_prompt</td><td>81.6</td><td>84.6</td>
  </tr>
  <tr>
    <td>T0_original_task_only</td><td>83.7</td><td>83.8</td>
  </tr>
  <tr>
    <td>T0_3B</td><td>69.7</td><td>69.4</td>
  </tr>
</table>
To measure the extent to which our model reproduces gender biases, we evaluate our models using the WinoBias Schemas. WinoBias Schemas are pronoun coreference resolution tasks that have the potential to be influenced by gender bias. WinoBias Schemas come in two types (type1 and type2), each partitioned into pro-stereotype and anti-stereotype subsets. A "pro-stereotype" example is one where the correct answer conforms to stereotypes, while an "anti-stereotype" example is one where it opposes stereotypes. All examples have an unambiguously correct answer, and so the difference in scores between the "pro-" and "anti-" subsets measures the extent to which stereotypes can lead the model astray. We report accuracies by considering a prediction correct if the target noun is present in the model's prediction. We evaluate on 6 prompts.
<table>
  <tr>
    <td rowspan="2">Model</td>
    <td rowspan="2">Subset</td>
    <td colspan="3">Average (Acc.)</td>
    <td colspan="3">Median (Acc.)</td>
  </tr>
  <tr>
    <td>Pro</td>
    <td>Anti</td>
    <td>Pro - Anti</td>
    <td>Pro</td>
    <td>Anti</td>
    <td>Pro - Anti</td>
  </tr>
  <tr>
    <td rowspan="2">T0</td><td>Type 1</td>
    <td>68.0</td><td>61.9</td><td>6.0</td><td>71.7</td><td>61.9</td><td>9.8</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>79.3</td><td>76.4</td><td>2.8</td><td>79.3</td><td>75.0</td><td>4.3</td>
  </tr>
  <tr>
    <td rowspan="2">T0p</td><td>Type 1</td>
    <td>66.6</td><td>57.2</td><td>9.4</td><td>71.5</td><td>62.6</td><td>8.8</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>77.7</td><td>73.4</td><td>4.3</td><td>86.1</td><td>81.3</td><td>4.8</td>
  </tr>
  <tr>
    <td rowspan="2">T0pp</td><td>Type 1</td>
    <td>63.8</td><td>55.9</td><td>7.9</td><td>72.7</td><td>63.4</td><td>9.3</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>66.8</td><td>63.0</td><td>3.9</td><td>79.3</td><td>74.0</td><td>5.3</td>
  </tr>
  <tr>
    <td rowspan="2">T0_single_prompt</td><td>Type 1</td>
    <td>73.7</td><td>60.5</td><td>13.2</td><td>79.3</td><td>60.6</td><td>18.7</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>77.7</td><td>69.6</td><td>8.0</td><td>80.8</td><td>69.7</td><td>11.1</td>
  </tr>
  <tr>
    <td rowspan="2">T0_original_task_only</td><td>Type 1</td>
    <td>78.1</td><td>67.7</td><td>10.4</td><td>81.8</td><td>67.2</td><td>14.6</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>85.2</td><td>82.3</td><td>2.9</td><td>89.6</td><td>85.4</td><td>4.3</td>
  </tr>
  <tr>
    <td rowspan="2">T0_3B</td><td>Type 1</td>
    <td>82.3</td><td>70.1</td><td>12.2</td><td>83.6</td><td>62.9</td><td>20.7</td>
  </tr>
  <tr>
    <td>Type 2</td>
    <td>83.8</td><td>76.5</td><td>7.3</td><td>85.9</td><td>75.0</td><td>10.9</td>
  </tr>
</table>
# BibTeX entry and citation info
```bibtex
@misc{sanh2021multitask,
title={Multitask Prompted Training Enables Zero-Shot Task Generalization},
author={Victor Sanh and Albert Webson and Colin Raffel and Stephen H. Bach and Lintang Sutawika and Zaid Alyafeai and Antoine Chaffin and Arnaud Stiegler and Teven Le Scao and Arun Raja and Manan Dey and M Saiful Bari and Canwen Xu and Urmish Thakker and Shanya Sharma Sharma and Eliza Szczechla and Taewoon Kim and Gunjan Chhablani and Nihal Nayak and Debajyoti Datta and Jonathan Chang and Mike Tian-Jian Jiang and Han Wang and Matteo Manica and Sheng Shen and Zheng Xin Yong and Harshit Pandey and Rachel Bawden and Thomas Wang and Trishala Neeraj and Jos Rozen and Abheesht Sharma and Andrea Santilli and Thibault Fevry and Jason Alan Fries and Ryan Teehan and Stella Biderman and Leo Gao and Tali Bers and Thomas Wolf and Alexander M. Rush},
year={2021},
eprint={2110.08207},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` | [
"COREFERENCE_RESOLUTION",
"TEXTUAL_ENTAILMENT",
"SUMMARIZATION"
] | [
"SCIQ"
] |
mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF | mradermacher | null | [
"transformers",
"gguf",
"code",
"text-generation-inference",
"Information Extraction",
"IE",
"Named Entity Recogniton",
"Event Extraction",
"Relation Extraction",
"LLaMA",
"en",
"dataset:ACE05",
"dataset:bc5cdr",
"dataset:conll2003",
"dataset:ncbi_disease",
"dataset:conll2012_ontonotesv5",
"dataset:rams",
"dataset:tacred",
"dataset:wnut_17",
"base_model:KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors",
"base_model:quantized:KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors",
"license:llama2",
"endpoints_compatible",
"region:us"
] | 2025-02-26T16:08:44 | 2025-03-01T18:00:24 | 786 | 0 | ---
base_model: KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors
datasets:
- ACE05
- bc5cdr
- conll2003
- ncbi_disease
- conll2012_ontonotesv5
- rams
- tacred
- wnut_17
language:
- en
library_name: transformers
license: llama2
tags:
- code
- text-generation-inference
- Information Extraction
- IE
- Named Entity Recognition
- Event Extraction
- Relation Extraction
- LLaMA
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/KaraKaraWitch/HiTZ-GoLLIE-13B-AsSafeTensors
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
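As a minimal, non-authoritative sketch, one of the quantized files from the table below can be loaded with the `llama-cpp-python` bindings; the file name matches the Q4_K_M entry, and the prompt is only a placeholder (GoLLIE itself expects code-style annotation-guideline prompts, so consult the original GoLLIE repository for the proper format):
```python
from llama_cpp import Llama

# Point model_path at a GGUF file downloaded from this repository (Q4_K_M quant as an example)
llm = Llama(model_path="HiTZ-GoLLIE-13B-AsSafeTensors.Q4_K_M.gguf", n_ctx=2048)

# Placeholder prompt; replace with the code-style GoLLIE prompt format for real use
output = llm("Extract the disease mentions from: 'The patient was diagnosed with type 2 diabetes.'",
             max_tokens=128)
print(output["choices"][0]["text"])
```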
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/HiTZ-GoLLIE-13B-AsSafeTensors-GGUF/resolve/main/HiTZ-GoLLIE-13B-AsSafeTensors.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
| [
"RELATION_EXTRACTION",
"EVENT_EXTRACTION"
] | [
"BC5CDR",
"NCBI DISEASE"
] |
scholarly360/setfit-contracts-clauses | scholarly360 | text-classification | [
"setfit",
"safetensors",
"bert",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:sentence-transformers/all-MiniLM-L6-v2",
"base_model:finetune:sentence-transformers/all-MiniLM-L6-v2",
"model-index",
"region:us"
] | 2024-05-11T07:47:56 | 2024-05-11T07:48:02 | 782 | 6 | ---
base_model: sentence-transformers/all-MiniLM-L6-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: No authorization or approval or other action by, and no notice to or filing
with, any governmental authority or regulatory body is required for the due execution
and delivery by the Servicer of this Agreement and each other Transaction Document
to which it is a party and the performance of its obligations hereunder and thereunder
in its capacity as Servicer.
- text: All rights and remedies of Collateral Agent shall be cumulative and may be
exercised singularly or concurrently, at their option, and the exercise or enforcement
of any one such right or remedy shall not bar or be a condition to the exercise
or enforcement of any other.
- text: Except for the conveyances hereunder, Seller will not sell, pledge, assign
or transfer to any other Person, or grant, create, incur, assume or suffer to
exist any Lien on the Receivables or the Other Conveyed Property or any interest
therein, and Seller shall defend the right, title, and interest of Purchaser and
the Issuer in and to the Receivables and the Other Conveyed Property against all
claims of third parties claiming through or under Seller.
- text: In the event of a Change in Control, the Eligible Employee shall immediately
be fully vested in his or her benefit under the Plan.
- text: If Participant’s Employment terminates under circumstances described in Section 3(a)
, then upon Participant’s subsequent death, all unpaid amounts payable to Participant
under Section 3(a)(i) , (ii) , (iii) or (vi) , if any, shall be paid to Participant’s
Beneficiary.
inference: true
model-index:
- name: SetFit with sentence-transformers/all-MiniLM-L6-v2
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.9425
name: Accuracy
---
# SetFit with sentence-transformers/all-MiniLM-L6-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
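A minimal inference sketch (assuming the `setfit` library is installed; the input sentence is taken from the widget examples above):
```python
from setfit import SetFitModel

# Download the fine-tuned model from the Hugging Face Hub
model = SetFitModel.from_pretrained("scholarly360/setfit-contracts-clauses")

# Predict the contract-clause label for a sentence
preds = model.predict([
    "In the event of a Change in Control, the Eligible Employee shall immediately be fully vested in his or her benefit under the Plan."
])
print(preds)
```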
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 256 tokens
- **Number of Classes:** 100 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:-----------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| governing laws | <ul><li>'The validity, interpretation, construction and performance of this Agreement will be governed by and construed in accordance with the substantive laws of the State of Delaware, without giving effect to the principles of conflict of laws of such State.'</li><li>'This Agreement shall be governed by and construed and enforced in accordance with the laws of the State of California.'</li><li>'This Agreement shall be construed and enforced in accordance with, and the rights of the parties shall be governed by, the laws of the State of Minnesota, except to the extent that the perfection of the security interest hereunder, or the enforcement of any remedies hereunder, with respect to any particular Collateral shall be governed by the laws of a jurisdiction other than the State of Minnesota.'</li></ul> |
| counterparts | <ul><li>'This Agreement may be executed in one or more counterparts, each of which will be deemed to be an original but all of which together will constitute one and the same agreement.'</li><li>'This Assignment may be executed in two or more counterparts, any one of which need not contain the signatures of more than one party, but all such counterparts taken together shall constitute one and the same Assignment. Receipt by telecopy, pdf file or other electronic means of any executed signature page to this Assignment shall constitute effective delivery of such signature page.'</li><li>'This Agreement may be executed in counterparts and by separate parties in separate counterparts, each of which shall be an original and all of which taken together shall constitute one and the same document. Receipt by telecopy, pdf file or other electronic means of any executed signature page to this Agreement shall constitute effective delivery of such signature page.'</li></ul> |
| notices | <ul><li>'All notices under this Agreement must be given in writing by personal delivery or United States registered or certified mail, return receipt requested, at the addresses indicated in this Agreement, or any other address designated in writing by either party.'</li><li>'Promptly upon its receipt of any notice, request for consent, financial statements, certification, report or other communication under or in connection with any Transaction Document from any Person other than the Administrative Agent or any Managing Agent, copies of the same.'</li><li>'The provisions of Section 6.01 of the Collateral Agreement shall apply mutatis mutandis in respect of any certificate, notice, demand or other communication given or made under this Deed.'</li></ul> |
| entire agreements | <ul><li>'Unless specifically provided herein, this Agreement contains all the understandings and representations between the Executive and the Company pertaining to the Termination of Employment and supersedes all prior and contemporaneous understandings, agreements, representations and warranties, both written and oral, with respect to such subject matter.'</li><li>'This Agreement contains the entire agreement between the parties with respect to the subject matter hereof and supersedes all prior or contemporaneous negotiations, correspondence, understandings and agreements between the parties with respect thereto. This Agreement may be amended only by an agreement in writing signed by both parties hereto.'</li><li>'This Note constitutes the full and entire agreement of the Borrower and the Holder with respect to the subject matter hereof.'</li></ul> |
| severability | <ul><li>'The invalidity or unenforceability in particular circumstances of any provision of this Note shall not extend beyond such provision or such circumstances and no other provision of this instrument shall be affected thereby.'</li><li>'Wherever possible, each provision of this Agreement shall be interpreted in such manner as to be effective and valid under applicable law, but if any provision of this Agreement shall be prohibited by or invalid under such law, such provision shall be ineffective to the extent of such prohibition or invalidity, without invalidating the remainder of such provision or the remaining provisions of this Agreement.'</li><li>'In case any provision of this Guaranty shall be invalid, illegal or unenforceable in any jurisdiction, the validity, legality and enforceability of the remaining provisions shall not in any way be affected or impaired thereby.'</li></ul> |
| waivers | <ul><li>'That Defaulting Lender’s right to approve or disapprove any amendment, waiver or consent with respect to this Agreement shall be restricted as set forth in Section\xa010.5 and the definition of “Requisite Lenders”.'</li><li>'The provisions of this Agreement, or any other Loan Document, may from time to time be amended, modified or waived, if such amendment, modification or waiver is in writing and consented to by the Borrower and both Lenders.'</li><li>'Collateral Agent shall not be deemed to have waived any of its rights hereunder or under any other agreement, instrument or paper signed by Grantor unless such waiver is in writing and signed by Collateral Agent. No delay or omission on the part of Collateral Agent in exercising any right shall operate as a waiver of such right or any other right. A waiver on any one occasion shall not be construed as a bar to or waiver of any right or remedy on any future occasion.'</li></ul> |
| amendments | <ul><li>'That Defaulting Lender’s right to approve or disapprove any amendment, waiver or consent with respect to this Agreement shall be restricted as set forth in Section\xa010.5 and the definition of “Requisite Lenders”.'</li><li>'This Agreement contains the entire agreement between the parties with respect to the subject matter hereof and supersedes all prior or contemporaneous negotiations, correspondence, understandings and agreements between the parties with respect thereto. This Agreement may be amended only by an agreement in writing signed by both parties hereto.'</li><li>'The provisions of this Agreement, or any other Loan Document, may from time to time be amended, modified or waived, if such amendment, modification or waiver is in writing and consented to by the Borrower and both Lenders.'</li></ul> |
| expenses | <ul><li>'The Company shall reimburse Executive for all reasonable and necessary expenses incurred by him in connection with his employment and in accordance with the Company policy, which requires reasonable evidence of expenditure.'</li><li>'Grantor agrees to pay the reasonable attorneys’ fees and legal expenses incurred by Collateral Agent in the exercise of any right or remedy available to it under this Agreement, whether or not suit is commenced, including, without limitation, attorneys’ fees and legal expenses incurred in connection with any appeal of a lower court’s order or judgment.'</li><li>'Except as otherwise provided herein, each Party shall bear and pay all costs and expenses which it incurs, or which may be incurred on its behalf, in connection with this TSA and the transactions contemplated hereby. Unless otherwise indicated, all dollar amounts stated in this TSA are stated in U.S. currency, and all payments required under this TSA shall be paid in U.S. currency in immediately available funds.'</li></ul> |
| survival | <ul><li>'Notwithstanding any provision of this Agreement to the contrary, Sections 1, 2, 3, 6, 7, 9, 10, 13, 15, 16 and 17 will survive any termination or expiration of this Agreement or the termination of the Executive’s employment for any reason whatsoever.'</li><li>'Each party’s obligations under this Section shall survive the resignation or replacement of the Agent or any assignment of rights by, or the replacement of, a Lender, the termination of the Commitments and the repayment, satisfaction or discharge of all obligations under any Loan Document.'</li><li>'The provisions of Sections 2.05(b) , (c) and (d) , Section 4.05 and Articles V , VI , VII and VIII shall survive the termination of this TSA.'</li></ul> |
| representations | <ul><li>'Each Guarantor hereby makes to the Administrative Agent and the other Guarantied Parties all of the representations and warranties made by the Borrower with respect to or in any way relating to such Guarantor in the Loan Agreement and the other Loan Documents, as if the same were set forth herein in full.'</li><li>'The Seller has determined that this Agreement is effective to transfer to the Administrative Agent, the Managing Agents and the Purchasers, as assignees of the Seller, the full benefit of and a direct claim against LKQ, as Servicer, and each Originator in respect of each representation or warranty made by LKQ, as Servicer, and each Originator under any Transaction Document.'</li><li>'All representations and warranties made hereunder, in the other Loan Documents and in any document, certificate or statement delivered pursuant hereto or in connection herewith shall survive the execution and delivery of this Agreement and the making of the Loans and other extensions of credit hereunder.'</li></ul> |
| assigns | <ul><li>'This Agreement shall be binding upon and shall inure to the benefit of the parties hereto and their respective successors and assigns, except that the Borrower may not assign or transfer its rights hereunder without the prior written consent of both Lenders.'</li><li>'This Agreement shall be binding upon and inure to the benefit of the successors and assigns of Grantor and Collateral Agent.'</li><li>'This Agreement shall be binding upon the First Lien Agents, the Senior Secured Parties, the Second Priority Agents, the Second Priority Secured Parties and their respective permitted successors and assigns.'</li></ul> |
| taxes | <ul><li>'In addition, the Credit Parties shall pay all Other Taxes to the relevant Governmental Authorities in accordance with applicable Law. The Credit Parties shall deliver to Administrative Agent official receipts or other evidence of such payment reasonably satisfactory to Administrative Agent in respect of any Other Taxes payable hereunder promptly after payment of such Other Taxes.'</li><li>'The Borrower and the other Loan Parties shall timely pay to the relevant Governmental Authority in accordance with Applicable Law, or at the option of the Agent timely reimburse it for the payment of, any Other Taxes.'</li><li>'The Key Person shall be responsible for taxes due upon the settlement of any RSU granted hereunder and upon any later transfer by the Key Person of any Share received upon the settlement of an RSU.'</li></ul> |
| litigations | <ul><li>'The Borrower or any other Loan Party shall (or shall attempt to) disavow, revoke or terminate any Loan Document to which it is a party or shall otherwise challenge or contest in any action, suit or proceeding in any court or before any Governmental Authority the validity or enforceability of any Loan Document, or any Loan Document shall cease to be in full force and effect (except as a result of the express terms thereof).'</li><li>'Other than those matters disclosed on Schedule 5.9 , (a) there are no actions, suits, or proceedings pending or, to the best knowledge of Borrower, threatened, against Borrower or any of its Subsidiaries, and (b)\xa0there are no actions, suits, or proceedings pending or, to the best knowledge of Borrower, threatened, against HTGC that could reasonably be expected to result in a Material Adverse Change.'</li><li>'There is no litigation, claim, investigation, challenge or other proceeding pending or, to the knowledge of Management Company, threatened against Management Company, its properties or business which seeks to enjoin or prohibit it from entering into this Agreement.'</li></ul> |
| insurances | <ul><li>'The Seller will maintain in effect, or cause to be maintained in effect, at the Seller’s own expense, such casualty and liability insurance as the Seller shall deem appropriate in its good faith business judgment.'</li><li>'With respect to the provision of Transition Services under this TSA, Service Provider shall maintain such insurance coverage and in such amounts covering itself and its Affiliates as is commercially reasonable. Upon the reasonable request of Service Recipient, Service Provider shall provide Service Recipient with such information as it shall reasonably request relating to any insurance coverage relevant to a Transition Service provided under this TSA.'</li><li>'Notwithstanding anything contained in this Agreement to the contrary, Losses shall be net of any insurance recoveries actually received by the Indemnified Party or its Affiliates.'</li></ul> |
| confidentiality | <ul><li>'Each party agrees that it and its Affiliates, and its and their respective employees, advisors, agents and representatives, including, with respect to the Company, any third parties engaged to provide the Services pursuant to Section\xa02(c) , shall keep confidential all data, documents, records and information obtained from the other party or its representatives in connection with this Agreement in accordance with Section\xa04.1 of the Purchase Agreement.'</li><li>'In the event of the consummation or public announcement of the Public Offering, Wainwright shall have the right to disclose its participation in such Public Offering, including, without limitation, the Public Offering at its cost of “tombstone” advertisements in financial and other newspapers and journals.'</li><li>'Except as requested by the Company, CEI or the other Released Parties, as permitted above or by law that may supersede the terms of this Agreement, or as compelled by valid legal process, the Individual shall treat as confidential the fact and terms of this Agreement and shall not disclose such information to any party other than his spouse, attorney, and accountant or tax advisor, if such persons have agreed to keep such information confidential.'</li></ul> |
| waiver of jury trials | <ul><li>'Each of the parties hereto irrevocably waives trial by jury in any action or proceeding with respect to this Amendment or any other Credit Document.'</li><li>'GRANTOR HEREBY EXPRESSLY WAIVE(S) ANY RIGHT TO A TRIAL BY JURY IN ANY ACTION OR PROCEEDING TO ENFORCE OR DEFEND ANY RIGHTS (a) UNDER THIS AGREEMENT OR UNDER ANY AMENDMENT, INSTRUMENT, DOCUMENT OR AGREEMENT DELIVERED OR WHICH MAY IN THE FUTURE BE DELIVERED IN CONNECTION HEREWITH, OR (b) ARISING FROM ANY RELATIONSHIP EXISTING IN CONNECTION WITH THIS AGREEMENT, AND AGREE(S) THAT ANY SUCH ACTION OR PROCEEDING SHALL BE TRIED BEFORE A COURT AND NOT BEFORE A JURY.'</li><li>'EACH PARTY HERETO HEREBY IRREVOCABLY AND UNCONDITIONALLY WAIVES TRIAL BY JURY IN ANY LEGAL ACTION OR PROCEEDING RELATING TO THIS AGREEMENT AND FOR ANY COUNTERCLAIM THEREIN.'</li></ul> |
| terminations | <ul><li>'This Guaranty shall remain in full force and effect with respect to each Guarantor until (i) termination of the Loan Agreement in accordance with Section\xa012.10. thereof or (ii)\xa0following the release of a Guarantor or Guarantors in accordance with Section 7.12.(b) of the Loan Agreement, no Person is a Guarantor; provided that the provisions of Section\xa09 of this Guaranty shall continue in full force and effect after such termination.'</li><li>'Subject to the terms and conditions set forth herein, the Shareholders’ Agreement, and the rights and obligations of the parties thereunder, is hereby terminated, effective immediately, and shall be null and void and no longer of any force or effect; provided , however , that Section 9(j) and Section 9(k) of the Shareholders’ Agreement shall survive the termination of the Shareholders’ Agreement indefinitely.'</li><li>'The Employee’s employment may be terminated during the Employment Period at any time by the Employee or the Company for any reason.'</li></ul> |
| further assurances | <ul><li>'Each of Tricadia and Tiptree shall, and shall cause their respective Affiliates to, use good faith efforts to cooperate with each other in all matters relating to the provision and receipt of the Transition Services. Such cooperation shall include exchanging information, performing true-ups and adjustments and seeking all third party consents, licenses, sublicenses or approvals necessary to permit each party to perform its obligations hereunder.'</li><li>'Where the Vessel is (or is to be) sold in exercise of any power contained in this Deed or otherwise conferred on the Collateral Agent, the Owner undertakes to execute, forthwith upon request by the Collateral Agent, such form of conveyance of the Vessel as the Collateral Agent may require.'</li><li>'The Owner hereby further undertakes at its own expense from time to time to execute, sign, perfect, do and (if required) register every such further assurance, document, act or thing as in the opinion of the Collateral Agent may be reasonably necessary or desirable for the purpose of more effectually mortgaging and charging the Mortgaged Property or perfecting the security constituted or intended to be constituted by the Mortgage and this Deed.'</li></ul> |
| general | <ul><li>'Headings contained herein are inserted for convenience of reference only and are not to be considered for the purposes of interpretation. All monetary references are to U.S. Dollars. If anything herein falls to be done on a day which is not a Business Day, the same shall be done on the next succeeding Business Day.'</li><li>'The Customer Support Services will be provided by the following types of Customer Support Agents: [***]. Bank will provide agents for future, mutually agreed upon and approved channels.'</li><li>'Including products, completed operations liability and personal injury, contractual liability and broad form property damage liability coverage for damages to any property with a minimum combined single limit of [***] per occurrence and [***] general aggregate per location for bodily injury, death, property damage and personal injury.'</li></ul> |
| terms | <ul><li>'Subject to the severance provisions of Section 5 below, Executive’s employment with the Company shall initially be for a term of two years ending July 31, 2020 (“Termination Date”) and shall thereafter automatically renew for one-year terms unless either party terminates the Agreement with 90 days prior written notice of termination before the end of the then current term.'</li><li>'All capitalized terms used but not defined in this Amendment shall have the same meaning as prescribed in the Original Agreement.'</li><li>'The terms of the Plan are incorporated herein by reference and the Key Person’s rights hereunder are subject to the terms of the Plan to the extent they are inconsistent with or in addition to the terms set forth herein. The Key Person hereby agrees to comply with all requirements of the Plan.'</li></ul> |
| assignments | <ul><li>'No party shall assign this Agreement or any of its rights or obligations hereunder without the prior written consent of the other parties hereto, except that Tiptree and Tiptree Parent may assign their respective rights to any other Person that is a direct or indirect subsidiary of Tiptree Parent; provided , that, Tiptree and Tiptree Parent will continue to be bound by their respective obligations hereunder.'</li><li>'This Agreement is binding upon, and shall inure to the benefit of, the parties and their respective heirs, executors, administrators, successors and assigns.'</li><li>'Except as otherwise provided in this Agreement, the Grantee may not assign any of his, her or its rights under this Agreement without the prior written consent of the Company, which consent may be withheld in its sole discretion. The Company shall be permitted to assign its rights or obligations under this Agreement so long as such assignee agrees to perform all of the Company’s obligations hereunder.'</li></ul> |
| authority | <ul><li>'The execution and delivery by the Servicer of this Agreement and each other Transaction Document to which it is a party, and the performance of its obligations hereunder and thereunder are within its corporate powers and authority and have been duly authorized by all necessary corporate action on its part. This Agreement and each other Transaction Document to which the Servicer is a party has been duly executed and delivered by the Servicer.'</li><li>'Investor is an entity duly organized, validly existing and in good standing under the laws of the jurisdiction of its organization, with the requisite power and authority to enter into and to consummate the transactions contemplated by this Agreement and otherwise to carry out its obligations hereunder and thereunder.'</li><li>'Purchaser has the power, authority and legal right to execute and deliver this Agreement and to carry out the terms hereof and to acquire the Receivables and the Other Conveyed Property hereunder; and the execution, delivery and performance of this Agreement and all of the documents required pursuant hereto have been duly authorized by Purchaser by all necessary corporate action.'</li></ul> |
| use of proceeds | <ul><li>'No proceeds of any purchase hereunder will be used (i) for a purpose that violates, or would be inconsistent with, Regulation T, U or X promulgated by the Board of Governors of the Federal Reserve System from time to time or (ii) to acquire any security in any transaction which is subject to Section 12, 13 or 14 of the Securities Exchange Act of 1934, as amended.'</li><li>'The Borrower will use the proceeds of the Delayed Draw Term Loans for general corporate purposes, including, without limitation, to finance the pre-delivery installments due to builder(s) under its or its Subsidiaries’ shipbuilding contracts.'</li><li>'The proceeds of the Loans shall be used to finance the working capital needs of the Company and its Subsidiaries and for general corporate or entity purposes, including to enable the Company to make valuable transfers to any of its Subsidiaries in connection with the operation of their respective businesses.'</li></ul> |
| payments | <ul><li>'All sums payable by any Credit Party hereunder and under the other Credit Documents shall (except to the extent required by Law) be paid free and clear of, and without any deduction or withholding on account of, any Taxes.'</li><li>'Borrower may voluntarily prepay the loan evidenced by this Note in whole or in part at any time; without premium or penalty.'</li><li>'Each voluntary prepayment of Loans shall be in an aggregate minimum amount of $1,000,000.00 and integral multiples of $100,000.00 in excess thereof (or, if less, the aggregate principal amount of Loans then outstanding).'</li></ul> |
| compliance with laws | <ul><li>'Grantor will not use the Collateral, or knowingly permit the Collateral to be used, for any unlawful purpose or in violation of any federal, state or municipal law.'</li><li>'Comply with the requirements of all applicable laws, rules, regulations, and orders of any Governmental Authority, other than laws, rules, regulations, and orders the non-compliance with which, individually or in the aggregate, could not reasonably be expected to result in a Material Adverse Change.'</li><li>'No Credit Party shall, and no Credit Party shall permit any of its Subsidiaries to, fail to (a) comply in all material respects with the requirements of all applicable laws, rules, regulations and orders of any Governmental Authority (including, without limitation, all Environmental Laws and the Requirements) and (b) preserve and maintain in full force and effect all material rights, privileges, qualifications, permits, licenses and franchises necessary in the normal conduct of its business.'</li></ul> |
| no conflicts | <ul><li>'Upon issuance of the Shares, the Company will have insufficient authorized shares of Common Stock necessary to reserve for the issuance of the Warrant Shares (other than shares issuable upon exercise of the Series C Warrants), and to issue shares of Common Stock issuable upon exercise and/or issuance of certain issued and outstanding derivative securities of the Company.'</li><li>'Executive represents and warrants that the performance by Executive of the duties that are reasonably expected to be performed hereunder will not result in a material breach of any agreement to which Executive is a party.'</li><li>'Executive hereby represents that, to the best of his knowledge, his performance of all the terms of this Agreement and his work as an employee or consultant of the Company does not breach any oral or written agreement which he has made prior to his employment with the Company.'</li></ul> |
| indemnifications | <ul><li>'The Company shall indemnify and hold Employee harmless, to the maximum extent permitted by law, against all liability, expense or loss (including reasonable attorneys’ fees and penalties) incurred by Employee by reason of the fact that Employee is an officer of the Company acting within the scope of Employee’s duties and authorities.'</li><li>'The Company hereby agrees to indemnify Employee and hold him harmless to the extent provided under the by-laws of the Company against and in respect to any and all actions, suits, proceedings, claims, demands, judgments, costs, expenses (including reasonable attorney’s fees), losses, and damages resulting from Employee’s good faith performance of his duties and obligations with the Company. This obligation shall survive the termination of Employee’s employment with the Company.'</li><li>'The Company agrees to defend and indemnify and hold the Employee harmless from and against any past, present or future claim, action, demand, loss, cost, expense, liability or other damage arising from, and including reasonable attorney’s fees and costs, amounts, expenses, incurred by or imposed against the Employee and arising out of or relating to any past, present or future claim, action, demand, loss, cost, expense, liability or other damage due to Employee’s employment hereunder.'</li></ul> |
| organizations | <ul><li>'The Buyer is a limited liability company duly organized and validly existing in good standing under the laws of the jurisdiction in which it is organized, and has the requisite organizational power and authority to own its properties and to carry on its business as now being conducted.'</li><li>'Investor is an entity duly organized, validly existing and in good standing under the laws of the jurisdiction of its organization, with the requisite power and authority to enter into and to consummate the transactions contemplated by this Agreement and otherwise to carry out its obligations hereunder and thereunder.'</li><li>'Seller has been duly organized and is validly existing as a corporation in good standing under the laws of the State of Delaware, with power and authority to own its properties and to conduct its business as such properties are currently owned and such business is currently conducted, and had at all relevant times, and now has, power, authority and legal right to acquire, own and sell the Receivables and the Other Conveyed Property to be transferred to Purchaser.'</li></ul> |
| base salary | <ul><li>'Commencing on the Agreement Effective Date and thereafter during his Employment Period, the Employee shall receive an annual base salary of $273,000 (as such salary may be increased from time to time , the “Annual Base Salary”), which shall be paid no less frequently than on a semimonthly basis.'</li><li>'Commencing on the Agreement Effective Date and thereafter during his Employment Period, the Employee shall receive an annual base salary of $________ (as such salary may be increased from time to time , the “Annual Base Salary”), which shall be paid no less frequently than on a semimonthly basis.'</li><li>'During the Term, the Executive’s annual base salary rate shall be $455,000. The Executive’s base salary shall be reviewed annually by the Board or the Compensation Committee of the Board (the “Compensation Committee”). The base salary in effect at any given time is referred to herein as “Base Salary.” The Base Salary shall be payable in a manner that is consistent with the Company’s usual payroll practices for executive officers.'</li></ul> |
| binding effects | <ul><li>'The execution and delivery of this Amendment by any Lender shall be binding upon each of its successors and assigns (including assignees of its Loans in whole or in part prior to the effectiveness hereof).'</li><li>'This Agreement shall be binding upon and inure to the benefit of the parties hereto and their respective successors and permitted assigns.'</li><li>'This Agreement shall be binding upon and shall inure to the benefit of the Company, its successors and assigns, and the Key Person and the Key Person’s executors, administrators, personal representatives and heirs. In the event that any part of this Agreement shall be held to be invalid or unenforceable, the remaining parts hereof shall nevertheless continue to be valid and enforceable as though the invalid portions were not a part hereof.'</li></ul> |
| headings | <ul><li>'Section and Subsection headings in this Amendment are included herein for convenience of reference only and shall not constitute a part of this Amendment for any other purpose or be given any substantive effect.'</li><li>'Section headings used in this Guaranty are for convenience only and shall not affect the construction of this Guaranty.'</li><li>'Section headings have been inserted herein for convenience only and shall not be construed to be a part hereof.'</li></ul> |
| costs | <ul><li>'The Borrowers shall pay to the Administrative Agent all reasonable costs and out-of-pocket expenses of every kind in connection with the preparation, negotiation, execution and delivery of this Amendment and any documents and instruments relating hereto or thereto, including, without limitation, any fees that have been invoiced prior to the date hereof (which fees include, without limitation, the reasonable and documented fees and expenses of any attorneys retained by the Administrative Agent).'</li><li>'Borrower hereby affirms its obligation under the Loan Agreement to reimburse the Agent for all Lender Group Expenses paid or incurred by the Agent in connection with the preparation, negotiation, execution and delivery of this Amendment, including but not limited to the attorneys’ fees and expenses of attorneys for the Agent with respect thereto.'</li><li>'Janssen will be solely responsible for conducting, at its sole cost and expense, Development of each Janssen Research IRD Product, except that Janssen will use Commercially Reasonable Efforts to Develop [***].'</li></ul> |
| definitions | <ul><li>'Capitalized terms used herein and not otherwise defined herein shall have their respective defined meanings given them in the Loan Agreement.'</li><li>'Terms not otherwise defined herein are used herein with the respective meanings given them in the Credit Agreement.'</li><li>'In this Agreement unless there is something in the subject matter or context inconsistent therewith, the words and expressions set out in Schedule\xa0”A” shall have the meanings set out in such Schedule\xa0”A” .'</li></ul> |
| modifications | <ul><li>'This Agreement may be amended, modified, or supplemented only by written agreement of the Parties.'</li><li>'This Assignment may be amended, modified, or supplemented only by written agreement of the Parties.'</li><li>'This Agreement, together with the exhibits and schedules hereto, is the entire agreement between the parties hereto with respect to the subject matter hereof, and supersedes all prior and contemporaneous communications, agreements and understandings with respect to the subject matter hereof, express or implied, oral or written, all of which are merged herein.\xa0\xa0In the event of a conflict between this Agreement and the Management Agreement, the Management Agreement shall control.'</li></ul> |
| remedies | <ul><li>'Executive acknowledges and understands that the provisions of this Agreement are of a special and unique nature, the loss of which cannot be adequately compensated for in damages by an action at law, and that the breach or threatened breach of the provisions of this Agreement would cause the Company irreparable harm. In the event of a breach or threatened breach by Executive of the provisions of this Agreement, the Company shall be entitled to an injunction restraining him from such breach.'</li><li>'All rights and remedies of Collateral Agent shall be cumulative and may be exercised singularly or concurrently, at their option, and the exercise or enforcement of any one such right or remedy shall not bar or be a condition to the exercise or enforcement of any other.'</li><li>'No delay or failure on the part of the Administrative Agent or any other Guarantied Party in the exercise of any right or remedy it may have against any Guarantor hereunder or otherwise shall operate as a waiver thereof, and no single or partial exercise by the Administrative Agent or any other Guarantied Party of any such right or remedy shall preclude any other or further exercise thereof or the exercise of any other such right or remedy.'</li></ul> |
| releases | <ul><li>'Neither Founder shall issue any press release or public announcement concerning this Agreement or the Company without obtaining the prior written consent of the other Founder hereto, which consent shall not be unreasonably withheld, except as may be required by applicable securities laws, in which case, the publishing Founder shall use reasonable commercial efforts to send the draft public announcement to the other Founder prior to publication thereof.'</li><li>'Players Network will send out a public communication as required by law to its shareholder and 8k filing pertaining to this agreement.'</li><li>'This Agreement and the security interests granted hereby shall terminate in accordance with the Indenture and each Intercreditor Agreement (if any).'</li></ul> |
| disclosures | <ul><li>'Nothing contained in this Agreement limits the Executive’s ability to communicate with any federal, state or local governmental agency or commission, including to provide documents or other information, without notice to the Company.'</li><li>'The Recipient may disclose the Discloser’s Confidential Information to the extent required by law or regulation; provided , that prior to making any such legally required disclosure, the Recipient shall give the Discloser as much prior notice of the requirement for and contents of such disclosure as is practicable under the circumstances. Any such disclosure, however, shall not relieve the Recipient of its obligations contained herein.'</li><li>'No event has occurred since the date of the most recently delivered audited financial statements, and no fact or condition exists, which has had a Material Adverse Effect or which could reasonably be expected to have a Material Adverse Effect.'</li></ul> |
| participations | <ul><li>'The CEO and any Executive who receive a Participation Agreement will be eligible to participate in the Plan effective as of the date of such Participation Agreement. The terms and conditions of the severance benefit potentially payable to a Participant will be subject to the Participation Agreement delivered to the Participant and to the Plan. In the event of an explicit discrepancy between a Participation Agreement and the Plan, the Participation Agreement will control.'</li><li>'An employee shall become a Participant as of the first day of the calendar month coincident with or next following the date he or she first becomes an Eligible Executive Officer (the “Entry Date”), provided that he or she remains a member of the select group of officers for whom this Plan is designed through his or her Entry Date.'</li><li>'An Eligible Employee becomes a Participant upon the earlier to occur of: (a) a credit of Company Contributions under Article V, or (b) receipt of notification of eligibility to participate.'</li></ul> |
| vesting | <ul><li>'All Company matching contributions under Section 2.5(a) and Company additional discretionary contributions under Section 2.5(b) are 100% vested.'</li><li>'A Participant’s Account Balance attributable to QACA Safe Harbor Contributions is one hundred percent (100%) vested after two (2) years. Participants will become fully vested upon their Death or Disability as defined herein. If the Plan already defines Year of Service for purposes of vesting, then that definition applies to this QACA vesting schedule.'</li><li>'The Restricted Shares shall not become fully vested until the Key Employee has continued his/her employment with the Bank for a period of five (5) years from the effective date of this Agreement. For this purpose, the effective date of this Agreement will be _____, 2019, and the date the Restricted Shares shall become fully vested shall be _____, 2027.'</li></ul> |
| no waivers | <ul><li>'Collateral Agent shall not be deemed to have waived any of its rights hereunder or under any other agreement, instrument or paper signed by Grantor unless such waiver is in writing and signed by Collateral Agent. No delay or omission on the part of Collateral Agent in exercising any right shall operate as a waiver of such right or any other right. A waiver on any one occasion shall not be construed as a bar to or waiver of any right or remedy on any future occasion.'</li><li>'No delay or omission by either party in exercising any right under this Agreement shall operate as a waiver of that or any other right. A waiver or consent given by a party on any one occasion shall be effective only in that instance and shall not be construed as a bar or waiver of any right on any other occasion.'</li><li>'No failure or delay by a Founder in exercising any right or remedy under this Agreement shall operate as a waiver thereof, nor shall any single or partial exercise thereof preclude any other or further exercise thereof or the exercise of any other right or remedy.'</li></ul> |
| withholdings | <ul><li>'The Company may withhold from any amounts payable under this Agreement all federal, state, city or other taxes as the Company is required to withhold pursuant to any applicable law, regulation or ruling.'</li><li>'All Deferrals and distributions shall be subject to legally required income and employment tax withholding. Such taxes shall include, but not necessarily be limited to, Social Security taxes on Deferrals, Matching Contributions, Company Profit Sharing Contributions and/or Other Contributions at the time they are vested and income taxes on distributions.'</li><li>'The Company shall have the right to deduct from any payment hereunder all taxes (federal, state or other) which it is required to withhold therefrom.'</li></ul> |
| miscellaneous | <ul><li>'All section headings are for convenience only. This Agreement may be executed in several counterparts, each of which is an original. It shall not be necessary in making proof of this Agreement or any counterpart hereof to produce or account for any of the other counterparts.'</li><li>'This Agreement may be executed in two or more counterparts (including via facsimile), each of which shall be deemed an original. but all of which together shall constitute one and the same instrument. The section headings contained in this Agreement are for reference purposes only and shall not affect in any way the meaning or interpretation of this Agreement.'</li><li>'Authority of the Representative. Any action by the Initial Purchasers hereunder may be taken by J.P. Morgan Securities LLC on behalf of the Initial Purchasers, and any such action taken by J.P. Morgan Securities LLC shall be binding upon the Initial Purchasers.'</li></ul> |
| jurisdictions | <ul><li>'This Agreement shall be construed in accordance with and governed by the law of the State of New York.'</li><li>'The provisions set forth in Sections 9.09 and 9.10 of the Credit Agreement are hereby incorporated mutatis mutandis with all references to the “Agreement” therein being deemed references to this Agreement.'</li><li>'(a) THIS AGREEMENT SHALL BE GOVERNED BY AND CONSTRUED IN ACCORDANCE WITH, THE LAWS OF THE STATE OF NEW YORK, WITHOUT REGARD TO PRINCIPLES OF CONFLICTS OF LAW (OTHER THAN SECTIONS 5-1401 AND 5-1402 OF THE NEW YORK GENERAL OBLIGATIONS LAW), EXCEPT TO THE EXTENT THAT LOCAL LAW GOVERNS THE CREATION, PERFECTION, PRIORITY OR ENFORCEMENT OF SECURITY INTERESTS.'</li></ul> |
| closings | <ul><li>'Subject to the terms and conditions of this Agreement, the closing of the transactions described herein (the “ Closing ”) is taking place simultaneously with the execution and delivery of this Agreement by the parties at 780 Third Avenue, New York, New York 10017 (the date the Closing takes place, the “ Closing Date ”).'</li><li>'Subject to the terms and conditions of this Agreement, and unless otherwise agreed in writing by the Parties, the closing of the Transactions shall occur at 11:59 p.m. (Dallas, Texas time) on the date hereof (the “ Effective Time ”).'</li><li>'The closing of the transactions contemplated by this Agreement (the “Closing”) shall occur on the Closing Date at such location as may be agreed to by the parties (including via exchange of electronic signatures).'</li></ul> |
| integration | <ul><li>'The Company shall not sell, offer for sale or solicit offers to buy or otherwise negotiate in respect of any security (as defined in Section 2 of the Securities Act) that would be integrated with the offer or sale of the Securities for purposes of the rules and regulations of any Trading Market such that it would require shareholder approval prior to the closing of such other transaction unless shareholder approval is obtained before the closing of such subsequent transaction.'</li><li>'This Agreement and the other Loan Documents represent the entire agreement of the Company, the Administrative Agent and the Lenders with respect to the subject matter hereof and thereof, and there are no promises, undertakings, representations or warranties by the Administrative Agent or any Lender relative to the subject matter hereof not expressly set forth or referred to herein or in the other Loan Documents.'</li><li>'Except as specifically stated otherwise herein, this Agreement and Related Documents set forth the entire understanding of the parties relating to the subject matter hereof, and all prior understandings, written or oral, are superseded by this Agreement and the Related Documents. This Agreement may not be modified, amended, waived or supplemented except as provided herein.'</li></ul> |
| fees | <ul><li>'That Defaulting Lender (x) shall not be entitled to receive any Commitment Fee pursuant to Section 2.8(a)(i) for any period during which that Lender is a Defaulting Lender (and the Borrower shall not be required to pay any such fee that otherwise would have been required to have been paid to that Defaulting Lender) and (y) shall be limited in its right to receive L/C Participation Fees as provided in Section 2.8(a)(iii).'</li><li>'The Borrower agrees to pay the administrative and other fees of the Agent pursuant to the Fee Letter and as may otherwise be agreed to in writing by the Borrower and the Agent from time to time.'</li><li>'The Borrower agrees to pay to the Agent a fee equal to $2,500 at the time of each Bid Rate Quote Request made hereunder for services rendered by the Agent in connection with Bid Rate Loans.'</li></ul> |
| effective dates | <ul><li>'The amended and restated Plan is effective as of January 1, 2019. The rights and benefits of and/or with respect to a Participant whose employment terminated prior to January 1, 2019 shall be determined under the provisions of the Plan in effect when his/her employment terminated.'</li><li>'This TSA shall become effective on the Effective Date and, unless terminated earlier pursuant to Section 7.02 below, shall remain in full force and effect until the latest date of expiration (the “ Final Term ”) of the Term for any Transition Service hereunder.'</li><li>'If the Commitments are increased in accordance with this Section, the Borrower shall determine the effective date (the “ Increase Effective Date ”) and the final allocation of such increase in consultation with the Administrative Agent. The Administrative Agent shall promptly notify the Lenders of the final allocation of such increase and the Increase Effective Date.'</li></ul> |
| enforcements | <ul><li>"This Agreement has been duly and validly authorized, executed and delivered on behalf of the Investor and is a valid and binding agreement of the Investor enforceable against the Investor in accordance with its terms, subject as to enforceability to general principles of equity and to applicable bankruptcy, insolvency, reorganization, moratorium, liquidation and other similar laws relating to, or affecting generally, the enforcement of applicable creditors' rights and remedies."</li><li>'This Agreement has been duly and validly authorized. This Agreement has been duly executed and delivered on behalf of the Buyer, and this Agreement constitutes a valid and binding agreement of the Buyer enforceable in accordance with its terms.'</li><li>'The Corporation expressly confirms and agrees that it has entered into this Agreement in order to induce Indemnitee to continue to serve as director and/or officer of the Corporation and acknowledges that Indemnitee is relying upon this Agreement in continuing in such capacity.'</li></ul> |
| financial statements | <ul><li>'Borrower has furnished to the Lenders (a) the audited consolidated financial statements of Borrower for the Fiscal Year ended March 29, 2013, and (b) the unaudited consolidated financial statements of Borrower for the Fiscal Quarter ended October 4, 2013.'</li><li>'The Borrower shall have delivered to the Administrative Agent or filed with the SEC its 10-K report for the period ending on December 31, 2017 and its 10-Q reports for the periods ending on March 31, 2018, June 30, 2018 and September 30, 2018.'</li><li>'The Administrative Agent shall have received the audited financial statements referred to in subsection 4.1.'</li></ul> |
| capitalization | <ul><li>'The Company currently has 220,599,761 shares of Common Stock issued and outstanding. In addition, 53,287,499 shares of Common Stock have been reserved for issuance, or are issuable upon exercise or conversion of outstanding derivative securities.'</li><li>'The shares of Common Stock underlying the Restricted Stock Units may be adjusted as provided in the Plan including, without limitation, Section 11 of the Plan. The Participant, by accepting this Agreement, irrevocably and unconditionally consents and agrees to any such adjustments as may be made at any time hereafter.'</li><li>'The shares of Common Stock underlying the Restricted Stock Units may be adjusted as provided in the Plan. The Participant, by accepting this Agreement, irrevocably and unconditionally consents and agrees to any such adjustments as may be made at any time hereafter.'</li></ul> |
| benefits | <ul><li>'During the period of employment, the Company shall provide Executive with such employee benefits as are provided by the Company generally to its executive employees. In additon, Company shall provide Executive at Company’s expense, or shall reimburse Executive, for appropriate telecommunications and internet service and devices as needed for Executive to perform his duties pursuant to this Agreement.'</li><li>'This Agreement shall be binding upon and shall inure to the benefit of the Company, its successors and assigns, and the Key Person and the Key Person’s executors, administrators, personal representatives and heirs. In the event that any part of this Agreement shall be held to be invalid or unenforceable, the remaining parts hereof shall nevertheless continue to be valid and enforceable as though the invalid portions were not a part hereof.'</li><li>'The Termination Date shall be the termination date of your employment for purposes of participation in and coverage under all benefit plans and programs sponsored by the Company and its subsidiaries.'</li></ul> |
| interpretations | <ul><li>'The covenants contained in this Section 7 are intended to be construed as a series of separate covenants. If, in any judicial proceeding, the court shall refuse to enforce any of the separate covenants (or any part thereof), then such unenforceable covenant (or such part) shall be deemed to be eliminated from this Agreement for the purpose of those proceedings to the extent necessary to permit the remaining separate covenants (or portions thereof) to be enforced.'</li><li>'The captions used herein are intended for convenience of reference only and shall not modify or affect in any manner the meaning or interpretation of any of the provisions of this Agreement. This Agreement is not intended to carry over any economic entitlements or obligations that may have arisen among the parties under the Existing Agreement due to events preceding this Agreement other than those specifically contemplated herein and should be interpreted accordingly to the extent applicable.'</li><li>'Neither this Agreement nor any uncertainty or ambiguity herein shall be construed against the Lender Group or Borrower, whether under any rule of construction or otherwise. On the contrary, this Agreement has been reviewed by all parties and shall be construed and interpreted according to the ordinary meaning of the words used so as to accomplish fairly the purposes and intentions of all parties hereto.'</li></ul> |
| subsidiaries | <ul><li>'The Borrower owns, directly or indirectly, free and clear of any Lien (other than Liens expressly permitted by Section 6.01 or 6.02), all of the issued and outstanding shares of common stock of each of the Principal Subsidiaries.'</li><li>'The Company owns, directly or indirectly, all of the capital stock or other equity interests of each Subsidiary free and clear of any Liens, and all of the issued and outstanding shares of capital stock of each Subsidiary are validly issued and are fully paid, non-assessable and free of preemptive and similar rights to subscribe for or purchase securities.'</li><li>'Solely for the purposes of determining whether an Event of Default has occurred under clause (h), (i) or (l) of Section 7.01, any reference in any such clause to any Subsidiary shall be deemed not to include any Immaterial Subsidiary affected by any event or circumstance referred to in any such clause.'</li></ul> |
| solvency | <ul><li>'This Agreement may be immediately terminated in its entirety by a Party by providing written notice of termination to the other Party in the event of an Insolvency Event of the other Party.'</li><li>'The Seller is not insolvent, nor will the Seller be made insolvent by the transfer of the Receivables, nor does the Seller anticipate any pending insolvency.'</li><li>'As of the First Amendment and Restatement Effective Date, the Borrower and its Subsidiaries, on a consolidated basis, are Solvent.'</li></ul> |
| cooperation | <ul><li>'Upon a Party’s request, the other Party shall provide the prosecuting and maintaining Party with all reasonable assistance and cooperation in connection with its prosecution and maintenance of the applicable Patents, including by providing access to relevant persons and executing all documentation reasonably requested by the prosecuting and maintaining Party.'</li><li>'Each Party agrees, without further consideration, to cooperate and diligently perform any further acts, deeds and things and to execute and deliver any documents that may from time to time be reasonably necessary or otherwise reasonably required to consummate, evidence, confirm and/or carry out the intent and provisions of this Agreement, all without undue delay or expense.'</li><li>'Subject to your other commitments, you agree to reasonably cooperate (but only truthfully) with the Company and provide information as to matters which you were personally involved, or have information on, during your employment with the Company and which are or become the subject of litigation or other dispute. The Company shall pay for any reasonable out-of-pocket expenses incurred by you in connection with your performance of the obligations pursuant to this Section 18.'</li></ul> |
| approvals | <ul><li>'Other than as set forth on Schedule 1.4 , no Tricadia Group Entity is required to obtain any consent or approval from any Person or provide notice to any Person in connection with the execution, delivery and performance of this Agreement and the consummation by it of the transactions contemplated by this Agreement, except where any such failure would not be materially adverse to the Tricadia Business.'</li><li>'Except as previously obtained or made and as provided in Section 9.2(e) , no authorization, consent, approval, order, license or permit from, or filing, registration or qualification with, any Governmental Agency is or will be required to authorize or permit under applicable Laws the execution, delivery and performance by Borrower or any Subsidiary Guarantor of the Loan Documents to which it is a party (except where the failure to do so does not constitute a Material Adverse Effect).'</li><li>'The implementation of the Plan, the granting of any stock options under the Plan and the issuance of any shares of Common Stock (i) upon the exercise of any stock option or (ii) under the Stock Issuance Program shall be subject to the Corporation’s procurement of all approvals and permits required by regulatory authorities having jurisdiction over the Plan, the stock options granted under it and the shares of Common Stock issued pursuant to it.'</li></ul> |
| construction | <ul><li>'The parties hereto acknowledge and agree that the language of this Release Agreement shall be construed as a whole according to its fair meaning and not strictly for or against any of the parties.'</li><li>'The language used in this Agreement will be deemed to be the language chosen by the parties to express their mutual intent, and no rules of strict construction will be applied against any party.'</li><li>'The various captions and section headings in this Agreement are included for convenience only and shall not affect the meaning or interpretation of any provision of this Agreement. Notwithstanding anything to the contrary, in all cases, the use of the term “including” shall be construed as being inclusive and shall be deemed to mean “including, without limitation,”.'</li></ul> |
| intellectual property | <ul><li>'(a) Attached hereto as Schedule 11(a) is a schedule setting forth all of each Company’s Patents and Trademarks (each as defined in the Collateral Agreement) applied for or registered with the United States Patent and Trademark Office, and all other Patents and Trademarks (each as defined in the Collateral Agreement), including the name of the registered owner or applicant and the registration, application, or publication number, as applicable, of each Patent or Trademark owned by each Company.'</li><li>'(a) Attached hereto as Schedule 11(a ) is a schedule setting forth all of the Company’s Patents and Trademarks (each as defined in the Collateral Agreement) applied for or registered with the United States Patent and Trademark Office, and all other Patents and Trademarks (each as defined in the Collateral Agreement), including the name of the registered owner or applicant and the registration, application, or publication number, as applicable, of each Patent or Trademark owned by the Company.'</li><li>'As of the Closing Date, the Company and each Principal Domestic Subsidiary own, or are licensed to use, all United States Intellectual Property necessary for the operation of their respective businesses as currently conducted and as proposed to be conducted, except where the failure to own or be licensed would not reasonably be expected to have a Material Adverse Effect.'</li></ul> |
| brokers | <ul><li>'No agent, broker, financial advisor or other intermediary acting on behalf of any Tricadia Group Entity or any of their Affiliates is, or will be, entitled to any broker’s commission, finder’s fees or similar payment from any of the parties hereto, or from any Affiliate of any of the parties hereto, in connection with the transactions contemplated by this Agreement.'</li><li>'The Company has taken no action which would give rise to any claim by any person for brokerage commissions, transaction fees or similar payments relating to this Agreement or the transactions contemplated hereby.'</li><li>'Neither the Company nor any Subsidiary or any related entities (i) is required to register as a “broker” or “dealer” in accordance with the provisions of the Exchange Act or (ii) directly or indirectly through one or more intermediaries, controls or is a “person associated with a member” or “associated person of a member” (within the meaning set forth in the FINRA Manual).'</li></ul> |
| enforceability | <ul><li>'The Borrower or any other Loan Party shall (or shall attempt to) disavow, revoke or terminate any Loan Document to which it is a party or shall otherwise challenge or contest in any action, suit or proceeding in any court or before any Governmental Authority the validity or enforceability of any Loan Document, or any Loan Document shall cease to be in full force and effect (except as a result of the express terms thereof).'</li><li>'The failure of the Participants or the Company to insist upon strict adherence to any term of the Plan on any occasion shall not be considered a waiver of such party’s rights or deprive such party of the right thereafter to insist upon strict adherence to that term or any other term of the Plan.'</li><li>'This Interim Order shall constitute findings of fact and conclusions of law pursuant to Bankruptcy Rule 7052 and shall take effect and be fully enforceable nunc pro tunc to the Petition Date immediately upon execution hereof. Any findings of fact shall constitute a finding of fact even if it is stated as a conclusion of law, and any conclusion of law shall constitute a conclusion of law even if it is stated as a finding of fact.'</li></ul> |
| authorizations | <ul><li>'The execution and performance of this Agreement have been duly authorized by all necessary action and do not and will not: (a) require any consent or approval of the members or stockholders of any entity, or the consent of any governmental entity, which in each case has not been obtained; or (b) violate any provision of any indenture, contract, agreement or instrument to which it is a party or by which it is bound.'</li><li>'Other than the filing of the financing statements required hereunder, no authorization or approval or other action by, and no notice to or filing with, any governmental authority or regulatory body is required for the due execution and delivery by the Seller of this Agreement and each other Transaction Document to which it is a party and the performance of its obligations hereunder and thereunder.'</li><li>'No authorization or approval or other action by, and no notice to or filing with, any governmental authority or regulatory body is required for the due execution and delivery by the Servicer of this Agreement and each other Transaction Document to which it is a party and the performance of its obligations hereunder and thereunder in its capacity as Servicer.'</li></ul> |
| consents | <ul><li>'Other than as set forth on Schedule 1.4 , no Tricadia Group Entity is required to obtain any consent or approval from any Person or provide notice to any Person in connection with the execution, delivery and performance of this Agreement and the consummation by it of the transactions contemplated by this Agreement, except where any such failure would not be materially adverse to the Tricadia Business.'</li><li>'Each Lender hereby consents to the Lids Disposition, and the Agent hereby waives any notices required or that will be required as a result of the Lids Disposition, including, without limitation, notices pursuant to Section 5.3 of the Credit Agreement.'</li><li>'Newmont headquarters is located at 6363 South Fiddler’s Green Circle, Suite 800, Greenwood Village, Colorado 80111 U.S.A., and grants awards to employees of Newmont and its Subsidiaries, at Newmont’s sole discretion. If Employee would like to participate in the Plan, please review the following information about Newmont’s data processing practices and declare Employee’s consent.'</li></ul> |
| tax withholdings | <ul><li>'The Company shall have the right to deduct from any payment hereunder all taxes (federal, state or other) which it is required to withhold therefrom.'</li><li>'The Company may withhold from any benefits payable under this Plan all federal, state, city or other taxes as may be required pursuant to any law or governmental regulation or ruling.'</li><li>'Any payments provided for hereunder shall be paid net of any applicable tax withholding required under federal, state or local law.'</li></ul> |
| arbitration | <ul><li>'The Parties agree that any and all disputes arising out of, or relating to, the terms of this Release, their interpretation, and any of the matters herein released, shall be subject to binding arbitration as described in Section 9(c) of the Employment Agreement.'</li><li>'The Parties agree that any dispute or controversy arising out of, relating to, or concerning the interpretation, construction, performance, or breach of this Agreement will be settled by arbitration to be held in Multnomah County, Oregon, in accordance with the terms and conditions of the Confidentiality Agreement.'</li><li>'This Award Certificate shall be governed by, and construed in accordance with, the laws of the State of California (disregarding any choice-of-law provisions). If the Participant is a party to an agreement with the Corporation to arbitrate claims, such agreement to arbitrate claims shall apply as to any dispute or disagreement regarding the Participant’s rights under this Award Certificate.'</li></ul> |
| transactions with affiliates | <ul><li>'Directly or indirectly enter into or permit to exist any transaction with any Affiliate of Borrower except for transactions that (i) are in the ordinary course of Borrower’s business, (ii) are upon fair and reasonable terms, (iii) are fully disclosed to Agent, and (iv) are no less favorable to Borrower or its Subsidiaries, as applicable, than would be obtained in an arm’s length transaction with a non-Affiliate.'</li><li>'Except as set forth in the SEC Documents, to the knowledge of the Company, none of the Company’s stockholders, officers or directors or any family member or affiliate of any of the foregoing, has either directly or indirectly an interest in, or is a party to, any transaction that is required to be disclosed as a related party transaction pursuant to Item 404 of Regulation S-K promulgated under the Securities Act.'</li><li>'Neither the REIT nor any of its Subsidiaries is a party to any transaction, arrangement or contract (including any lease or other rental agreement) with any of its Affiliates other than as permitted by Section 9.10 hereof.'</li></ul> |
| applicable laws | <ul><li>'THIS AMENDMENT AND THE RIGHTS AND OBLIGATIONS OF THE PARTIES HEREUNDER SHALL BE GOVERNED BY, AND SHALL BE CONSTRUED AND ENFORCED IN ACCORDANCE WITH, THE LAWS OF THE STATE OF NEW YORK.'</li><li>'The Requisite Lenders may direct the Agent to, and the Agent if so directed shall, exercise all other rights and remedies it may have under any Applicable Law.'</li><li>'THIS AGREEMENT AND THE OTHER LOAN DOCUMENTS (OTHER THAN LETTERS OF CREDIT AND AS EXPRESSLY SET FORTH IN OTHER LOAN DOCUMENTS) SHALL BE CONSTRUED IN ACCORDANCE WITH AND GOVERNED BY THE LAWS OF THE STATE OF NEW YORK WITHOUT REGARD TO THE CONFLICT OF LAWS PRINCIPLES THEREOF.'</li></ul> |
| defined terms | <ul><li>'As used in this Agreement, the terms listed in this Section 1.1 shall have the respective meanings set forth in this Section 1.1.'</li><li>'Unless otherwise defined herein, capitalized terms or matters of construction defined or established in the Loan Agreement shall be applied herein as defined or established therein.'</li><li>'Except as otherwise indicated herein, all words and terms defined in the Existing Agreement shall have the same meanings when used herein.'</li></ul> |
| change in control | <ul><li>'Upon a Change in Control that occurs during the Performance Period and prior to the Participant’s Termination due to death, Disability or Retirement, for purposes of determining the number of earned Shares under the Performance Share Units, the closing date of the transaction that constitutes the Change in Control (the “ Change in Control Date ”) shall be deemed the Last Day of the Performance Period .'</li><li>'In accordance with Section 10.1(a) of the Plan, in the event of a Change in Control, the RSUs shall vest immediately prior to the time of such Change in Control, except to the extent that the RSUs are replaced with a Replacement Award. If the RSUs are replaced with a Replacement Award, then from and after the Change in Control, references herein to "RSUs" shall be deemed to refer to the Replacement Award.'</li><li>'In the event of a Change in Control, the Eligible Employee shall immediately be fully vested in his or her benefit under the Plan.'</li></ul> |
| no defaults | <ul><li>'No Default or Event of Default shall have occurred and be continuing.'</li><li>'No Default or Event of Default has occurred and is continuing or would result from the consummation of the transactions contemplated by this Agreement or any other Loan Document.'</li><li>'No Default or Event of Default other than the Interest Default shall have occurred and be continuing as of the date the condition set forth in Section 3(a) is satisfied.'</li></ul> |
| adjustments | <ul><li>'Participant acknowledges that the Option is subject to adjustment, modification and termination in certain events as provided in this Agreement and the Plan.'</li><li>'Participant acknowledges that the Option is subject to adjustment, modification and termination in certain events as provided in this UK Option Agreement and the UK Sub-Plan.'</li><li>'The parties acknowledge and agree that all share-related numbers contained in this Agreement shall be adjusted to take into account any reorganization, recapitalization, non-cash dividend, stock split or other similar transaction effected with respect to the Common Stock except as specifically stated herein.'</li></ul> |
| non-disparagement | <ul><li>'Each Participant agrees that, following any termination of his or her employment with the Company, such Participant will not disparage, orally or in writing, the Company, the management of the Company, any product or service provided by the Company or the future prospects of the Company.'</li><li>'Executive agrees to refrain from any disparagement, defamation, libel, or slander of any of the Releasees, and agrees to refrain from any tortious interference with the contracts and relationships of any of the Releasees.'</li><li>'Ms. Meyerrose agrees that she will not make any derogatory or disparaging statements about the Company or its present or former agents, employees, officers, or directors. Officers of the Company with knowledge of this Agreement agree that they will not make any derogatory or disparaging statements about Ms. Meyerrose.'</li></ul> |
| employment | <ul><li>'Nothing expressed or implied in this Agreement will create any right or duty on the part of the Company or the Executive to have the Executive remain in the employment of the Company or any Subsidiary prior to or following any Change in Control or otherwise.'</li><li>'This Plan shall not be deemed to create a contract of employment between any Participant and the Company and/or its Affiliates. Nothing contained in the Plan shall (a) confer upon any Participant any right with respect to continuation of employment with the Company or (b) subject to the rights and benefits of any Participant hereunder, interfere in any way with the right of the Company to terminate such Participant’s employment at any time.'</li><li>'Nothing in this Plan gives any Participant the right to be retained in the service of the Company, nor will it interfere with the right of the Company to discharge or otherwise deal with Participants without regard to the existence of this Plan.'</li></ul> |
| positions | <ul><li>'Chief Executive Officer and President. Executive shall report in such capacity to the Board.'</li><li>'Chief Financial Officer. Executive shall report in such capacity to Company’s Chief Executive Officer.'</li><li>'The Motion is granted on an interim basis in accordance with the terms of this Interim Order. Any objections to the Motion with respect to the entry of the Interim Order that have not been withdrawn, waived or settled are hereby denied and overruled on their merits.'</li></ul> |
| erisa | <ul><li>'No ERISA Default has occurred and is continuing.'</li><li>'ERISA means the Employee Retirement Income Security Act of 1974, as amended from time to time.'</li><li>'The Servicer shall give the Facility Agent and each Lender prompt written notice of any event that results in the imposition of a Lien on the Collateral under Section 430 of the Code or Section 303(k) or 4068 of ERISA. The Servicer shall not, and shall not cause or permit any of its Affiliates to, cause or permit to occur an event that results in the imposition of a Lien on the Collateral under Section 430 of the Code or Section 303(k) or 4068 of ERISA.'</li></ul> |
| warranties | <ul><li>'Each Guarantor hereby makes to the Administrative Agent and the other Guarantied Parties all of the representations and warranties made by the Borrower with respect to or in any way relating to such Guarantor in the Loan Agreement and the other Loan Documents, as if the same were set forth herein in full.'</li><li>'The Seller has determined that this Agreement is effective to transfer to the Administrative Agent, the Managing Agents and the Purchasers, as assignees of the Seller, the full benefit of and a direct claim against LKQ, as Servicer, and each Originator in respect of each representation or warranty made by LKQ, as Servicer, and each Originator under any Transaction Document.'</li><li>'EXCEPT AS EXPRESSLY SET FORTH IN THIS TSA, SERVICE PROVIDER MAKES NO WARRANTY, EXPRESS OR IMPLIED, AND HEREBY DISCLAIMS ANY WARRANTIES OF ANY KIND WITH RESPECT TO THE NATURE OR QUALITY OF THE TRANSITION SERVICES TO BE PROVIDED BY SERVICE PROVIDER OR THE RESULTS THAT WILL BE OBTAINED BY USING OR APPLYING SUCH TRANSITION SERVICES, INCLUDING ANY WARRANTY OR CONDITION OF NONINFRINGEMENT, MERCHANTABILITY, ACCURACY, SATISFACTORY QUALITY, OR FITNESS FOR ANY PARTICULAR PURPOSE.'</li></ul> |
| disability | <ul><li>'If Executive’s employment shall be terminated by reason of Executive’s death or Disability, then the Company will provide Executive with the Accrued Obligations. Thereafter, the Company shall have no further obligation to Executive or Executive’s legal representatives.'</li><li>'In the event the employment of a Participant is terminated by the Company for Cause or due to the death or Disability of the Participant no severance benefits will be payable pursuant to the Plan.'</li><li>'If your employment with or service to the Company, a Subsidiary or an Affiliate terminates by reason of Disability, this Stock Option shall become fully vested and exercisable and may thereafter be exercised by you (or your legal representative or similar person) until the date which is one year after the effective date of your termination of employment or service, or if earlier, the expiration date of the term of this Stock Option.'</li></ul> |
| interests | <ul><li>'Interest shall accrue on the principal balance hereof at a fixed rate of 7.25% per annum.'</li><li>'Interest shall accrue on the principal balance hereof at a fixed rate of 8.50% per annum.'</li><li>'Interest shall accrue on the then outstanding balance of the Principal Amount at a fixed interest rate per annum equal to 8%. Accrued interest shall be payable in cash in arrears on the last day of each calendar quarter, with first interest payment to commence on June 30, 2019, until the Principal Amount is paid in full. If at any time the outstanding Principal Amount shall be paid in full, then all accrued interest shall be payable at the time of such principal payment.'</li></ul> |
| duties | <ul><li>'The Administrative Agent may execute any of its duties under this Agreement and the other Loan Documents by or through agents or attorneys-in-fact and shall be entitled to advice of counsel concerning all matters pertaining to such duties. The Administrative Agent shall not be responsible for the negligence or misconduct of any agents or attorneys-in-fact selected by it with reasonable care.'</li><li>'Agent may execute any of its duties under this Agreement or any other Loan Document by or through agents, employees or attorneys-in-fact and shall be entitled to advice of counsel concerning all matters pertaining to such duties. Agent shall not be responsible for the negligence or misconduct of any agent or attorney-in-fact that it selects as long as such selection was made without gross negligence or willful misconduct.'</li><li>'The Agent may execute any of its respective duties under this Agreement or the other Transaction Documents by or through agents or attorneys in fact and shall be entitled to advice of counsel concerning all matters pertaining to such duties. The Agent shall not be responsible for the negligence or misconduct of any agents or attorneys in fact selected by the Agent with reasonable care.'</li></ul> |
| specific performance | <ul><li>'Each First Lien Agent may demand specific performance of this Agreement. Each Second Priority Agent, on behalf of itself and each applicable Second Priority Secured Party, hereby irrevocably waives any defense based on the adequacy of a remedy at law and any other defense that might be asserted to bar the remedy of specific performance in any action that may be brought by any First Lien Agent.'</li><li>'The parties recognize that if any provision of this Agreement is violated by the Company, Indemnitee may be without an adequate remedy at law. Accordingly, in the event of any such violation, Indemnitee shall be entitled, if Indemnitee so elects, to institute proceedings, either in law or at equity, to obtain damages, to enforce specific performance, to enjoin such violation, or to obtain any relief or any combination of the foregoing as Indemnitee may elect to pursue.'</li><li>'The parties hereto recognize that if any provision of this Agreement is violated by the Company, Indemnitee may be without an adequate remedy at law. Accordingly, in the event of any such violation, Indemnitee shall be entitled, if Indemnitee so elects, to institute proceedings, either in law or at equity, to obtain damages, to enforce specific performance, to enjoin such violation, or to obtain any relief or any combination of the foregoing as Indemnitee may elect to pursue.'</li></ul> |
| anti-corruption laws | <ul><li>'The Borrower will not, and will not permit any of its Subsidiaries to, fail to maintain in effect and enforce policies and procedures designed to ensure compliance by the Borrower, its Subsidiaries and their respective directors, officers, employees and agents with Anti-Corruption Laws and applicable Sanctions.'</li><li>'Conduct its business in compliance with applicable anti-corruption laws and maintain policies and procedures designed to promote and achieve compliance with such laws.'</li><li>'None of the Loan Parties or their Subsidiaries have breached the United States Foreign Corrupt Practices Act of 1977, the UK Bribery Act 2010, or any other similar anti-corruption legislation in other jurisdictions the effect of which breach is or could reasonably be expected to be material to the Loan Parties, taken as a whole, and the Loan Parties and their Subsidiaries have instituted and maintained policies and procedures designed to promote and achieve compliance with such laws.'</li></ul> |
| vacations | <ul><li>'During the Employment Period, the Executive shall be entitled to paid vacation in accordance with the most favorable plans, policies, programs and practices of the Company and its affiliated companies.'</li><li>'During the Employment Period, the Executive shall be entitled to paid vacation in accordance with the plans, policies, programs and practices of the Company and its affiliated companies.'</li><li>'During the Employment Period, the Employee shall be entitled to paid vacation in accordance with the plans, policies, programs and practices of the Company and its affiliated companies.'</li></ul> |
| generally | <ul><li>'The Customer Support Services will be provided by the following types of Customer Support Agents: [***]. Bank will provide agents for future, mutually agreed upon and approved channels.'</li><li>'Except as otherwise provided in this Section 3 , the RSUs subject to this Award shall become vested in accordance with the Vesting Schedule.'</li><li>'Except as otherwise provided in this Section 3 , the PRSUs subject to this Award shall become vested in accordance with the Performance Vesting Conditions; provided that the Participant remains continuously employed by the Company or an Affiliate from the Grant Date through the Vesting Date set forth above.'</li></ul> |
| publicity | <ul><li>'The parties agree that a public announcement and/or similar publicity with respect to the transactions contemplated hereby will be issued by the BDC following the date hereof. The contents of such announcement and/or publicity by the BDC will be subject to the approval of Trinity (such approval not to be unreasonably withheld). For the avoidance of doubt, any such announcement and/or publicity may be transmitted by the BDC by email to its general contacts.'</li><li>'Consultant may not publish or refer to Work Product, in whole or in part, without the prior express written consent of AVROBIO. Consultant will not use the name, logo, trade name, service mark, or trademark, or any simulation, abbreviation, or adaptation of same, or the name of AVROBIO or any of its affiliates for publicity, promotion, or other uses without AVROBIO’s prior written consent.'</li><li>'Neither party may issue a press release, public announcement, advertisement or other form of publicity concerning the existence of this Agreement or the terms of this Agreement without obtaining the prior written consent of the other party, provided that the Company may make disclosure pursuant to its obligations under applicable securities laws and regulations and/or requirements of the New York Stock Exchange.'</li></ul> |
| choice of laws | <ul><li>'THE VALIDITY, CONSTRUCTION AND ENFORCEABILITY OF THIS NOTE SHALL BE GOVERNED BY THE INTERNAL LAWS OF THE STATE OF MINNESOTA, WITHOUT GIVING EFFECT TO CONFLICT OF LAWS PRINCIPLES THEREOF.'</li><li>'This Agreement and the Notice of Restricted Stock Grant shall be governed by, and construed in accordance with, the laws of the State of Delaware, without regard to any conflicts of law or choice of law rule or principle that might otherwise cause the Plan, this Agreement or the Notice of Restricted Stock Grant to be governed by or construed in accordance with the substantive law of another jurisdiction.'</li><li>'This Agreement shall be construed and enforced in accordance with the laws of the State of Colorado, notwithstanding any state’s choice-of-law rules to the contrary.'</li></ul> |
| liens | <ul><li>'Except for the conveyances hereunder, Seller will not sell, pledge, assign or transfer to any other Person, or grant, create, incur, assume or suffer to exist any Lien on the Receivables or the Other Conveyed Property or any interest therein, and Seller shall defend the right, title, and interest of Purchaser and the Issuer in and to the Receivables and the Other Conveyed Property against all claims of third parties claiming through or under Seller.'</li><li>'No Credit Party shall, and no Credit Party shall permit any of its Subsidiaries to, directly or indirectly, allow or suffer to exist any Liens, other than Permitted Liens.'</li><li>'The Administrator will not directly or indirectly create, suffer or allow to exist any Lien on the Collateral other than Permitted Liens.'</li></ul> |
| death | <ul><li>'In the event of termination due to death or Disability, Executive or his legal representative shall be entitled to any Base Compensation earned through the last date of employment. In addition, Executive will remain eligible for all applicable benefits relative to death or disability pursuant to the plans, if any, in place at the time.'</li><li>'If Participant’s Employment terminates under circumstances described in Section 3(a) , then upon Participant’s subsequent death, all unpaid amounts payable to Participant under Section 3(a)(i) , (ii) , (iii) or (vi) , if any, shall be paid to Participant’s Beneficiary.'</li><li>'The Executive’s employment hereunder shall terminate upon her death.'</li></ul> |
| purposes | <ul><li>'The Seller has determined that, from a business viewpoint, the purchase of the Receivables and related interests thereto from the Originators under the Receivables Sale Agreement , and the sale of Purchaser Interests to the Administrative Agent, for the benefit of the Purchasers, and the other transactions contemplated herein, are in the best interests of the Seller.'</li><li>'The Program established pursuant to this Agreement will allow customers of Company, through Bank’s standard and customized technology and financial products and services (including the establishment of T-Mobile Customer Accounts, the issuance of Cards and other financial products and services, as further described herein), to receive and use the T-Mobile Financial Services.'</li><li>'The purpose of the Fund shall be to make loans, and purchase assignments or participations in loans that have already been made (in either case, “ Underlying Loans ”), either directly or indirectly through subsidiaries or other Persons, and to engage in any other lawful business.'</li></ul> |
| information | <ul><li>'Each Lender shall have received, on or prior to the Closing Date, all documentation and other information reasonably requested by such Lender that is required by bank regulatory authorities under applicable “know your customer,” anti-money laundering and foreign asset control rules and regulations and any other compliance or regulatory considerations applicable to such Lender (including the Patriot Act), including the information described in Section 10.19.'</li><li>'The Agent shall periodically deliver to the Revolving Lenders information setting forth the Stated Amount of all outstanding Letters of Credit. Other than as set forth in this subsection, the Agent shall have no duty to notify the Revolving Lenders regarding the issuance or other matters regarding Letters of Credit issued hereunder. The failure of the Agent to perform its requirements under this subsection shall not relieve any Revolving Lender from its obligations under Section 2.5.(j).'</li><li>'From time to time and promptly upon each request, such data, certificates, reports, statements, opinions of counsel, documents or further information regarding the business, assets, liabilities, financial condition, results of operations or business prospects of the Borrower, any other Loan Party or any other Subsidiary as the Agent or any Lender may reasonably request.'</li></ul> |
| compensation | <ul><li>'The Executive will be entitled to incentive compensation and bonuses as provided below, and in any other plan of the Bank in which Executive is eligible to participate.'</li><li>'The compensation to be paid by Bank to Executive from time to time, including any fringe benefits or other employee benefits, shall not be governed by this Agreement. This Agreement shall not be deemed to affect the terms of any stock options, employee benefits or other agreements between the Bank and Executive.'</li><li>'The Managers will not receive any compensation. However, the Managers shall be reimbursed by the Fund for their reasonable out-of-pocket expenses, if any, of attendance at meetings of the Board of Managers.'</li></ul> |
| consent to jurisdiction | <ul><li>'The jurisdiction, services of process and waiver of jury trial provisions set forth in Sections 9.05 and 9.06 of the Credit Agreement are hereby incorporated by reference, mutatis mutandis .'</li><li>'Any action or proceeding arising out of or relating to this Agreement shall be filed in and heard and litigated solely before the state or federal courts of Washington within King County.'</li><li>'Each of the parties hereto irrevocably consents to personal jurisdiction in any action brought in connection with this Agreement in the United States District Court for the Central District of California or any California court of competent jurisdiction. The parties also consent to venue in the above forums and to the convenience of the above forums. Any suit brought to enforce the provisions of this Agreement must be brought in the aforementioned forums.'</li></ul> |
| successors | <ul><li>'This Agreement shall be binding upon and shall inure to the benefit of the parties hereto and their respective successors and assigns, except that the Borrower may not assign or transfer its rights hereunder without the prior written consent of both Lenders.'</li><li>'This Agreement shall be binding upon and inure to the benefit of the successors and assigns of Grantor and Collateral Agent.'</li><li>'This Agreement shall be binding upon the First Lien Agents, the Senior Secured Parties, the Second Priority Agents, the Second Priority Secured Parties and their respective permitted successors and assigns.'</li></ul> |
| limitation of liability | <ul><li>'No provision hereof, in the absence of any affirmative action by the Holder to exercise this Warrant to purchase Warrant Shares, and no enumeration herein of the rights or privileges of the Holder, shall give rise to any liability of the Holder for the purchase price of any Common Stock or as a stockholder of the Company, whether such liability is asserted by the Company or by creditors of the Company.'</li><li>'No provision hereof, in the absence of any affirmative action by Holder to exercise this Warrant to purchase Warrant Shares, and no enumeration herein of the rights or privileges of Holder, shall give rise to any liability of Holder for the purchase price of any Common Stock or as a stockholder of the Company, whether such liability is asserted by the Company or by creditors of the Company.'</li><li>'The Limited Partners shall have no liability under this Agreement (other than for breach thereof) except as expressly provided in Section 10.04, 13.02(d) or under the Act.'</li></ul> |
| books | <ul><li>'The Company shall and shall cause each other Loan Party to keep proper books of records and account in which entries are made in a manner so as to permit preparation of financial statements in conformity with GAAP (or, in the case of any Foreign Subsidiary, generally accepted accounting principles in effect in the jurisdiction of organization of such Foreign Subsidiary).'</li><li>'The Company will not close its stockholder books or records in any manner which prevents the timely exercise of this Warrant, pursuant to the terms hereof.'</li><li>'Keep adequate records and books of account reflecting all financial transactions in conformity in all material respects with GAAP, consistently applied, and in conformity in all material respects with all applicable requirements of any Governmental Agency having regulatory jurisdiction over Borrower and its Restricted Subsidiaries.'</li></ul> |
| exercise price | <ul><li>'The exercise price per Warrant Share under this Warrant shall be $3.125, subject to adjustment hereunder (the “Exercise Price”).'</li><li>'Whenever the Exercise Price is adjusted pursuant to any provision of this Section 3, the Company shall promptly deliver to the Holder by facsimile or email a notice setting forth the Exercise Price after such adjustment and any resulting adjustment to the number of Warrant Shares and setting forth a brief statement of the facts requiring such adjustment.'</li><li>'Each Award Agreement shall state the Exercise Price, if applicable. Subject to Sections 3, 7.2 and 8.2 and to the foregoing, the Committee may reduce the Exercise Price of any outstanding Award, on terms and subject to such conditions as it deems advisable. The Exercise Price shall also be subject to adjustment as provided in Section 14 hereof.'</li></ul> |
| register | <ul><li>'The registered agent and office of the Fund shall be as provided in the Fund’s certificate of formation, or as otherwise determined by the Board of Managers.'</li><li>'The Company shall register this Warrant, upon records to be maintained by the Company for that purpose (the “ Warrant Register ”), in the name of the record Holder hereof from time to time. The Company may deem and treat the registered Holder of this Warrant as the absolute owner hereof for the purpose of any exercise hereof or any distribution to the Holder, and for all other purposes, absent actual notice to the contrary.'</li><li>'Upon its receipt of an agreement referred to in clause (ii)(y) above executed by an Assuming Lender or any Increasing Lender, together with the certificate referred to in clause (ii)(x) above, the Administrative Agent shall, if such agreement has been completed, (x) accept such agreement, (y) record the information contained therein in the Register and (z) give prompt notice thereof to the Borrower.'</li></ul> |
| powers | <ul><li>'The execution and delivery by the Servicer of this Agreement and each other Transaction Document to which it is a party, and the performance of its obligations hereunder and thereunder are within its corporate powers and authority and have been duly authorized by all necessary corporate action on its part. This Agreement and each other Transaction Document to which the Servicer is a party has been duly executed and delivered by the Servicer.'</li><li>'Purchaser has the power, authority and legal right to execute and deliver this Agreement and to carry out the terms hereof and to acquire the Receivables and the Other Conveyed Property hereunder; and the execution, delivery and performance of this Agreement and all of the documents required pursuant hereto have been duly authorized by Purchaser by all necessary corporate action.'</li><li>'The Company has all requisite power and authority to execute, deliver and perform its obligations under this Agreement, the Note and any other documents or items executed in connection with the transactions contemplated herein (collectively, the “Transaction Documents”) and to consummate the transactions contemplated hereby and thereby.'</li></ul> |
| good standings | <ul><li>'Seller has been duly organized and is validly existing as a corporation in good standing under the laws of the State of Delaware, with power and authority to own its properties and to conduct its business as such properties are currently owned and such business is currently conducted, and had at all relevant times, and now has, power, authority and legal right to acquire, own and sell the Receivables and the Other Conveyed Property to be transferred to Purchaser.'</li><li>'TI is a legal entity duly organized, validly existing and in good standing under the Laws of the Cayman Islands and has all requisite corporate power to enter into this Agreement and to carry its business as it has been and is currently conducted.'</li><li>'The Seller has been duly organized and is validly existing as a corporation in good standing under the laws of its jurisdiction of organization, with power and authority to own its properties and to conduct its business as such properties are currently owned and such business is currently conducted.'</li></ul> |
| transferability | <ul><li>'Except as expressly provided in the Plan or this Agreement, the RSUs may not be sold, assigned, transferred, pledged or otherwise disposed of, shall not be assignable by operation of law, and shall not be subject to execution, attachment or similar process, except by will or the laws of descent and distribution. Any attempted sale, assignment, transfer, pledge or other disposition of any RSU prior to vesting shall be null and void and without effect.'</li><li>'Except as expressly provided in the Plan or this Agreement, the RSUs may not be sold, assigned, transferred, pledged or otherwise disposed of, shall not be assignable by operation of law and shall not be subject to execution, attachment or similar process, except by will or the laws of descent and distribution. Any attempted sale, assignment, transfer, pledge or other disposition of any RSU prior to vesting shall be null and void and without effect.'</li><li>'To the maximum extent permitted by law, no benefit under the Plan may be assignable or subject in any manner to alienation, sale, transfer, claims of creditors, pledge, attachment, or encumbrances of any kind.'</li></ul> |
| permits | <ul><li>'Neither any Credit Party nor any of their Subsidiaries is in violation of any term of or in default under its certificate or articles of incorporation or bylaws or other governing documents. Neither any Credit Party nor any of their Subsidiaries is in violation of any judgment, decree or order or any law, rule, regulation, statute or ordinance applicable to any Credit Party or any of their Subsidiaries (including, without limitation, all Environmental Laws and the Requirements).'</li><li>'The Company has all certificates of occupancy, rights, permits, certificates, licenses, franchises, approvals and other authorizations as are reasonably necessary to conduct its respective business and to own, lease, use, operate and occupy its assets, at the places and in the manner now conducted and operated, except those the absence of which would not materially adversely affect its respective business.'</li><li>'Seller has received no written notice of any violations which remain uncured of any licenses and permits affecting any Property.'</li></ul> |
| existence | <ul><li>'The Company shall continue to engage primarily in the automotive business and preserve, renew and keep in full force and effect its organizational existence and take all reasonable actions to maintain all rights necessary for the normal conduct of its principal line of business, except, in each case, (i)\xa0to the extent that failure to do so would not have a Material Adverse Effect and (ii)\xa0as otherwise permitted or provided in the Loan Documents.'</li><li>'No Credit Party shall, and no Credit Party shall permit any of its Subsidiaries to, directly or indirectly, allow or suffer to exist any Liens, other than Permitted Liens.'</li><li>'So long as the Buyer beneficially owns the Note, the Company shall maintain its corporate existence and shall not sell all or substantially all of the Company’s assets, except in the event of a merger or consolidation or sale of all or substantially all of the Company’s assets, where the surviving or successor entity in such transaction assumes the Company’s obligations hereunder and under the agreements and instruments entered into in connection herewith.'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.9425 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("scholarly360/setfit-contracts-clauses")
# Run inference
preds = model("In the event of a Change in Control, the Eligible Employee shall immediately be fully vested in his or her benefit under the Plan.")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 8 | 48.2975 | 87 |
| Label | Training Sample Count |
|:-----------------------------|:----------------------|
| governing laws | 4 |
| counterparts | 4 |
| notices | 4 |
| entire agreements | 4 |
| severability | 4 |
| waivers | 4 |
| amendments | 4 |
| expenses | 4 |
| survival | 4 |
| representations | 4 |
| assigns | 4 |
| taxes | 4 |
| litigations | 4 |
| insurances | 4 |
| confidentiality | 4 |
| waiver of jury trials | 4 |
| terminations | 4 |
| further assurances | 4 |
| general | 4 |
| terms | 4 |
| assignments | 4 |
| authority | 4 |
| use of proceeds | 4 |
| payments | 4 |
| compliance with laws | 4 |
| no conflicts | 4 |
| indemnifications | 4 |
| organizations | 4 |
| base salary | 4 |
| binding effects | 4 |
| headings | 4 |
| costs | 4 |
| definitions | 4 |
| modifications | 4 |
| remedies | 4 |
| releases | 4 |
| disclosures | 4 |
| participations | 4 |
| vesting | 4 |
| no waivers | 4 |
| withholdings | 4 |
| miscellaneous | 4 |
| jurisdictions | 4 |
| closings | 4 |
| integration | 4 |
| fees | 4 |
| effective dates | 4 |
| enforcements | 4 |
| financial statements | 4 |
| capitalization | 4 |
| benefits | 4 |
| interpretations | 4 |
| subsidiaries | 4 |
| solvency | 4 |
| cooperation | 4 |
| approvals | 4 |
| construction | 4 |
| intellectual property | 4 |
| brokers | 4 |
| enforceability | 4 |
| authorizations | 4 |
| consents | 4 |
| tax withholdings | 4 |
| arbitration | 4 |
| transactions with affiliates | 4 |
| applicable laws | 4 |
| defined terms | 4 |
| change in control | 4 |
| no defaults | 4 |
| adjustments | 4 |
| non-disparagement | 4 |
| employment | 4 |
| positions | 4 |
| erisa | 4 |
| warranties | 4 |
| disability | 4 |
| interests | 4 |
| duties | 4 |
| specific performance | 4 |
| anti-corruption laws | 4 |
| vacations | 4 |
| generally | 4 |
| publicity | 4 |
| choice of laws | 4 |
| liens | 4 |
| death | 4 |
| purposes | 4 |
| information | 4 |
| compensation | 4 |
| consent to jurisdiction | 4 |
| successors | 4 |
| limitation of liability | 4 |
| books | 4 |
| exercise price | 4 |
| register | 4 |
| powers | 4 |
| good standings | 4 |
| transferability | 4 |
| permits | 4 |
| existence | 4 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (2, 2)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:---------:|:-------------:|:---------------:|
| 0.0001 | 1 | 0.1159 | - |
| 0.0051 | 50 | 0.1675 | - |
| 0.0101 | 100 | 0.1142 | - |
| 0.0152 | 150 | 0.1509 | - |
| 0.0202 | 200 | 0.0455 | - |
| 0.0253 | 250 | 0.0999 | - |
| 0.0303 | 300 | 0.1259 | - |
| 0.0354 | 350 | 0.0873 | - |
| 0.0404 | 400 | 0.0993 | - |
| 0.0455 | 450 | 0.0457 | - |
| 0.0505 | 500 | 0.0835 | - |
| 0.0556 | 550 | 0.0809 | - |
| 0.0606 | 600 | 0.0821 | - |
| 0.0657 | 650 | 0.0603 | - |
| 0.0707 | 700 | 0.0502 | - |
| 0.0758 | 750 | 0.0532 | - |
| 0.0808 | 800 | 0.06 | - |
| 0.0859 | 850 | 0.1101 | - |
| 0.0909 | 900 | 0.036 | - |
| 0.0960 | 950 | 0.0287 | - |
| 0.1010 | 1000 | 0.0501 | - |
| 0.1061 | 1050 | 0.0405 | - |
| 0.1111 | 1100 | 0.0327 | - |
| 0.1162 | 1150 | 0.0315 | - |
| 0.1212 | 1200 | 0.022 | - |
| 0.1263 | 1250 | 0.0346 | - |
| 0.1313 | 1300 | 0.0782 | - |
| 0.1364 | 1350 | 0.0353 | - |
| 0.1414 | 1400 | 0.0225 | - |
| 0.1465 | 1450 | 0.0134 | - |
| 0.1515 | 1500 | 0.0791 | - |
| 0.1566 | 1550 | 0.015 | - |
| 0.1616 | 1600 | 0.0093 | - |
| 0.1667 | 1650 | 0.024 | - |
| 0.1717 | 1700 | 0.0062 | - |
| 0.1768 | 1750 | 0.0245 | - |
| 0.1818 | 1800 | 0.0102 | - |
| 0.1869 | 1850 | 0.0086 | - |
| 0.1919 | 1900 | 0.0238 | - |
| 0.1970 | 1950 | 0.0062 | - |
| 0.2020 | 2000 | 0.0382 | - |
| 0.2071 | 2050 | 0.0107 | - |
| 0.2121 | 2100 | 0.0045 | - |
| 0.2172 | 2150 | 0.009 | - |
| 0.2222 | 2200 | 0.0062 | - |
| 0.2273 | 2250 | 0.0217 | - |
| 0.2323 | 2300 | 0.0089 | - |
| 0.2374 | 2350 | 0.0048 | - |
| 0.2424 | 2400 | 0.0095 | - |
| 0.2475 | 2450 | 0.0137 | - |
| 0.2525 | 2500 | 0.0077 | - |
| 0.2576 | 2550 | 0.0086 | - |
| 0.2626 | 2600 | 0.0068 | - |
| 0.2677 | 2650 | 0.0063 | - |
| 0.2727 | 2700 | 0.0061 | - |
| 0.2778 | 2750 | 0.0181 | - |
| 0.2828 | 2800 | 0.0058 | - |
| 0.2879 | 2850 | 0.0052 | - |
| 0.2929 | 2900 | 0.0073 | - |
| 0.2980 | 2950 | 0.0088 | - |
| 0.3030 | 3000 | 0.0388 | - |
| 0.3081 | 3050 | 0.0108 | - |
| 0.3131 | 3100 | 0.0048 | - |
| 0.3182 | 3150 | 0.0046 | - |
| 0.3232 | 3200 | 0.0051 | - |
| 0.3283 | 3250 | 0.0035 | - |
| 0.3333 | 3300 | 0.0047 | - |
| 0.3384 | 3350 | 0.0061 | - |
| 0.3434 | 3400 | 0.0073 | - |
| 0.3485 | 3450 | 0.0041 | - |
| 0.3535 | 3500 | 0.0117 | - |
| 0.3586 | 3550 | 0.0032 | - |
| 0.3636 | 3600 | 0.0045 | - |
| 0.3687 | 3650 | 0.0042 | - |
| 0.3737 | 3700 | 0.0061 | - |
| 0.3788 | 3750 | 0.0056 | - |
| 0.3838 | 3800 | 0.0073 | - |
| 0.3889 | 3850 | 0.0057 | - |
| 0.3939 | 3900 | 0.0033 | - |
| 0.3990 | 3950 | 0.0027 | - |
| 0.4040 | 4000 | 0.0057 | - |
| 0.4091 | 4050 | 0.003 | - |
| 0.4141 | 4100 | 0.0044 | - |
| 0.4192 | 4150 | 0.0033 | - |
| 0.4242 | 4200 | 0.0036 | - |
| 0.4293 | 4250 | 0.0027 | - |
| 0.4343 | 4300 | 0.0065 | - |
| 0.4394 | 4350 | 0.035 | - |
| 0.4444 | 4400 | 0.0175 | - |
| 0.4495 | 4450 | 0.0027 | - |
| 0.4545 | 4500 | 0.0035 | - |
| 0.4596 | 4550 | 0.0019 | - |
| 0.4646 | 4600 | 0.0036 | - |
| 0.4697 | 4650 | 0.0022 | - |
| 0.4747 | 4700 | 0.0018 | - |
| 0.4798 | 4750 | 0.0076 | - |
| 0.4848 | 4800 | 0.0036 | - |
| 0.4899 | 4850 | 0.0581 | - |
| 0.4949 | 4900 | 0.0023 | - |
| 0.5 | 4950 | 0.004 | - |
| 0.5051 | 5000 | 0.0059 | - |
| 0.5101 | 5050 | 0.0024 | - |
| 0.5152 | 5100 | 0.0096 | - |
| 0.5202 | 5150 | 0.0059 | - |
| 0.5253 | 5200 | 0.0044 | - |
| 0.5303 | 5250 | 0.041 | - |
| 0.5354 | 5300 | 0.0028 | - |
| 0.5404 | 5350 | 0.0032 | - |
| 0.5455 | 5400 | 0.0017 | - |
| 0.5505 | 5450 | 0.002 | - |
| 0.5556 | 5500 | 0.0024 | - |
| 0.5606 | 5550 | 0.0034 | - |
| 0.5657 | 5600 | 0.0039 | - |
| 0.5707 | 5650 | 0.0023 | - |
| 0.5758 | 5700 | 0.0037 | - |
| 0.5808 | 5750 | 0.0594 | - |
| 0.5859 | 5800 | 0.0016 | - |
| 0.5909 | 5850 | 0.0168 | - |
| 0.5960 | 5900 | 0.0458 | - |
| 0.6010 | 5950 | 0.0019 | - |
| 0.6061 | 6000 | 0.001 | - |
| 0.6111 | 6050 | 0.0294 | - |
| 0.6162 | 6100 | 0.0027 | - |
| 0.6212 | 6150 | 0.0051 | - |
| 0.6263 | 6200 | 0.0014 | - |
| 0.6313 | 6250 | 0.0033 | - |
| 0.6364 | 6300 | 0.0021 | - |
| 0.6414 | 6350 | 0.0023 | - |
| 0.6465 | 6400 | 0.0018 | - |
| 0.6515 | 6450 | 0.0013 | - |
| 0.6566 | 6500 | 0.0041 | - |
| 0.6616 | 6550 | 0.0592 | - |
| 0.6667 | 6600 | 0.0019 | - |
| 0.6717 | 6650 | 0.0021 | - |
| 0.6768 | 6700 | 0.0606 | - |
| 0.6818 | 6750 | 0.0018 | - |
| 0.6869 | 6800 | 0.0014 | - |
| 0.6919 | 6850 | 0.0038 | - |
| 0.6970 | 6900 | 0.0567 | - |
| 0.7020 | 6950 | 0.0013 | - |
| 0.7071 | 7000 | 0.0015 | - |
| 0.7121 | 7050 | 0.0585 | - |
| 0.7172 | 7100 | 0.0014 | - |
| 0.7222 | 7150 | 0.0021 | - |
| 0.7273 | 7200 | 0.0179 | - |
| 0.7323 | 7250 | 0.0013 | - |
| 0.7374 | 7300 | 0.0101 | - |
| 0.7424 | 7350 | 0.0012 | - |
| 0.7475 | 7400 | 0.0009 | - |
| 0.7525 | 7450 | 0.001 | - |
| 0.7576 | 7500 | 0.0011 | - |
| 0.7626 | 7550 | 0.001 | - |
| 0.7677 | 7600 | 0.0022 | - |
| 0.7727 | 7650 | 0.0012 | - |
| 0.7778 | 7700 | 0.0011 | - |
| 0.7828 | 7750 | 0.0011 | - |
| 0.7879 | 7800 | 0.0011 | - |
| 0.7929 | 7850 | 0.0019 | - |
| 0.7980 | 7900 | 0.001 | - |
| 0.8030 | 7950 | 0.0594 | - |
| 0.8081 | 8000 | 0.024 | - |
| 0.8131 | 8050 | 0.001 | - |
| 0.8182 | 8100 | 0.0017 | - |
| 0.8232 | 8150 | 0.0013 | - |
| 0.8283 | 8200 | 0.0012 | - |
| 0.8333 | 8250 | 0.0017 | - |
| 0.8384 | 8300 | 0.0011 | - |
| 0.8434 | 8350 | 0.0013 | - |
| 0.8485 | 8400 | 0.0008 | - |
| 0.8535 | 8450 | 0.0007 | - |
| 0.8586 | 8500 | 0.0016 | - |
| 0.8636 | 8550 | 0.0008 | - |
| 0.8687 | 8600 | 0.0507 | - |
| 0.8737 | 8650 | 0.0014 | - |
| 0.8788 | 8700 | 0.0009 | - |
| 0.8838 | 8750 | 0.0564 | - |
| 0.8889 | 8800 | 0.001 | - |
| 0.8939 | 8850 | 0.0016 | - |
| 0.8990 | 8900 | 0.001 | - |
| 0.9040 | 8950 | 0.0009 | - |
| 0.9091 | 9000 | 0.0009 | - |
| 0.9141 | 9050 | 0.0014 | - |
| 0.9192 | 9100 | 0.0018 | - |
| 0.9242 | 9150 | 0.0012 | - |
| 0.9293 | 9200 | 0.0007 | - |
| 0.9343 | 9250 | 0.0009 | - |
| 0.9394 | 9300 | 0.0007 | - |
| 0.9444 | 9350 | 0.0014 | - |
| 0.9495 | 9400 | 0.0554 | - |
| 0.9545 | 9450 | 0.001 | - |
| 0.9596 | 9500 | 0.0011 | - |
| 0.9646 | 9550 | 0.0008 | - |
| 0.9697 | 9600 | 0.0008 | - |
| 0.9747 | 9650 | 0.0012 | - |
| 0.9798 | 9700 | 0.001 | - |
| 0.9848 | 9750 | 0.0168 | - |
| 0.9899 | 9800 | 0.0011 | - |
| 0.9949 | 9850 | 0.0011 | - |
| 1.0 | 9900 | 0.0194 | 0.0034 |
| 1.0051 | 9950 | 0.0546 | - |
| 1.0101 | 10000 | 0.0482 | - |
| 1.0152 | 10050 | 0.0009 | - |
| 1.0202 | 10100 | 0.0008 | - |
| 1.0253 | 10150 | 0.0006 | - |
| 1.0303 | 10200 | 0.0006 | - |
| 1.0354 | 10250 | 0.0446 | - |
| 1.0404 | 10300 | 0.0005 | - |
| 1.0455 | 10350 | 0.0008 | - |
| 1.0505 | 10400 | 0.0006 | - |
| 1.0556 | 10450 | 0.0009 | - |
| 1.0606 | 10500 | 0.0014 | - |
| 1.0657 | 10550 | 0.0006 | - |
| 1.0707 | 10600 | 0.0009 | - |
| 1.0758 | 10650 | 0.0005 | - |
| 1.0808 | 10700 | 0.0008 | - |
| 1.0859 | 10750 | 0.0545 | - |
| 1.0909 | 10800 | 0.0015 | - |
| 1.0960 | 10850 | 0.0006 | - |
| 1.1010 | 10900 | 0.0103 | - |
| 1.1061 | 10950 | 0.001 | - |
| 1.1111 | 11000 | 0.0011 | - |
| 1.1162 | 11050 | 0.0009 | - |
| 1.1212 | 11100 | 0.0014 | - |
| 1.1263 | 11150 | 0.0011 | - |
| 1.1313 | 11200 | 0.0007 | - |
| 1.1364 | 11250 | 0.0025 | - |
| 1.1414 | 11300 | 0.0007 | - |
| 1.1465 | 11350 | 0.0007 | - |
| 1.1515 | 11400 | 0.0584 | - |
| 1.1566 | 11450 | 0.0008 | - |
| 1.1616 | 11500 | 0.0007 | - |
| 1.1667 | 11550 | 0.0005 | - |
| 1.1717 | 11600 | 0.0009 | - |
| 1.1768 | 11650 | 0.0005 | - |
| 1.1818 | 11700 | 0.0009 | - |
| 1.1869 | 11750 | 0.0008 | - |
| 1.1919 | 11800 | 0.0009 | - |
| 1.1970 | 11850 | 0.0007 | - |
| 1.2020 | 11900 | 0.0006 | - |
| 1.2071 | 11950 | 0.0006 | - |
| 1.2121 | 12000 | 0.0005 | - |
| 1.2172 | 12050 | 0.0008 | - |
| 1.2222 | 12100 | 0.0006 | - |
| 1.2273 | 12150 | 0.0004 | - |
| 1.2323 | 12200 | 0.0006 | - |
| 1.2374 | 12250 | 0.0005 | - |
| 1.2424 | 12300 | 0.0005 | - |
| 1.2475 | 12350 | 0.001 | - |
| 1.2525 | 12400 | 0.0006 | - |
| 1.2576 | 12450 | 0.0008 | - |
| 1.2626 | 12500 | 0.0004 | - |
| 1.2677 | 12550 | 0.0006 | - |
| 1.2727 | 12600 | 0.001 | - |
| 1.2778 | 12650 | 0.0005 | - |
| 1.2828 | 12700 | 0.0005 | - |
| 1.2879 | 12750 | 0.0006 | - |
| 1.2929 | 12800 | 0.0005 | - |
| 1.2980 | 12850 | 0.0011 | - |
| 1.3030 | 12900 | 0.0011 | - |
| 1.3081 | 12950 | 0.0006 | - |
| 1.3131 | 13000 | 0.0006 | - |
| 1.3182 | 13050 | 0.0006 | - |
| 1.3232 | 13100 | 0.001 | - |
| 1.3283 | 13150 | 0.0008 | - |
| 1.3333 | 13200 | 0.0006 | - |
| 1.3384 | 13250 | 0.0006 | - |
| 1.3434 | 13300 | 0.0006 | - |
| 1.3485 | 13350 | 0.0008 | - |
| 1.3535 | 13400 | 0.001 | - |
| 1.3586 | 13450 | 0.0006 | - |
| 1.3636 | 13500 | 0.001 | - |
| 1.3687 | 13550 | 0.0006 | - |
| 1.3737 | 13600 | 0.0026 | - |
| 1.3788 | 13650 | 0.0005 | - |
| 1.3838 | 13700 | 0.0006 | - |
| 1.3889 | 13750 | 0.0011 | - |
| 1.3939 | 13800 | 0.0006 | - |
| 1.3990 | 13850 | 0.0009 | - |
| 1.4040 | 13900 | 0.0008 | - |
| 1.4091 | 13950 | 0.0014 | - |
| 1.4141 | 14000 | 0.0006 | - |
| 1.4192 | 14050 | 0.0005 | - |
| 1.4242 | 14100 | 0.0012 | - |
| 1.4293 | 14150 | 0.0005 | - |
| 1.4343 | 14200 | 0.0027 | - |
| 1.4394 | 14250 | 0.0004 | - |
| 1.4444 | 14300 | 0.0006 | - |
| 1.4495 | 14350 | 0.001 | - |
| 1.4545 | 14400 | 0.0004 | - |
| 1.4596 | 14450 | 0.0005 | - |
| 1.4646 | 14500 | 0.0004 | - |
| 1.4697 | 14550 | 0.0005 | - |
| 1.4747 | 14600 | 0.0008 | - |
| 1.4798 | 14650 | 0.0004 | - |
| 1.4848 | 14700 | 0.0005 | - |
| 1.4899 | 14750 | 0.0581 | - |
| 1.4949 | 14800 | 0.0005 | - |
| 1.5 | 14850 | 0.001 | - |
| 1.5051 | 14900 | 0.0007 | - |
| 1.5101 | 14950 | 0.0004 | - |
| 1.5152 | 15000 | 0.001 | - |
| 1.5202 | 15050 | 0.0004 | - |
| 1.5253 | 15100 | 0.0009 | - |
| 1.5303 | 15150 | 0.0004 | - |
| 1.5354 | 15200 | 0.0006 | - |
| 1.5404 | 15250 | 0.0007 | - |
| 1.5455 | 15300 | 0.0004 | - |
| 1.5505 | 15350 | 0.0009 | - |
| 1.5556 | 15400 | 0.0005 | - |
| 1.5606 | 15450 | 0.0007 | - |
| 1.5657 | 15500 | 0.0005 | - |
| 1.5707 | 15550 | 0.0005 | - |
| 1.5758 | 15600 | 0.0006 | - |
| 1.5808 | 15650 | 0.0586 | - |
| 1.5859 | 15700 | 0.0005 | - |
| 1.5909 | 15750 | 0.0014 | - |
| 1.5960 | 15800 | 0.0005 | - |
| 1.6010 | 15850 | 0.0007 | - |
| 1.6061 | 15900 | 0.0006 | - |
| 1.6111 | 15950 | 0.0011 | - |
| 1.6162 | 16000 | 0.0005 | - |
| 1.6212 | 16050 | 0.0007 | - |
| 1.6263 | 16100 | 0.0008 | - |
| 1.6313 | 16150 | 0.0005 | - |
| 1.6364 | 16200 | 0.0003 | - |
| 1.6414 | 16250 | 0.0004 | - |
| 1.6465 | 16300 | 0.0003 | - |
| 1.6515 | 16350 | 0.0004 | - |
| 1.6566 | 16400 | 0.0006 | - |
| 1.6616 | 16450 | 0.0572 | - |
| 1.6667 | 16500 | 0.0004 | - |
| 1.6717 | 16550 | 0.0005 | - |
| 1.6768 | 16600 | 0.0004 | - |
| 1.6818 | 16650 | 0.0007 | - |
| 1.6869 | 16700 | 0.0011 | - |
| 1.6919 | 16750 | 0.0007 | - |
| 1.6970 | 16800 | 0.0568 | - |
| 1.7020 | 16850 | 0.0007 | - |
| 1.7071 | 16900 | 0.0005 | - |
| 1.7121 | 16950 | 0.0584 | - |
| 1.7172 | 17000 | 0.0004 | - |
| 1.7222 | 17050 | 0.0004 | - |
| 1.7273 | 17100 | 0.0265 | - |
| 1.7323 | 17150 | 0.0006 | - |
| 1.7374 | 17200 | 0.0009 | - |
| 1.7424 | 17250 | 0.0005 | - |
| 1.7475 | 17300 | 0.0011 | - |
| 1.7525 | 17350 | 0.0005 | - |
| 1.7576 | 17400 | 0.0004 | - |
| 1.7626 | 17450 | 0.0007 | - |
| 1.7677 | 17500 | 0.0007 | - |
| 1.7727 | 17550 | 0.0003 | - |
| 1.7778 | 17600 | 0.0005 | - |
| 1.7828 | 17650 | 0.0003 | - |
| 1.7879 | 17700 | 0.0003 | - |
| 1.7929 | 17750 | 0.0003 | - |
| 1.7980 | 17800 | 0.0007 | - |
| 1.8030 | 17850 | 0.0577 | - |
| 1.8081 | 17900 | 0.0004 | - |
| 1.8131 | 17950 | 0.0005 | - |
| 1.8182 | 18000 | 0.0004 | - |
| 1.8232 | 18050 | 0.0004 | - |
| 1.8283 | 18100 | 0.0004 | - |
| 1.8333 | 18150 | 0.0004 | - |
| 1.8384 | 18200 | 0.0003 | - |
| 1.8434 | 18250 | 0.0005 | - |
| 1.8485 | 18300 | 0.0004 | - |
| 1.8535 | 18350 | 0.0004 | - |
| 1.8586 | 18400 | 0.0005 | - |
| 1.8636 | 18450 | 0.0004 | - |
| 1.8687 | 18500 | 0.0003 | - |
| 1.8737 | 18550 | 0.0003 | - |
| 1.8788 | 18600 | 0.0007 | - |
| 1.8838 | 18650 | 0.0586 | - |
| 1.8889 | 18700 | 0.0003 | - |
| 1.8939 | 18750 | 0.0004 | - |
| 1.8990 | 18800 | 0.0005 | - |
| 1.9040 | 18850 | 0.0004 | - |
| 1.9091 | 18900 | 0.0006 | - |
| 1.9141 | 18950 | 0.0004 | - |
| 1.9192 | 19000 | 0.0004 | - |
| 1.9242 | 19050 | 0.0004 | - |
| 1.9293 | 19100 | 0.0005 | - |
| 1.9343 | 19150 | 0.0003 | - |
| 1.9394 | 19200 | 0.0003 | - |
| 1.9444 | 19250 | 0.0003 | - |
| 1.9495 | 19300 | 0.0545 | - |
| 1.9545 | 19350 | 0.0004 | - |
| 1.9596 | 19400 | 0.0005 | - |
| 1.9646 | 19450 | 0.0004 | - |
| 1.9697 | 19500 | 0.0004 | - |
| 1.9747 | 19550 | 0.0004 | - |
| 1.9798 | 19600 | 0.0004 | - |
| 1.9848 | 19650 | 0.0045 | - |
| 1.9899 | 19700 | 0.0004 | - |
| 1.9949 | 19750 | 0.0005 | - |
| **2.0** | **19800** | **0.0006** | **0.0024** |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 2.7.0
- Transformers: 4.40.2
- PyTorch: 2.2.1+cu121
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"BEAR"
] |
EleutherAI/pythia-6.9b-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T20:16:56 | 2023-03-29T18:48:58 | 762 | 8 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-6.9B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-6.9B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-6.9B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-6.9B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-6.9B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-6.9B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-6.9B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-6.9B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-6.9B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
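As an illustration of how the renamed `stepN` checkpoints described above can be used for training-dynamics studies, the sketch below loads a Pythia model at a few intermediate revisions. The repository id and the particular steps chosen here are illustrative; check the branches available on the Hub for the model you are studying.
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

repo = "EleutherAI/pythia-70m-deduped"  # small model keeps the example lightweight
tokenizer = AutoTokenizer.from_pretrained(repo)

# Branch names follow the renamed `stepN` convention; `step143000` matches `main`.
for revision in ["step1000", "step71000", "step143000"]:
    model = GPTNeoXForCausalLM.from_pretrained(repo, revision=revision)
    # ... run per-checkpoint analysis here (e.g. perplexity on a fixed probe set) ...
    del model  # release memory before loading the next checkpoint
```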
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
apple/DCLM-7B | apple | null | [
"transformers",
"safetensors",
"openlm",
"arxiv:2406.11794",
"license:apple-ascl",
"endpoints_compatible",
"region:us"
] | 2024-07-11T17:44:35 | 2024-07-26T03:40:38 | 759 | 835 | ---
license: apple-ascl
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/63118add64939fabc0108b28/BB42g4V8HTxb5dR4tcy8A.png" alt="DCLM Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Model Card for DCLM-Baseline-7B
DCLM-Baseline-7B is a 7 billion parameter language model trained on the DCLM-Baseline dataset, which was curated as part of the DataComp for Language Models (DCLM) benchmark. This model is designed to showcase the effectiveness of systematic data curation techniques for improving language model performance.
## Model Details
| Size | Training Tokens | Layers | Hidden Size | Attention Heads | Context Length |
|------|-----------------|--------|-------------|-----------------|----------------|
| 7B | 2.5T | 32 | 4096 | 32 | 2048 |
### Model Description
- **Developed by:** DataComp for Language Models (DCLM) Team
- **Model type:** Decoder-only Transformer language model
- **Language(s):** English (primarily)
- **License:** Apple Sample Code License
- **Contact:** [email protected]
- **Date:** June 2024
### Model Sources
- **Repository:** https://github.com/mlfoundations/dclm
- **Dataset:** https://huggingface.co/datasets/mlfoundations/dclm-baseline-1.0
- **Paper:** [DataComp-LM: In search of the next generation of training sets for language models](https://arxiv.org/abs/2406.11794)
## Using Model
First install open_lm:
```bash
pip install git+https://github.com/mlfoundations/open_lm.git
```
Then:
```python
from open_lm.hf import *
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("apple/DCLM-Baseline-7B")
model = AutoModelForCausalLM.from_pretrained("apple/DCLM-Baseline-7B")
inputs = tokenizer(["Machine learning is"], return_tensors="pt")
gen_kwargs = {"max_new_tokens": 50, "top_p": 0.8, "temperature": 0.8, "do_sample": True, "repetition_penalty": 1.1}
output = model.generate(inputs['input_ids'], **gen_kwargs)
output = tokenizer.decode(output[0].tolist(), skip_special_tokens=True)
print(output)
```
### Training Details
The model was trained using the following setup:
- **Architecture:** Decoder-only Transformer
- **Framework:** PyTorch with OpenLM
- **Optimizer:** AdamW
- **Learning Rate:** 2e-3 (peak)
- **Weight Decay:** 0.05
- **Batch Size:** 2048 sequences
- **Sequence Length:** 2048 tokens
- **Total Training Tokens:** 2.5T
- **Hardware:** Trained on H100 GPUs
For more detailed training information, please refer to Section 3.4 and Appendix F of the DCLM paper.
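As a quick back-of-the-envelope derived only from the numbers listed above (not a figure from the paper): each optimizer step covers 2048 sequences × 2048 tokens ≈ 4.2M tokens, so 2.5T training tokens corresponds to roughly 600K optimizer steps.
```python
# Rough step-count estimate from the training setup listed above.
sequences_per_batch = 2048
tokens_per_sequence = 2048
tokens_per_step = sequences_per_batch * tokens_per_sequence  # 4,194,304 tokens

total_training_tokens = 2.5e12
approx_steps = total_training_tokens / tokens_per_step
print(f"tokens per step: {tokens_per_step:,}")
print(f"approximate optimizer steps: {approx_steps:,.0f}")  # ~596,000
```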
To ensure our trained model is broadly useful, including for math and coding tasks, we combine our 3.8T [DCLM-BASELINE](https://huggingface.co/datasets/mlfoundations/dclm-baseline-1.0) with the [StarCoder](https://huggingface.co/datasets/bigcode/starcoderdata) and [ProofPile2](https://huggingface.co/datasets/EleutherAI/proof-pile-2) data to arrive at a 4.1T token dataset.
## Evaluation
Here are the evaluation results for DCLM-Baseline-7B on various tasks (using [llm-foundry](https://github.com/mosaicml/llm-foundry) eval suite)
| Task | Score |
|------|-------|
| MMLU (zero-shot) | 0.5766 |
| MMLU (few-shot) | 0.6372 |
| HellaSwag (zero-shot) | 0.7987 |
| HellaSwag | 0.8043 |
| Jeopardy | 0.4745 |
| TriviaQA | 0.5270 |
| GSM8K (CoT) | 0.0250 |
| AGI Eval SAT Math (CoT) | 0.0136 |
| AQuA (CoT) | 0.0490 |
| SVAMP (CoT) | 0.4900 |
| BigBench QA Wikidata | 0.7120 |
| ARC Easy | 0.8220 |
| ARC Challenge | 0.5990 |
| BigBench Misconceptions | 0.6986 |
| COPA | 0.8500 |
| SIQA | 0.8291 |
| CommonsenseQA | 0.8018 |
| PIQA | 0.8128 |
| OpenBookQA | 0.4540 |
| BigBench Novel Concepts | 0.7188 |
| BigBench Strange Stories | 0.7586 |
| BigBench Strategy QA | 0.6173 |
| LAMBADA | 0.8220 |
| Winograd | 0.8828 |
| Winogrande | 0.7269 |
| BigBench Conlang Translation | 0.0244 |
| BigBench Language Identification | 0.5219 |
| BigBench Conceptual Combinations | 0.6990 |
| BigBench Elementary Math QA | 0.3431 |
| BigBench Dyck Languages | 0.4930 |
| AGI Eval LSAT AR | 0.2435 |
| BigBench CS Algorithms | 0.6121 |
| BigBench Logical Deduction | 0.3620 |
| BigBench Operators | 0.4857 |
| BigBench Repeat Copy Logic | 0.4063 |
| Simple Arithmetic (no spaces) | 0.2940 |
| Simple Arithmetic (with spaces) | 0.3110 |
| MathQA | 0.3098 |
| LogiQA | 0.4132 |
| PubMedQA | 0.7060 |
| SQuAD | 0.5856 |
| AGI Eval LSAT RC | 0.6716 |
| AGI Eval LSAT LR | 0.5392 |
| CoQA | 0.4074 |
| BigBench Understanding Fables | 0.6825 |
| BoolQ | 0.8343 |
| AGI Eval SAT EN | 0.7670 |
| Winogender MC (Female) | 0.6000 |
| Winogender MC (Male) | 0.5500 |
| Enterprise PII Classification | 0.7676 |
| BBQ | 0.6912 |
| GPQA Main | 0.2612 |
| GPQA Diamond | 0.2475 |
Note: All scores are presented as decimal values between 0 and 1, representing the proportion of correct answers or the model's performance on each task.
## Comparison
Below are comparisons of this model with other models in the 7B regime.
| Model | Params | Tokens | Open dataset? | CORE | MMLU | EXTENDED |
|---------------|--------|--------|---------------|----------|----------|----------|
| **Open weights, closed datasets** | | | | | | |
| Llama2 | 7B | 2T | ❌ | 49.2 | 45.8 | 34.1 |
| DeepSeek | 7B | 2T | ❌ | 50.7 | 48.5 | 35.3 |
| Mistral-0.3 | 7B | ? | ❌ | 57.0 | 62.7 | 45.1 |
| QWEN-2 | 7B | ? | ❌ | 57.5 | **71.9** | 50.5 |
| Llama3 | 8B | 15T | ❌ | 57.6 | 66.2 | 46.3 |
| Gemma | 8B | 6T | ❌ | 57.8 | 64.3 | 44.6 |
| Phi-3 | 7B | ? | ❌ | **61.0** | 69.9 | **57.9** |
| **Open weights, open datasets** | | | | | | |
| Falcon | 7B | 1T | ✅ | 44.1 | 27.4 | 25.1 |
| OLMo-1.7 | 7B | 2.1T | ✅ | 47.0 | 54.0 | 34.2 |
| MAP-Neo | 7B | 4.5T | ✅ | **50.2** | **57.1** | **40.4** |
| **DCLM-7B** | 7B | 2.5T | ✅ | **56.1** | **63.7** | **43.6** |
## Limitations and Biases
While DCLM-Baseline-7B demonstrates strong performance across a range of tasks, it's important to note:
1. The model may exhibit biases present in its training data, which is derived from web crawl data.
2. It has not undergone specific alignment or safety fine-tuning, so outputs should be used with caution.
3. Performance on tasks not included in the evaluation suite may vary.
4. The model's knowledge is limited to its training data cutoff date.
## Ethical Considerations
Users should be aware that this model, like all large language models, can potentially generate harmful or biased content. It should not be used for making decisions about individuals or in sensitive applications without appropriate safeguards and human oversight.
## Citation
If you use this model in your research, please cite:
```
@article{Li2024DataCompLM,
title={DataComp-LM: In search of the next generation of training sets for language models},
author={Jeffrey Li and Alex Fang and Georgios Smyrnis and Maor Ivgi and Matt Jordan and Samir Gadre and Hritik Bansal and Etash Guha and Sedrick Keh and Kushal Arora and [... full author list]},
journal={arXiv preprint arXiv:2406.11794},
year={2024}
}
```
| [
"TRANSLATION"
] | [
"PUBMEDQA"
] |
arkohut/jina-embeddings-v3 | arkohut | feature-extraction | [
"transformers",
"safetensors",
"feature-extraction",
"sentence-similarity",
"mteb",
"sentence-transformers",
"custom_code",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:2409.10173",
"license:cc-by-nc-4.0",
"model-index",
"region:us"
] | 2024-10-23T15:21:29 | 2024-10-23T15:26:53 | 749 | 3 | ---
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- "no"
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
library_name: transformers
license: cc-by-nc-4.0
tags:
- feature-extraction
- sentence-similarity
- mteb
- sentence-transformers
inference: false
model-index:
- name: jina-embeddings-v3
results:
- task:
type: STS
dataset:
name: MTEB AFQMC (default)
type: C-MTEB/AFQMC
config: default
split: validation
revision: b44c3b011063adb25877c13823db83bb193913c4
metrics:
- type: cosine_pearson
value: 41.74237700998808
- type: cosine_spearman
value: 43.4726782647566
- type: euclidean_pearson
value: 42.244585459479964
- type: euclidean_spearman
value: 43.525070045169606
- type: main_score
value: 43.4726782647566
- type: manhattan_pearson
value: 42.04616728224863
- type: manhattan_spearman
value: 43.308828270754645
- type: pearson
value: 41.74237700998808
- type: spearman
value: 43.4726782647566
- task:
type: Retrieval
dataset:
name: MTEB ArguAna-PL (default)
type: clarin-knext/arguana-pl
config: default
split: test
revision: 63fc86750af76253e8c760fc9e534bbf24d260a2
metrics:
- type: main_score
value: 50.117999999999995
- type: map_at_1
value: 24.253
- type: map_at_10
value: 40.725
- type: map_at_100
value: 41.699999999999996
- type: map_at_1000
value: 41.707
- type: map_at_20
value: 41.467999999999996
- type: map_at_3
value: 35.467
- type: map_at_5
value: 38.291
- type: mrr_at_1
value: 24.751066856330013
- type: mrr_at_10
value: 40.91063808169072
- type: mrr_at_100
value: 41.885497923928675
- type: mrr_at_1000
value: 41.89301098419842
- type: mrr_at_20
value: 41.653552355442514
- type: mrr_at_3
value: 35.656709340919775
- type: mrr_at_5
value: 38.466097676623946
- type: nauc_map_at_1000_diff1
value: 7.503000359807567
- type: nauc_map_at_1000_max
value: -11.030405164830546
- type: nauc_map_at_1000_std
value: -8.902792782585117
- type: nauc_map_at_100_diff1
value: 7.509899249593199
- type: nauc_map_at_100_max
value: -11.023581259404406
- type: nauc_map_at_100_std
value: -8.892241185067272
- type: nauc_map_at_10_diff1
value: 7.24369711881512
- type: nauc_map_at_10_max
value: -10.810000200433278
- type: nauc_map_at_10_std
value: -8.987230542165776
- type: nauc_map_at_1_diff1
value: 11.37175831832417
- type: nauc_map_at_1_max
value: -13.315221903223055
- type: nauc_map_at_1_std
value: -9.398199605510275
- type: nauc_map_at_20_diff1
value: 7.477364530860648
- type: nauc_map_at_20_max
value: -10.901251218105566
- type: nauc_map_at_20_std
value: -8.868148116405925
- type: nauc_map_at_3_diff1
value: 6.555548802174882
- type: nauc_map_at_3_max
value: -12.247274800542934
- type: nauc_map_at_3_std
value: -9.879475250984811
- type: nauc_map_at_5_diff1
value: 7.426588563355882
- type: nauc_map_at_5_max
value: -11.347695686001805
- type: nauc_map_at_5_std
value: -9.34441892203972
- type: nauc_mrr_at_1000_diff1
value: 5.99737552143614
- type: nauc_mrr_at_1000_max
value: -11.327205136505727
- type: nauc_mrr_at_1000_std
value: -8.791079115519503
- type: nauc_mrr_at_100_diff1
value: 6.004622525255784
- type: nauc_mrr_at_100_max
value: -11.320336759899723
- type: nauc_mrr_at_100_std
value: -8.780602249831777
- type: nauc_mrr_at_10_diff1
value: 5.783623516930227
- type: nauc_mrr_at_10_max
value: -11.095971693467078
- type: nauc_mrr_at_10_std
value: -8.877242032013582
- type: nauc_mrr_at_1_diff1
value: 9.694937537703797
- type: nauc_mrr_at_1_max
value: -12.531905083727912
- type: nauc_mrr_at_1_std
value: -8.903992940100146
- type: nauc_mrr_at_20_diff1
value: 5.984841206233873
- type: nauc_mrr_at_20_max
value: -11.195236951048969
- type: nauc_mrr_at_20_std
value: -8.757266039186018
- type: nauc_mrr_at_3_diff1
value: 5.114333824261379
- type: nauc_mrr_at_3_max
value: -12.64809799843464
- type: nauc_mrr_at_3_std
value: -9.791146138025184
- type: nauc_mrr_at_5_diff1
value: 5.88941606224512
- type: nauc_mrr_at_5_max
value: -11.763903418071918
- type: nauc_mrr_at_5_std
value: -9.279175712709446
- type: nauc_ndcg_at_1000_diff1
value: 7.076950652226086
- type: nauc_ndcg_at_1000_max
value: -10.386482092087371
- type: nauc_ndcg_at_1000_std
value: -8.309190917074046
- type: nauc_ndcg_at_100_diff1
value: 7.2329220284865245
- type: nauc_ndcg_at_100_max
value: -10.208048403220337
- type: nauc_ndcg_at_100_std
value: -7.997975874274613
- type: nauc_ndcg_at_10_diff1
value: 6.065391100006953
- type: nauc_ndcg_at_10_max
value: -9.046164377601153
- type: nauc_ndcg_at_10_std
value: -8.34724889697153
- type: nauc_ndcg_at_1_diff1
value: 11.37175831832417
- type: nauc_ndcg_at_1_max
value: -13.315221903223055
- type: nauc_ndcg_at_1_std
value: -9.398199605510275
- type: nauc_ndcg_at_20_diff1
value: 6.949389989202601
- type: nauc_ndcg_at_20_max
value: -9.35740451760307
- type: nauc_ndcg_at_20_std
value: -7.761295171828212
- type: nauc_ndcg_at_3_diff1
value: 5.051471796151364
- type: nauc_ndcg_at_3_max
value: -12.158763333711653
- type: nauc_ndcg_at_3_std
value: -10.078902544421926
- type: nauc_ndcg_at_5_diff1
value: 6.527454512611454
- type: nauc_ndcg_at_5_max
value: -10.525118233848586
- type: nauc_ndcg_at_5_std
value: -9.120055125584031
- type: nauc_precision_at_1000_diff1
value: -10.6495668199151
- type: nauc_precision_at_1000_max
value: 12.070656425217841
- type: nauc_precision_at_1000_std
value: 55.844551709649004
- type: nauc_precision_at_100_diff1
value: 19.206967129266285
- type: nauc_precision_at_100_max
value: 16.296851020813456
- type: nauc_precision_at_100_std
value: 45.60378984257811
- type: nauc_precision_at_10_diff1
value: 0.6490335354304879
- type: nauc_precision_at_10_max
value: 0.5757198255366447
- type: nauc_precision_at_10_std
value: -4.875847131691451
- type: nauc_precision_at_1_diff1
value: 11.37175831832417
- type: nauc_precision_at_1_max
value: -13.315221903223055
- type: nauc_precision_at_1_std
value: -9.398199605510275
- type: nauc_precision_at_20_diff1
value: 4.899369866929203
- type: nauc_precision_at_20_max
value: 5.988537297189552
- type: nauc_precision_at_20_std
value: 4.830900387582837
- type: nauc_precision_at_3_diff1
value: 0.8791156910997744
- type: nauc_precision_at_3_max
value: -11.983373635905993
- type: nauc_precision_at_3_std
value: -10.646185111581257
- type: nauc_precision_at_5_diff1
value: 3.9314486166548432
- type: nauc_precision_at_5_max
value: -7.798591396895839
- type: nauc_precision_at_5_std
value: -8.293043407234125
- type: nauc_recall_at_1000_diff1
value: -10.649566819918673
- type: nauc_recall_at_1000_max
value: 12.070656425214647
- type: nauc_recall_at_1000_std
value: 55.84455170965023
- type: nauc_recall_at_100_diff1
value: 19.206967129265127
- type: nauc_recall_at_100_max
value: 16.296851020813722
- type: nauc_recall_at_100_std
value: 45.60378984257728
- type: nauc_recall_at_10_diff1
value: 0.6490335354304176
- type: nauc_recall_at_10_max
value: 0.5757198255366095
- type: nauc_recall_at_10_std
value: -4.875847131691468
- type: nauc_recall_at_1_diff1
value: 11.37175831832417
- type: nauc_recall_at_1_max
value: -13.315221903223055
- type: nauc_recall_at_1_std
value: -9.398199605510275
- type: nauc_recall_at_20_diff1
value: 4.899369866929402
- type: nauc_recall_at_20_max
value: 5.98853729718968
- type: nauc_recall_at_20_std
value: 4.830900387582967
- type: nauc_recall_at_3_diff1
value: 0.8791156910997652
- type: nauc_recall_at_3_max
value: -11.983373635905997
- type: nauc_recall_at_3_std
value: -10.64618511158124
- type: nauc_recall_at_5_diff1
value: 3.9314486166548472
- type: nauc_recall_at_5_max
value: -7.7985913968958585
- type: nauc_recall_at_5_std
value: -8.293043407234132
- type: ndcg_at_1
value: 24.253
- type: ndcg_at_10
value: 50.117999999999995
- type: ndcg_at_100
value: 54.291999999999994
- type: ndcg_at_1000
value: 54.44799999999999
- type: ndcg_at_20
value: 52.771
- type: ndcg_at_3
value: 39.296
- type: ndcg_at_5
value: 44.373000000000005
- type: precision_at_1
value: 24.253
- type: precision_at_10
value: 8.016
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.527
- type: precision_at_3
value: 16.808999999999997
- type: precision_at_5
value: 12.546
- type: recall_at_1
value: 24.253
- type: recall_at_10
value: 80.156
- type: recall_at_100
value: 98.43499999999999
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_20
value: 90.54100000000001
- type: recall_at_3
value: 50.427
- type: recall_at_5
value: 62.731
- task:
type: Retrieval
dataset:
name: MTEB DBPedia-PL (default)
type: clarin-knext/dbpedia-pl
config: default
split: test
revision: 76afe41d9af165cc40999fcaa92312b8b012064a
metrics:
- type: main_score
value: 34.827000000000005
- type: map_at_1
value: 7.049999999999999
- type: map_at_10
value: 14.982999999999999
- type: map_at_100
value: 20.816000000000003
- type: map_at_1000
value: 22.33
- type: map_at_20
value: 17.272000000000002
- type: map_at_3
value: 10.661
- type: map_at_5
value: 12.498
- type: mrr_at_1
value: 57.25
- type: mrr_at_10
value: 65.81934523809524
- type: mrr_at_100
value: 66.2564203928212
- type: mrr_at_1000
value: 66.27993662923856
- type: mrr_at_20
value: 66.0732139130649
- type: mrr_at_3
value: 64.08333333333333
- type: mrr_at_5
value: 65.27083333333333
- type: nauc_map_at_1000_diff1
value: 16.41780871174038
- type: nauc_map_at_1000_max
value: 30.193946325654654
- type: nauc_map_at_1000_std
value: 31.46095497039037
- type: nauc_map_at_100_diff1
value: 18.57903165498531
- type: nauc_map_at_100_max
value: 29.541476938623262
- type: nauc_map_at_100_std
value: 28.228604103301052
- type: nauc_map_at_10_diff1
value: 24.109434489748946
- type: nauc_map_at_10_max
value: 21.475954208048968
- type: nauc_map_at_10_std
value: 9.964464537806988
- type: nauc_map_at_1_diff1
value: 38.67437644802124
- type: nauc_map_at_1_max
value: 14.52136658726491
- type: nauc_map_at_1_std
value: -2.8981666782088755
- type: nauc_map_at_20_diff1
value: 21.42547228801935
- type: nauc_map_at_20_max
value: 25.04510402960458
- type: nauc_map_at_20_std
value: 16.533079346431155
- type: nauc_map_at_3_diff1
value: 26.63648858245477
- type: nauc_map_at_3_max
value: 13.632235789780415
- type: nauc_map_at_3_std
value: -0.40129174577700716
- type: nauc_map_at_5_diff1
value: 24.513861031197933
- type: nauc_map_at_5_max
value: 16.599888813946688
- type: nauc_map_at_5_std
value: 3.4448514739556346
- type: nauc_mrr_at_1000_diff1
value: 36.57353464537154
- type: nauc_mrr_at_1000_max
value: 55.34763483979515
- type: nauc_mrr_at_1000_std
value: 40.3722796438533
- type: nauc_mrr_at_100_diff1
value: 36.555989566513134
- type: nauc_mrr_at_100_max
value: 55.347805216808396
- type: nauc_mrr_at_100_std
value: 40.38465945075711
- type: nauc_mrr_at_10_diff1
value: 36.771572999261984
- type: nauc_mrr_at_10_max
value: 55.41239897909165
- type: nauc_mrr_at_10_std
value: 40.52058934624793
- type: nauc_mrr_at_1_diff1
value: 38.2472828531032
- type: nauc_mrr_at_1_max
value: 51.528473828685705
- type: nauc_mrr_at_1_std
value: 33.03676467942882
- type: nauc_mrr_at_20_diff1
value: 36.642602571889036
- type: nauc_mrr_at_20_max
value: 55.3763342076553
- type: nauc_mrr_at_20_std
value: 40.41520090500838
- type: nauc_mrr_at_3_diff1
value: 36.79451847426628
- type: nauc_mrr_at_3_max
value: 54.59778581826193
- type: nauc_mrr_at_3_std
value: 39.48392075873095
- type: nauc_mrr_at_5_diff1
value: 36.92150807529304
- type: nauc_mrr_at_5_max
value: 55.03553978718272
- type: nauc_mrr_at_5_std
value: 40.20147745489917
- type: nauc_ndcg_at_1000_diff1
value: 21.843092744321268
- type: nauc_ndcg_at_1000_max
value: 44.93275990394279
- type: nauc_ndcg_at_1000_std
value: 47.09186225236347
- type: nauc_ndcg_at_100_diff1
value: 25.180282568979095
- type: nauc_ndcg_at_100_max
value: 41.737709709508394
- type: nauc_ndcg_at_100_std
value: 38.80950644139446
- type: nauc_ndcg_at_10_diff1
value: 24.108368037214046
- type: nauc_ndcg_at_10_max
value: 41.29298370689967
- type: nauc_ndcg_at_10_std
value: 35.06450769738732
- type: nauc_ndcg_at_1_diff1
value: 35.51010679525079
- type: nauc_ndcg_at_1_max
value: 42.40790024212412
- type: nauc_ndcg_at_1_std
value: 26.696412036243157
- type: nauc_ndcg_at_20_diff1
value: 23.909989673256195
- type: nauc_ndcg_at_20_max
value: 39.78444647091927
- type: nauc_ndcg_at_20_std
value: 33.39544470364529
- type: nauc_ndcg_at_3_diff1
value: 22.50484297956035
- type: nauc_ndcg_at_3_max
value: 39.14551926034168
- type: nauc_ndcg_at_3_std
value: 30.330135925392014
- type: nauc_ndcg_at_5_diff1
value: 21.7798872028265
- type: nauc_ndcg_at_5_max
value: 40.23856975248015
- type: nauc_ndcg_at_5_std
value: 32.438381067440396
- type: nauc_precision_at_1000_diff1
value: -21.62692442272279
- type: nauc_precision_at_1000_max
value: 0.9689046974430882
- type: nauc_precision_at_1000_std
value: 18.54001058230465
- type: nauc_precision_at_100_diff1
value: -10.132258779856192
- type: nauc_precision_at_100_max
value: 23.74516110444681
- type: nauc_precision_at_100_std
value: 47.03416663319965
- type: nauc_precision_at_10_diff1
value: 1.543656509571949
- type: nauc_precision_at_10_max
value: 36.98864812757555
- type: nauc_precision_at_10_std
value: 46.56427199077426
- type: nauc_precision_at_1_diff1
value: 38.2472828531032
- type: nauc_precision_at_1_max
value: 51.528473828685705
- type: nauc_precision_at_1_std
value: 33.03676467942882
- type: nauc_precision_at_20_diff1
value: -4.612864872734335
- type: nauc_precision_at_20_max
value: 34.03565449182125
- type: nauc_precision_at_20_std
value: 48.880727648349534
- type: nauc_precision_at_3_diff1
value: 6.360850444467829
- type: nauc_precision_at_3_max
value: 36.25816942368427
- type: nauc_precision_at_3_std
value: 34.48882647419187
- type: nauc_precision_at_5_diff1
value: 2.6445596936740037
- type: nauc_precision_at_5_max
value: 37.174463388899056
- type: nauc_precision_at_5_std
value: 40.25254370626113
- type: nauc_recall_at_1000_diff1
value: 13.041227176748077
- type: nauc_recall_at_1000_max
value: 39.722336427072094
- type: nauc_recall_at_1000_std
value: 52.04032890059214
- type: nauc_recall_at_100_diff1
value: 18.286096899139153
- type: nauc_recall_at_100_max
value: 34.072389201930314
- type: nauc_recall_at_100_std
value: 37.73637623416653
- type: nauc_recall_at_10_diff1
value: 22.35560419280504
- type: nauc_recall_at_10_max
value: 19.727247199595197
- type: nauc_recall_at_10_std
value: 8.58498575109203
- type: nauc_recall_at_1_diff1
value: 38.67437644802124
- type: nauc_recall_at_1_max
value: 14.52136658726491
- type: nauc_recall_at_1_std
value: -2.8981666782088755
- type: nauc_recall_at_20_diff1
value: 19.026320886902916
- type: nauc_recall_at_20_max
value: 22.753562309469867
- type: nauc_recall_at_20_std
value: 14.89994263882445
- type: nauc_recall_at_3_diff1
value: 23.428129702129684
- type: nauc_recall_at_3_max
value: 10.549153954790542
- type: nauc_recall_at_3_std
value: -1.7590608997055206
- type: nauc_recall_at_5_diff1
value: 21.27448645803921
- type: nauc_recall_at_5_max
value: 13.620279707461677
- type: nauc_recall_at_5_std
value: 2.0577962208292675
- type: ndcg_at_1
value: 46.75
- type: ndcg_at_10
value: 34.827000000000005
- type: ndcg_at_100
value: 38.157999999999994
- type: ndcg_at_1000
value: 44.816
- type: ndcg_at_20
value: 34.152
- type: ndcg_at_3
value: 39.009
- type: ndcg_at_5
value: 36.826
- type: precision_at_1
value: 57.25
- type: precision_at_10
value: 27.575
- type: precision_at_100
value: 8.84
- type: precision_at_1000
value: 1.949
- type: precision_at_20
value: 20.724999999999998
- type: precision_at_3
value: 41.167
- type: precision_at_5
value: 35.199999999999996
- type: recall_at_1
value: 7.049999999999999
- type: recall_at_10
value: 19.817999999999998
- type: recall_at_100
value: 42.559999999999995
- type: recall_at_1000
value: 63.744
- type: recall_at_20
value: 25.968000000000004
- type: recall_at_3
value: 11.959
- type: recall_at_5
value: 14.939
- task:
type: Retrieval
dataset:
name: MTEB FiQA-PL (default)
type: clarin-knext/fiqa-pl
config: default
split: test
revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e
metrics:
- type: main_score
value: 38.828
- type: map_at_1
value: 19.126
- type: map_at_10
value: 31.002000000000002
- type: map_at_100
value: 32.736
- type: map_at_1000
value: 32.933
- type: map_at_20
value: 31.894
- type: map_at_3
value: 26.583000000000002
- type: map_at_5
value: 28.904000000000003
- type: mrr_at_1
value: 37.808641975308646
- type: mrr_at_10
value: 46.36745541838134
- type: mrr_at_100
value: 47.14140915794908
- type: mrr_at_1000
value: 47.190701435388846
- type: mrr_at_20
value: 46.81387776440309
- type: mrr_at_3
value: 43.750000000000014
- type: mrr_at_5
value: 45.23919753086418
- type: nauc_map_at_1000_diff1
value: 38.5532285881503
- type: nauc_map_at_1000_max
value: 34.44383884813453
- type: nauc_map_at_1000_std
value: -1.3963497949476722
- type: nauc_map_at_100_diff1
value: 38.49292464176943
- type: nauc_map_at_100_max
value: 34.33752755618645
- type: nauc_map_at_100_std
value: -1.4794032905848582
- type: nauc_map_at_10_diff1
value: 38.26061536370962
- type: nauc_map_at_10_max
value: 33.16977912721411
- type: nauc_map_at_10_std
value: -2.3853370604730393
- type: nauc_map_at_1_diff1
value: 46.288767289528344
- type: nauc_map_at_1_max
value: 25.67706785013364
- type: nauc_map_at_1_std
value: -6.989769609924645
- type: nauc_map_at_20_diff1
value: 38.507270129330685
- type: nauc_map_at_20_max
value: 33.70963328055982
- type: nauc_map_at_20_std
value: -1.9835510011554272
- type: nauc_map_at_3_diff1
value: 39.81061518646884
- type: nauc_map_at_3_max
value: 30.101186374147748
- type: nauc_map_at_3_std
value: -4.027120247237715
- type: nauc_map_at_5_diff1
value: 38.55602589746512
- type: nauc_map_at_5_max
value: 31.515174267015983
- type: nauc_map_at_5_std
value: -3.4064239358570303
- type: nauc_mrr_at_1000_diff1
value: 45.030514454725726
- type: nauc_mrr_at_1000_max
value: 43.878919881666164
- type: nauc_mrr_at_1000_std
value: 2.517594250297626
- type: nauc_mrr_at_100_diff1
value: 45.00868212878687
- type: nauc_mrr_at_100_max
value: 43.87437011120001
- type: nauc_mrr_at_100_std
value: 2.5257874265014966
- type: nauc_mrr_at_10_diff1
value: 44.855044606754056
- type: nauc_mrr_at_10_max
value: 43.946617058785186
- type: nauc_mrr_at_10_std
value: 2.5173751662794044
- type: nauc_mrr_at_1_diff1
value: 49.441510997817346
- type: nauc_mrr_at_1_max
value: 43.08547383044357
- type: nauc_mrr_at_1_std
value: -1.8747770703324347
- type: nauc_mrr_at_20_diff1
value: 45.019880416584215
- type: nauc_mrr_at_20_max
value: 43.85691473662242
- type: nauc_mrr_at_20_std
value: 2.4625487605091303
- type: nauc_mrr_at_3_diff1
value: 45.322041658604036
- type: nauc_mrr_at_3_max
value: 43.95079293074395
- type: nauc_mrr_at_3_std
value: 2.4644274393435737
- type: nauc_mrr_at_5_diff1
value: 44.99461837803437
- type: nauc_mrr_at_5_max
value: 43.97934275090601
- type: nauc_mrr_at_5_std
value: 2.5353091695125096
- type: nauc_ndcg_at_1000_diff1
value: 39.38449023275524
- type: nauc_ndcg_at_1000_max
value: 39.48382767312788
- type: nauc_ndcg_at_1000_std
value: 3.414789408343409
- type: nauc_ndcg_at_100_diff1
value: 38.29675861135578
- type: nauc_ndcg_at_100_max
value: 38.2674786507297
- type: nauc_ndcg_at_100_std
value: 2.7094055381218207
- type: nauc_ndcg_at_10_diff1
value: 38.09514955708717
- type: nauc_ndcg_at_10_max
value: 36.664923238906525
- type: nauc_ndcg_at_10_std
value: 0.6901410544967921
- type: nauc_ndcg_at_1_diff1
value: 49.441510997817346
- type: nauc_ndcg_at_1_max
value: 43.08547383044357
- type: nauc_ndcg_at_1_std
value: -1.8747770703324347
- type: nauc_ndcg_at_20_diff1
value: 38.44967736231759
- type: nauc_ndcg_at_20_max
value: 36.871179313622584
- type: nauc_ndcg_at_20_std
value: 1.157560360065234
- type: nauc_ndcg_at_3_diff1
value: 39.02419271805571
- type: nauc_ndcg_at_3_max
value: 37.447669442586324
- type: nauc_ndcg_at_3_std
value: 0.41502589779297794
- type: nauc_ndcg_at_5_diff1
value: 38.10233452742001
- type: nauc_ndcg_at_5_max
value: 35.816381905465676
- type: nauc_ndcg_at_5_std
value: -0.3704499913387088
- type: nauc_precision_at_1000_diff1
value: 2.451267097838658
- type: nauc_precision_at_1000_max
value: 29.116394969085306
- type: nauc_precision_at_1000_std
value: 14.85900786538363
- type: nauc_precision_at_100_diff1
value: 8.10919082251277
- type: nauc_precision_at_100_max
value: 36.28388256191417
- type: nauc_precision_at_100_std
value: 14.830039904317657
- type: nauc_precision_at_10_diff1
value: 15.02446609920477
- type: nauc_precision_at_10_max
value: 41.008463775454054
- type: nauc_precision_at_10_std
value: 10.431403152334486
- type: nauc_precision_at_1_diff1
value: 49.441510997817346
- type: nauc_precision_at_1_max
value: 43.08547383044357
- type: nauc_precision_at_1_std
value: -1.8747770703324347
- type: nauc_precision_at_20_diff1
value: 14.222022201169926
- type: nauc_precision_at_20_max
value: 40.10189643835305
- type: nauc_precision_at_20_std
value: 12.204443815975527
- type: nauc_precision_at_3_diff1
value: 25.41905395341234
- type: nauc_precision_at_3_max
value: 41.56133905339819
- type: nauc_precision_at_3_std
value: 5.575516915590082
- type: nauc_precision_at_5_diff1
value: 20.20081221089351
- type: nauc_precision_at_5_max
value: 40.95218555916681
- type: nauc_precision_at_5_std
value: 7.2040745500708745
- type: nauc_recall_at_1000_diff1
value: 28.021198234033395
- type: nauc_recall_at_1000_max
value: 36.165148684597504
- type: nauc_recall_at_1000_std
value: 28.28852356008973
- type: nauc_recall_at_100_diff1
value: 21.882447802741897
- type: nauc_recall_at_100_max
value: 26.979684607567222
- type: nauc_recall_at_100_std
value: 9.783658817010082
- type: nauc_recall_at_10_diff1
value: 28.493097951178818
- type: nauc_recall_at_10_max
value: 29.40937476550134
- type: nauc_recall_at_10_std
value: 2.7593763576979353
- type: nauc_recall_at_1_diff1
value: 46.288767289528344
- type: nauc_recall_at_1_max
value: 25.67706785013364
- type: nauc_recall_at_1_std
value: -6.989769609924645
- type: nauc_recall_at_20_diff1
value: 27.638381299425234
- type: nauc_recall_at_20_max
value: 27.942035836106328
- type: nauc_recall_at_20_std
value: 3.489835161380808
- type: nauc_recall_at_3_diff1
value: 33.90054781392646
- type: nauc_recall_at_3_max
value: 27.778812533030322
- type: nauc_recall_at_3_std
value: -0.03054068020022706
- type: nauc_recall_at_5_diff1
value: 30.279060732221346
- type: nauc_recall_at_5_max
value: 27.49854749597931
- type: nauc_recall_at_5_std
value: 0.5434664581939099
- type: ndcg_at_1
value: 37.809
- type: ndcg_at_10
value: 38.828
- type: ndcg_at_100
value: 45.218
- type: ndcg_at_1000
value: 48.510999999999996
- type: ndcg_at_20
value: 41.11
- type: ndcg_at_3
value: 34.466
- type: ndcg_at_5
value: 35.843
- type: precision_at_1
value: 37.809
- type: precision_at_10
value: 11.157
- type: precision_at_100
value: 1.762
- type: precision_at_1000
value: 0.233
- type: precision_at_20
value: 6.497
- type: precision_at_3
value: 23.044999999999998
- type: precision_at_5
value: 17.284
- type: recall_at_1
value: 19.126
- type: recall_at_10
value: 46.062
- type: recall_at_100
value: 70.22800000000001
- type: recall_at_1000
value: 89.803
- type: recall_at_20
value: 53.217999999999996
- type: recall_at_3
value: 30.847
- type: recall_at_5
value: 37.11
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA-PL (default)
type: clarin-knext/hotpotqa-pl
config: default
split: test
revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907
metrics:
- type: main_score
value: 60.27
- type: map_at_1
value: 35.199000000000005
- type: map_at_10
value: 51.369
- type: map_at_100
value: 52.212
- type: map_at_1000
value: 52.28
- type: map_at_20
value: 51.864
- type: map_at_3
value: 48.446
- type: map_at_5
value: 50.302
- type: mrr_at_1
value: 70.39837947332883
- type: mrr_at_10
value: 76.8346141067273
- type: mrr_at_100
value: 77.10724392048137
- type: mrr_at_1000
value: 77.12037412892865
- type: mrr_at_20
value: 77.01061532947222
- type: mrr_at_3
value: 75.5908170155299
- type: mrr_at_5
value: 76.39095205941899
- type: nauc_map_at_1000_diff1
value: 24.701387884989117
- type: nauc_map_at_1000_max
value: 23.25553235642178
- type: nauc_map_at_1000_std
value: 7.1803506915661774
- type: nauc_map_at_100_diff1
value: 24.674498622483103
- type: nauc_map_at_100_max
value: 23.234948525052175
- type: nauc_map_at_100_std
value: 7.168677997105447
- type: nauc_map_at_10_diff1
value: 24.676025039755626
- type: nauc_map_at_10_max
value: 23.171971872726964
- type: nauc_map_at_10_std
value: 6.485610909852058
- type: nauc_map_at_1_diff1
value: 68.90178464319715
- type: nauc_map_at_1_max
value: 46.05537868917558
- type: nauc_map_at_1_std
value: 1.7658552480698708
- type: nauc_map_at_20_diff1
value: 24.69297151842494
- type: nauc_map_at_20_max
value: 23.213064691673637
- type: nauc_map_at_20_std
value: 6.9357946556849
- type: nauc_map_at_3_diff1
value: 26.279128947950507
- type: nauc_map_at_3_max
value: 23.929537354117922
- type: nauc_map_at_3_std
value: 4.625061565714759
- type: nauc_map_at_5_diff1
value: 25.04448959482816
- type: nauc_map_at_5_max
value: 23.432012857899338
- type: nauc_map_at_5_std
value: 5.845744681998008
- type: nauc_mrr_at_1000_diff1
value: 66.7503918108276
- type: nauc_mrr_at_1000_max
value: 48.42897342336844
- type: nauc_mrr_at_1000_std
value: 5.3097517971144415
- type: nauc_mrr_at_100_diff1
value: 66.74645215862695
- type: nauc_mrr_at_100_max
value: 48.4368663009989
- type: nauc_mrr_at_100_std
value: 5.322297898555188
- type: nauc_mrr_at_10_diff1
value: 66.69310166180729
- type: nauc_mrr_at_10_max
value: 48.475437698330225
- type: nauc_mrr_at_10_std
value: 5.258183461631702
- type: nauc_mrr_at_1_diff1
value: 68.90178464319715
- type: nauc_mrr_at_1_max
value: 46.05537868917558
- type: nauc_mrr_at_1_std
value: 1.7658552480698708
- type: nauc_mrr_at_20_diff1
value: 66.72000262431975
- type: nauc_mrr_at_20_max
value: 48.45593642981319
- type: nauc_mrr_at_20_std
value: 5.353665929072101
- type: nauc_mrr_at_3_diff1
value: 66.84936676396276
- type: nauc_mrr_at_3_max
value: 48.466611276778295
- type: nauc_mrr_at_3_std
value: 4.485810398557475
- type: nauc_mrr_at_5_diff1
value: 66.62362565394174
- type: nauc_mrr_at_5_max
value: 48.456431835482014
- type: nauc_mrr_at_5_std
value: 5.08482458391903
- type: nauc_ndcg_at_1000_diff1
value: 29.984825173719443
- type: nauc_ndcg_at_1000_max
value: 27.289179238639893
- type: nauc_ndcg_at_1000_std
value: 10.661480455527526
- type: nauc_ndcg_at_100_diff1
value: 29.322074257047877
- type: nauc_ndcg_at_100_max
value: 26.850650276220605
- type: nauc_ndcg_at_100_std
value: 10.599247982501902
- type: nauc_ndcg_at_10_diff1
value: 29.659909113886094
- type: nauc_ndcg_at_10_max
value: 26.836139599331005
- type: nauc_ndcg_at_10_std
value: 8.12844399452719
- type: nauc_ndcg_at_1_diff1
value: 68.90178464319715
- type: nauc_ndcg_at_1_max
value: 46.05537868917558
- type: nauc_ndcg_at_1_std
value: 1.7658552480698708
- type: nauc_ndcg_at_20_diff1
value: 29.510802214854294
- type: nauc_ndcg_at_20_max
value: 26.775562637730722
- type: nauc_ndcg_at_20_std
value: 9.341342661702363
- type: nauc_ndcg_at_3_diff1
value: 32.741885846292966
- type: nauc_ndcg_at_3_max
value: 28.44225108761343
- type: nauc_ndcg_at_3_std
value: 5.204440768465042
- type: nauc_ndcg_at_5_diff1
value: 30.57856348635919
- type: nauc_ndcg_at_5_max
value: 27.475007474301698
- type: nauc_ndcg_at_5_std
value: 6.961546044312487
- type: nauc_precision_at_1000_diff1
value: 0.002113156309413332
- type: nauc_precision_at_1000_max
value: 11.198242419541286
- type: nauc_precision_at_1000_std
value: 28.69676419166541
- type: nauc_precision_at_100_diff1
value: 3.6049575557782627
- type: nauc_precision_at_100_max
value: 12.499173524574791
- type: nauc_precision_at_100_std
value: 23.3755281004721
- type: nauc_precision_at_10_diff1
value: 10.922574784853193
- type: nauc_precision_at_10_max
value: 16.23221529562036
- type: nauc_precision_at_10_std
value: 12.45014808813857
- type: nauc_precision_at_1_diff1
value: 68.90178464319715
- type: nauc_precision_at_1_max
value: 46.05537868917558
- type: nauc_precision_at_1_std
value: 1.7658552480698708
- type: nauc_precision_at_20_diff1
value: 8.840710781302827
- type: nauc_precision_at_20_max
value: 14.804644554205524
- type: nauc_precision_at_20_std
value: 16.245009770815237
- type: nauc_precision_at_3_diff1
value: 19.447291487137573
- type: nauc_precision_at_3_max
value: 21.47123471597057
- type: nauc_precision_at_3_std
value: 6.441862800128802
- type: nauc_precision_at_5_diff1
value: 14.078545719721108
- type: nauc_precision_at_5_max
value: 18.468288046016387
- type: nauc_precision_at_5_std
value: 9.58650641691393
- type: nauc_recall_at_1000_diff1
value: 0.0021131563095336584
- type: nauc_recall_at_1000_max
value: 11.198242419541558
- type: nauc_recall_at_1000_std
value: 28.6967641916655
- type: nauc_recall_at_100_diff1
value: 3.6049575557781393
- type: nauc_recall_at_100_max
value: 12.499173524574765
- type: nauc_recall_at_100_std
value: 23.375528100472074
- type: nauc_recall_at_10_diff1
value: 10.922574784853168
- type: nauc_recall_at_10_max
value: 16.2322152956203
- type: nauc_recall_at_10_std
value: 12.450148088138535
- type: nauc_recall_at_1_diff1
value: 68.90178464319715
- type: nauc_recall_at_1_max
value: 46.05537868917558
- type: nauc_recall_at_1_std
value: 1.7658552480698708
- type: nauc_recall_at_20_diff1
value: 8.840710781302905
- type: nauc_recall_at_20_max
value: 14.804644554205515
- type: nauc_recall_at_20_std
value: 16.245009770815273
- type: nauc_recall_at_3_diff1
value: 19.447291487137498
- type: nauc_recall_at_3_max
value: 21.47123471597054
- type: nauc_recall_at_3_std
value: 6.441862800128763
- type: nauc_recall_at_5_diff1
value: 14.07854571972115
- type: nauc_recall_at_5_max
value: 18.468288046016337
- type: nauc_recall_at_5_std
value: 9.586506416913904
- type: ndcg_at_1
value: 70.39800000000001
- type: ndcg_at_10
value: 60.27
- type: ndcg_at_100
value: 63.400999999999996
- type: ndcg_at_1000
value: 64.847
- type: ndcg_at_20
value: 61.571
- type: ndcg_at_3
value: 55.875
- type: ndcg_at_5
value: 58.36599999999999
- type: precision_at_1
value: 70.39800000000001
- type: precision_at_10
value: 12.46
- type: precision_at_100
value: 1.493
- type: precision_at_1000
value: 0.169
- type: precision_at_20
value: 6.65
- type: precision_at_3
value: 35.062
- type: precision_at_5
value: 23.009
- type: recall_at_1
value: 35.199000000000005
- type: recall_at_10
value: 62.302
- type: recall_at_100
value: 74.666
- type: recall_at_1000
value: 84.355
- type: recall_at_20
value: 66.496
- type: recall_at_3
value: 52.593
- type: recall_at_5
value: 57.522
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO-PL (default)
type: clarin-knext/msmarco-pl
config: default
split: test
revision: 8634c07806d5cce3a6138e260e59b81760a0a640
metrics:
- type: main_score
value: 64.886
- type: map_at_1
value: 1.644
- type: map_at_10
value: 12.24
- type: map_at_100
value: 28.248
- type: map_at_1000
value: 33.506
- type: map_at_20
value: 17.497
- type: map_at_3
value: 4.9399999999999995
- type: map_at_5
value: 8.272
- type: mrr_at_1
value: 83.72093023255815
- type: mrr_at_10
value: 91.08527131782945
- type: mrr_at_100
value: 91.08527131782945
- type: mrr_at_1000
value: 91.08527131782945
- type: mrr_at_20
value: 91.08527131782945
- type: mrr_at_3
value: 91.08527131782945
- type: mrr_at_5
value: 91.08527131782945
- type: nauc_map_at_1000_diff1
value: -36.428271627303424
- type: nauc_map_at_1000_max
value: 44.87615127218638
- type: nauc_map_at_1000_std
value: 67.92696808824724
- type: nauc_map_at_100_diff1
value: -28.11674206786188
- type: nauc_map_at_100_max
value: 36.422779766334955
- type: nauc_map_at_100_std
value: 49.99876313755116
- type: nauc_map_at_10_diff1
value: -5.838593619806058
- type: nauc_map_at_10_max
value: 11.026519190509742
- type: nauc_map_at_10_std
value: 2.5268752263522045
- type: nauc_map_at_1_diff1
value: 17.897907271073016
- type: nauc_map_at_1_max
value: 12.229062762540844
- type: nauc_map_at_1_std
value: -4.088830895573149
- type: nauc_map_at_20_diff1
value: -13.871097716255626
- type: nauc_map_at_20_max
value: 19.291271635609533
- type: nauc_map_at_20_std
value: 16.745335606507826
- type: nauc_map_at_3_diff1
value: 4.425238457033843
- type: nauc_map_at_3_max
value: 4.611864744680824
- type: nauc_map_at_3_std
value: -8.986916608582863
- type: nauc_map_at_5_diff1
value: -6.254849256920095
- type: nauc_map_at_5_max
value: 2.729437079919823
- type: nauc_map_at_5_std
value: -7.235906279913092
- type: nauc_mrr_at_1000_diff1
value: 52.18669104947672
- type: nauc_mrr_at_1000_max
value: 68.26259125411818
- type: nauc_mrr_at_1000_std
value: 56.345086428353575
- type: nauc_mrr_at_100_diff1
value: 52.18669104947672
- type: nauc_mrr_at_100_max
value: 68.26259125411818
- type: nauc_mrr_at_100_std
value: 56.345086428353575
- type: nauc_mrr_at_10_diff1
value: 52.18669104947672
- type: nauc_mrr_at_10_max
value: 68.26259125411818
- type: nauc_mrr_at_10_std
value: 56.345086428353575
- type: nauc_mrr_at_1_diff1
value: 56.55126663944154
- type: nauc_mrr_at_1_max
value: 66.37014285522565
- type: nauc_mrr_at_1_std
value: 53.2508271389779
- type: nauc_mrr_at_20_diff1
value: 52.18669104947672
- type: nauc_mrr_at_20_max
value: 68.26259125411818
- type: nauc_mrr_at_20_std
value: 56.345086428353575
- type: nauc_mrr_at_3_diff1
value: 52.18669104947672
- type: nauc_mrr_at_3_max
value: 68.26259125411818
- type: nauc_mrr_at_3_std
value: 56.345086428353575
- type: nauc_mrr_at_5_diff1
value: 52.18669104947672
- type: nauc_mrr_at_5_max
value: 68.26259125411818
- type: nauc_mrr_at_5_std
value: 56.345086428353575
- type: nauc_ndcg_at_1000_diff1
value: -19.06422926483731
- type: nauc_ndcg_at_1000_max
value: 56.30853514590265
- type: nauc_ndcg_at_1000_std
value: 70.30810947505557
- type: nauc_ndcg_at_100_diff1
value: -25.72587586459692
- type: nauc_ndcg_at_100_max
value: 51.433781241604194
- type: nauc_ndcg_at_100_std
value: 68.37678512652792
- type: nauc_ndcg_at_10_diff1
value: -23.21198108212602
- type: nauc_ndcg_at_10_max
value: 43.5450720846516
- type: nauc_ndcg_at_10_std
value: 48.78307907005605
- type: nauc_ndcg_at_1_diff1
value: 44.00179301267447
- type: nauc_ndcg_at_1_max
value: 48.202370455680395
- type: nauc_ndcg_at_1_std
value: 25.69655992704088
- type: nauc_ndcg_at_20_diff1
value: -33.88168753446507
- type: nauc_ndcg_at_20_max
value: 45.16199742613164
- type: nauc_ndcg_at_20_std
value: 61.87098383164902
- type: nauc_ndcg_at_3_diff1
value: 11.19174449544048
- type: nauc_ndcg_at_3_max
value: 44.34069860560555
- type: nauc_ndcg_at_3_std
value: 27.451258369798115
- type: nauc_ndcg_at_5_diff1
value: -7.186520929432436
- type: nauc_ndcg_at_5_max
value: 43.41869981139378
- type: nauc_ndcg_at_5_std
value: 34.89898115995178
- type: nauc_precision_at_1000_diff1
value: -34.43998154563451
- type: nauc_precision_at_1000_max
value: 29.172655907480372
- type: nauc_precision_at_1000_std
value: 65.15824469614837
- type: nauc_precision_at_100_diff1
value: -37.82409643259692
- type: nauc_precision_at_100_max
value: 38.24986991317909
- type: nauc_precision_at_100_std
value: 72.74768183105327
- type: nauc_precision_at_10_diff1
value: -32.21556182780535
- type: nauc_precision_at_10_max
value: 34.27170432382651
- type: nauc_precision_at_10_std
value: 58.358255004394664
- type: nauc_precision_at_1_diff1
value: 56.55126663944154
- type: nauc_precision_at_1_max
value: 66.37014285522565
- type: nauc_precision_at_1_std
value: 53.2508271389779
- type: nauc_precision_at_20_diff1
value: -40.18751579026395
- type: nauc_precision_at_20_max
value: 33.960783153758896
- type: nauc_precision_at_20_std
value: 65.42918390184195
- type: nauc_precision_at_3_diff1
value: -7.073870209006578
- type: nauc_precision_at_3_max
value: 50.81535269862325
- type: nauc_precision_at_3_std
value: 59.248681565955685
- type: nauc_precision_at_5_diff1
value: -31.136580596983876
- type: nauc_precision_at_5_max
value: 45.88147792380426
- type: nauc_precision_at_5_std
value: 67.46814230928243
- type: nauc_recall_at_1000_diff1
value: -23.15699999594577
- type: nauc_recall_at_1000_max
value: 39.77277799761876
- type: nauc_recall_at_1000_std
value: 60.326168012901114
- type: nauc_recall_at_100_diff1
value: -21.636664823598498
- type: nauc_recall_at_100_max
value: 31.104969346131583
- type: nauc_recall_at_100_std
value: 38.811686891592096
- type: nauc_recall_at_10_diff1
value: -10.542765625053569
- type: nauc_recall_at_10_max
value: 2.043876058107446
- type: nauc_recall_at_10_std
value: -5.578449908984766
- type: nauc_recall_at_1_diff1
value: 17.897907271073016
- type: nauc_recall_at_1_max
value: 12.229062762540844
- type: nauc_recall_at_1_std
value: -4.088830895573149
- type: nauc_recall_at_20_diff1
value: -15.132909355710103
- type: nauc_recall_at_20_max
value: 12.659765287241065
- type: nauc_recall_at_20_std
value: 8.277887800815819
- type: nauc_recall_at_3_diff1
value: -3.1975017812715016
- type: nauc_recall_at_3_max
value: -3.5539857085038538
- type: nauc_recall_at_3_std
value: -14.712102851318118
- type: nauc_recall_at_5_diff1
value: -14.040507717380743
- type: nauc_recall_at_5_max
value: -6.126912150131701
- type: nauc_recall_at_5_std
value: -13.821624015640355
- type: ndcg_at_1
value: 71.318
- type: ndcg_at_10
value: 64.886
- type: ndcg_at_100
value: 53.187
- type: ndcg_at_1000
value: 59.897999999999996
- type: ndcg_at_20
value: 58.96
- type: ndcg_at_3
value: 69.736
- type: ndcg_at_5
value: 70.14099999999999
- type: precision_at_1
value: 83.721
- type: precision_at_10
value: 71.163
- type: precision_at_100
value: 29.465000000000003
- type: precision_at_1000
value: 5.665
- type: precision_at_20
value: 57.791000000000004
- type: precision_at_3
value: 82.171
- type: precision_at_5
value: 81.86
- type: recall_at_1
value: 1.644
- type: recall_at_10
value: 14.238000000000001
- type: recall_at_100
value: 39.831
- type: recall_at_1000
value: 64.057
- type: recall_at_20
value: 21.021
- type: recall_at_3
value: 5.53
- type: recall_at_5
value: 9.623
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus-PL (default)
type: clarin-knext/nfcorpus-pl
config: default
split: test
revision: 9a6f9567fda928260afed2de480d79c98bf0bec0
metrics:
- type: main_score
value: 31.391000000000002
- type: map_at_1
value: 4.163
- type: map_at_10
value: 10.744
- type: map_at_100
value: 14.038999999999998
- type: map_at_1000
value: 15.434999999999999
- type: map_at_20
value: 12.16
- type: map_at_3
value: 7.614999999999999
- type: map_at_5
value: 9.027000000000001
- type: mrr_at_1
value: 39.0092879256966
- type: mrr_at_10
value: 48.69809327239668
- type: mrr_at_100
value: 49.20788148442068
- type: mrr_at_1000
value: 49.25509336494706
- type: mrr_at_20
value: 48.99606551850896
- type: mrr_at_3
value: 46.284829721362236
- type: mrr_at_5
value: 47.77089783281735
- type: nauc_map_at_1000_diff1
value: 22.75421477116417
- type: nauc_map_at_1000_max
value: 49.242283787799046
- type: nauc_map_at_1000_std
value: 29.056888272331832
- type: nauc_map_at_100_diff1
value: 23.585977398585594
- type: nauc_map_at_100_max
value: 48.25845199409498
- type: nauc_map_at_100_std
value: 24.944264511223693
- type: nauc_map_at_10_diff1
value: 27.386613094780255
- type: nauc_map_at_10_max
value: 41.52415346691586
- type: nauc_map_at_10_std
value: 12.93872448563755
- type: nauc_map_at_1_diff1
value: 46.78688143865053
- type: nauc_map_at_1_max
value: 37.20408843995871
- type: nauc_map_at_1_std
value: 4.383444959401098
- type: nauc_map_at_20_diff1
value: 25.590969047740288
- type: nauc_map_at_20_max
value: 44.57109307999418
- type: nauc_map_at_20_std
value: 16.45855141821407
- type: nauc_map_at_3_diff1
value: 36.30017108362863
- type: nauc_map_at_3_max
value: 34.66149613991648
- type: nauc_map_at_3_std
value: 5.67985905078467
- type: nauc_map_at_5_diff1
value: 31.157644795417223
- type: nauc_map_at_5_max
value: 37.274738661636825
- type: nauc_map_at_5_std
value: 8.70088872394168
- type: nauc_mrr_at_1000_diff1
value: 25.638564218157384
- type: nauc_mrr_at_1000_max
value: 57.77788270285353
- type: nauc_mrr_at_1000_std
value: 43.507586592911274
- type: nauc_mrr_at_100_diff1
value: 25.662002580561584
- type: nauc_mrr_at_100_max
value: 57.80578394278584
- type: nauc_mrr_at_100_std
value: 43.543905743986635
- type: nauc_mrr_at_10_diff1
value: 25.426034796339835
- type: nauc_mrr_at_10_max
value: 57.68443186258669
- type: nauc_mrr_at_10_std
value: 43.438009108331215
- type: nauc_mrr_at_1_diff1
value: 26.073028156311075
- type: nauc_mrr_at_1_max
value: 52.11817916720053
- type: nauc_mrr_at_1_std
value: 37.41073893153695
- type: nauc_mrr_at_20_diff1
value: 25.548645553336147
- type: nauc_mrr_at_20_max
value: 57.78552760401915
- type: nauc_mrr_at_20_std
value: 43.521687428822325
- type: nauc_mrr_at_3_diff1
value: 25.72662577397805
- type: nauc_mrr_at_3_max
value: 56.891263536265605
- type: nauc_mrr_at_3_std
value: 41.384872305390104
- type: nauc_mrr_at_5_diff1
value: 25.552211551655386
- type: nauc_mrr_at_5_max
value: 57.976813828353926
- type: nauc_mrr_at_5_std
value: 43.504564461855544
- type: nauc_ndcg_at_1000_diff1
value: 23.456158044182757
- type: nauc_ndcg_at_1000_max
value: 60.05411773552709
- type: nauc_ndcg_at_1000_std
value: 47.857510017262584
- type: nauc_ndcg_at_100_diff1
value: 19.711635700390772
- type: nauc_ndcg_at_100_max
value: 56.178746740470665
- type: nauc_ndcg_at_100_std
value: 42.36829180286942
- type: nauc_ndcg_at_10_diff1
value: 18.364428967788413
- type: nauc_ndcg_at_10_max
value: 54.38372506578223
- type: nauc_ndcg_at_10_std
value: 41.75765411340369
- type: nauc_ndcg_at_1_diff1
value: 26.571093272640773
- type: nauc_ndcg_at_1_max
value: 51.061788341958284
- type: nauc_ndcg_at_1_std
value: 36.514987974075986
- type: nauc_ndcg_at_20_diff1
value: 18.345487193027697
- type: nauc_ndcg_at_20_max
value: 54.62621882656994
- type: nauc_ndcg_at_20_std
value: 41.42835554714241
- type: nauc_ndcg_at_3_diff1
value: 23.260105658139025
- type: nauc_ndcg_at_3_max
value: 52.07747385334546
- type: nauc_ndcg_at_3_std
value: 36.91985577837284
- type: nauc_ndcg_at_5_diff1
value: 20.40428109665566
- type: nauc_ndcg_at_5_max
value: 53.52015347884604
- type: nauc_ndcg_at_5_std
value: 39.46008849580017
- type: nauc_precision_at_1000_diff1
value: -7.3487344916380035
- type: nauc_precision_at_1000_max
value: 16.58045221394852
- type: nauc_precision_at_1000_std
value: 38.94030932397075
- type: nauc_precision_at_100_diff1
value: -5.257743986683922
- type: nauc_precision_at_100_max
value: 34.43071687475306
- type: nauc_precision_at_100_std
value: 53.499519170670474
- type: nauc_precision_at_10_diff1
value: 2.385136433119139
- type: nauc_precision_at_10_max
value: 47.210743878631064
- type: nauc_precision_at_10_std
value: 47.22767704186548
- type: nauc_precision_at_1_diff1
value: 26.073028156311075
- type: nauc_precision_at_1_max
value: 52.11817916720053
- type: nauc_precision_at_1_std
value: 37.41073893153695
- type: nauc_precision_at_20_diff1
value: -0.3531531127238474
- type: nauc_precision_at_20_max
value: 44.78044604856974
- type: nauc_precision_at_20_std
value: 49.532804150743615
- type: nauc_precision_at_3_diff1
value: 15.350050569991447
- type: nauc_precision_at_3_max
value: 51.01572315596549
- type: nauc_precision_at_3_std
value: 38.801125728413155
- type: nauc_precision_at_5_diff1
value: 9.109003666144694
- type: nauc_precision_at_5_max
value: 50.935269774898494
- type: nauc_precision_at_5_std
value: 43.323548180559676
- type: nauc_recall_at_1000_diff1
value: 16.64743647648886
- type: nauc_recall_at_1000_max
value: 38.46012283772285
- type: nauc_recall_at_1000_std
value: 36.02016164796441
- type: nauc_recall_at_100_diff1
value: 14.005834785186744
- type: nauc_recall_at_100_max
value: 37.70026105513647
- type: nauc_recall_at_100_std
value: 27.085222642129697
- type: nauc_recall_at_10_diff1
value: 21.204106627422632
- type: nauc_recall_at_10_max
value: 36.737624881893424
- type: nauc_recall_at_10_std
value: 13.755054514272702
- type: nauc_recall_at_1_diff1
value: 46.78688143865053
- type: nauc_recall_at_1_max
value: 37.20408843995871
- type: nauc_recall_at_1_std
value: 4.383444959401098
- type: nauc_recall_at_20_diff1
value: 19.740977611421933
- type: nauc_recall_at_20_max
value: 39.21908969539783
- type: nauc_recall_at_20_std
value: 16.560269670318494
- type: nauc_recall_at_3_diff1
value: 32.189359545367815
- type: nauc_recall_at_3_max
value: 31.693634445562758
- type: nauc_recall_at_3_std
value: 6.246326281543587
- type: nauc_recall_at_5_diff1
value: 25.51586860499901
- type: nauc_recall_at_5_max
value: 33.15934725342885
- type: nauc_recall_at_5_std
value: 9.677778511696705
- type: ndcg_at_1
value: 37.307
- type: ndcg_at_10
value: 31.391000000000002
- type: ndcg_at_100
value: 28.877999999999997
- type: ndcg_at_1000
value: 37.16
- type: ndcg_at_20
value: 29.314
- type: ndcg_at_3
value: 35.405
- type: ndcg_at_5
value: 33.922999999999995
- type: precision_at_1
value: 39.009
- type: precision_at_10
value: 24.52
- type: precision_at_100
value: 7.703
- type: precision_at_1000
value: 2.04
- type: precision_at_20
value: 18.08
- type: precision_at_3
value: 34.469
- type: precision_at_5
value: 30.712
- type: recall_at_1
value: 4.163
- type: recall_at_10
value: 15.015999999999998
- type: recall_at_100
value: 30.606
- type: recall_at_1000
value: 59.606
- type: recall_at_20
value: 19.09
- type: recall_at_3
value: 9.139
- type: recall_at_5
value: 11.477
- task:
type: Retrieval
dataset:
name: MTEB NQ-PL (default)
type: clarin-knext/nq-pl
config: default
split: test
revision: f171245712cf85dd4700b06bef18001578d0ca8d
metrics:
- type: main_score
value: 54.017
- type: map_at_1
value: 34.193
- type: map_at_10
value: 47.497
- type: map_at_100
value: 48.441
- type: map_at_1000
value: 48.481
- type: map_at_20
value: 48.093
- type: map_at_3
value: 44.017
- type: map_at_5
value: 46.111000000000004
- type: mrr_at_1
value: 37.949015063731174
- type: mrr_at_10
value: 49.915772315105954
- type: mrr_at_100
value: 50.62841255829997
- type: mrr_at_1000
value: 50.656773027666745
- type: mrr_at_20
value: 50.37785276657083
- type: mrr_at_3
value: 46.98725376593267
- type: mrr_at_5
value: 48.763035921205066
- type: nauc_map_at_1000_diff1
value: 39.5632191792873
- type: nauc_map_at_1000_max
value: 37.4728247053629
- type: nauc_map_at_1000_std
value: 5.742498414663762
- type: nauc_map_at_100_diff1
value: 39.555570352061906
- type: nauc_map_at_100_max
value: 37.497880976847334
- type: nauc_map_at_100_std
value: 5.7798021019465375
- type: nauc_map_at_10_diff1
value: 39.5423723444454
- type: nauc_map_at_10_max
value: 37.41661971723365
- type: nauc_map_at_10_std
value: 5.2378002164144695
- type: nauc_map_at_1_diff1
value: 41.52697034146981
- type: nauc_map_at_1_max
value: 28.558995576942863
- type: nauc_map_at_1_std
value: 0.13094542859192052
- type: nauc_map_at_20_diff1
value: 39.55484628943701
- type: nauc_map_at_20_max
value: 37.5247794933719
- type: nauc_map_at_20_std
value: 5.702881342279231
- type: nauc_map_at_3_diff1
value: 39.949323925425325
- type: nauc_map_at_3_max
value: 35.770298168901924
- type: nauc_map_at_3_std
value: 2.9127112432479874
- type: nauc_map_at_5_diff1
value: 39.768310617004545
- type: nauc_map_at_5_max
value: 37.1549191664796
- type: nauc_map_at_5_std
value: 4.4681285748269515
- type: nauc_mrr_at_1000_diff1
value: 39.14001746706457
- type: nauc_mrr_at_1000_max
value: 37.477376518267775
- type: nauc_mrr_at_1000_std
value: 6.8088891531621565
- type: nauc_mrr_at_100_diff1
value: 39.13054707413684
- type: nauc_mrr_at_100_max
value: 37.498126443766274
- type: nauc_mrr_at_100_std
value: 6.839411380129971
- type: nauc_mrr_at_10_diff1
value: 39.09764730048156
- type: nauc_mrr_at_10_max
value: 37.58593798217306
- type: nauc_mrr_at_10_std
value: 6.713795164982413
- type: nauc_mrr_at_1_diff1
value: 41.581599918664075
- type: nauc_mrr_at_1_max
value: 31.500589231378722
- type: nauc_mrr_at_1_std
value: 2.059116370339438
- type: nauc_mrr_at_20_diff1
value: 39.09011023988447
- type: nauc_mrr_at_20_max
value: 37.55856008791344
- type: nauc_mrr_at_20_std
value: 6.847165397615844
- type: nauc_mrr_at_3_diff1
value: 39.382542043738
- type: nauc_mrr_at_3_max
value: 36.49265363659468
- type: nauc_mrr_at_3_std
value: 4.759157976438336
- type: nauc_mrr_at_5_diff1
value: 39.304826333759976
- type: nauc_mrr_at_5_max
value: 37.46326016736024
- type: nauc_mrr_at_5_std
value: 6.122608305766621
- type: nauc_ndcg_at_1000_diff1
value: 38.568500038453266
- type: nauc_ndcg_at_1000_max
value: 39.799710882413166
- type: nauc_ndcg_at_1000_std
value: 9.357010223096639
- type: nauc_ndcg_at_100_diff1
value: 38.38026091343228
- type: nauc_ndcg_at_100_max
value: 40.48398173542486
- type: nauc_ndcg_at_100_std
value: 10.373054013302214
- type: nauc_ndcg_at_10_diff1
value: 38.27340980909964
- type: nauc_ndcg_at_10_max
value: 40.35241649744093
- type: nauc_ndcg_at_10_std
value: 8.579139930345168
- type: nauc_ndcg_at_1_diff1
value: 41.581599918664075
- type: nauc_ndcg_at_1_max
value: 31.500589231378722
- type: nauc_ndcg_at_1_std
value: 2.059116370339438
- type: nauc_ndcg_at_20_diff1
value: 38.26453028884807
- type: nauc_ndcg_at_20_max
value: 40.70517858426641
- type: nauc_ndcg_at_20_std
value: 9.987693876137905
- type: nauc_ndcg_at_3_diff1
value: 39.2078971733273
- type: nauc_ndcg_at_3_max
value: 37.48672195565316
- type: nauc_ndcg_at_3_std
value: 4.051464994659221
- type: nauc_ndcg_at_5_diff1
value: 38.883693595665285
- type: nauc_ndcg_at_5_max
value: 39.763115634437135
- type: nauc_ndcg_at_5_std
value: 6.738980451582073
- type: nauc_precision_at_1000_diff1
value: -7.223215910619012
- type: nauc_precision_at_1000_max
value: 13.075844604892161
- type: nauc_precision_at_1000_std
value: 19.864336920890107
- type: nauc_precision_at_100_diff1
value: 1.3305994810812418
- type: nauc_precision_at_100_max
value: 25.9219108557104
- type: nauc_precision_at_100_std
value: 27.5076605928207
- type: nauc_precision_at_10_diff1
value: 18.441551484970326
- type: nauc_precision_at_10_max
value: 39.85995330437054
- type: nauc_precision_at_10_std
value: 20.561269077428914
- type: nauc_precision_at_1_diff1
value: 41.581599918664075
- type: nauc_precision_at_1_max
value: 31.500589231378722
- type: nauc_precision_at_1_std
value: 2.059116370339438
- type: nauc_precision_at_20_diff1
value: 12.579593891480531
- type: nauc_precision_at_20_max
value: 36.620221830588775
- type: nauc_precision_at_20_std
value: 26.40364876775059
- type: nauc_precision_at_3_diff1
value: 30.158859294487073
- type: nauc_precision_at_3_max
value: 41.168215766389174
- type: nauc_precision_at_3_std
value: 9.44345004450809
- type: nauc_precision_at_5_diff1
value: 25.438624678672785
- type: nauc_precision_at_5_max
value: 42.72802023518524
- type: nauc_precision_at_5_std
value: 15.357657388511099
- type: nauc_recall_at_1000_diff1
value: 24.987564782718003
- type: nauc_recall_at_1000_max
value: 70.508416373353
- type: nauc_recall_at_1000_std
value: 69.75092280398808
- type: nauc_recall_at_100_diff1
value: 29.504202856421397
- type: nauc_recall_at_100_max
value: 63.41356585545318
- type: nauc_recall_at_100_std
value: 50.09250954437847
- type: nauc_recall_at_10_diff1
value: 32.355776022971774
- type: nauc_recall_at_10_max
value: 49.47121901667283
- type: nauc_recall_at_10_std
value: 19.418439406631244
- type: nauc_recall_at_1_diff1
value: 41.52697034146981
- type: nauc_recall_at_1_max
value: 28.558995576942863
- type: nauc_recall_at_1_std
value: 0.13094542859192052
- type: nauc_recall_at_20_diff1
value: 31.57334731023589
- type: nauc_recall_at_20_max
value: 54.06567225197383
- type: nauc_recall_at_20_std
value: 29.222029720570468
- type: nauc_recall_at_3_diff1
value: 36.45033533275773
- type: nauc_recall_at_3_max
value: 40.39529713780803
- type: nauc_recall_at_3_std
value: 5.21893897772794
- type: nauc_recall_at_5_diff1
value: 35.18471678478859
- type: nauc_recall_at_5_max
value: 46.20100816867823
- type: nauc_recall_at_5_std
value: 11.94481894633221
- type: ndcg_at_1
value: 37.949
- type: ndcg_at_10
value: 54.017
- type: ndcg_at_100
value: 58.126
- type: ndcg_at_1000
value: 59.073
- type: ndcg_at_20
value: 55.928
- type: ndcg_at_3
value: 47.494
- type: ndcg_at_5
value: 50.975
- type: precision_at_1
value: 37.949
- type: precision_at_10
value: 8.450000000000001
- type: precision_at_100
value: 1.083
- type: precision_at_1000
value: 0.117
- type: precision_at_20
value: 4.689
- type: precision_at_3
value: 21.051000000000002
- type: precision_at_5
value: 14.664
- type: recall_at_1
value: 34.193
- type: recall_at_10
value: 71.357
- type: recall_at_100
value: 89.434
- type: recall_at_1000
value: 96.536
- type: recall_at_20
value: 78.363
- type: recall_at_3
value: 54.551
- type: recall_at_5
value: 62.543000000000006
- task:
type: Retrieval
dataset:
name: MTEB Quora-PL (default)
type: clarin-knext/quora-pl
config: default
split: test
revision: 0be27e93455051e531182b85e85e425aba12e9d4
metrics:
- type: main_score
value: 84.114
- type: map_at_1
value: 65.848
- type: map_at_10
value: 79.85900000000001
- type: map_at_100
value: 80.582
- type: map_at_1000
value: 80.60300000000001
- type: map_at_20
value: 80.321
- type: map_at_3
value: 76.741
- type: map_at_5
value: 78.72200000000001
- type: mrr_at_1
value: 75.97
- type: mrr_at_10
value: 83.04630158730119
- type: mrr_at_100
value: 83.22785731032968
- type: mrr_at_1000
value: 83.23123717623899
- type: mrr_at_20
value: 83.17412021320565
- type: mrr_at_3
value: 81.83333333333287
- type: mrr_at_5
value: 82.61933333333275
- type: nauc_map_at_1000_diff1
value: 73.26316553371083
- type: nauc_map_at_1000_max
value: 27.92567859085245
- type: nauc_map_at_1000_std
value: -47.477909533360446
- type: nauc_map_at_100_diff1
value: 73.2690602807223
- type: nauc_map_at_100_max
value: 27.915868327849996
- type: nauc_map_at_100_std
value: -47.525777766107595
- type: nauc_map_at_10_diff1
value: 73.45464428464894
- type: nauc_map_at_10_max
value: 27.451611487246296
- type: nauc_map_at_10_std
value: -49.35818715843809
- type: nauc_map_at_1_diff1
value: 77.29690208952982
- type: nauc_map_at_1_max
value: 19.839875762282293
- type: nauc_map_at_1_std
value: -45.355684654708284
- type: nauc_map_at_20_diff1
value: 73.35102731979796
- type: nauc_map_at_20_max
value: 27.741506490134583
- type: nauc_map_at_20_std
value: -48.22006207310331
- type: nauc_map_at_3_diff1
value: 73.94878241064137
- type: nauc_map_at_3_max
value: 24.761321386766728
- type: nauc_map_at_3_std
value: -51.20638883618126
- type: nauc_map_at_5_diff1
value: 73.66143558047698
- type: nauc_map_at_5_max
value: 26.53483405013543
- type: nauc_map_at_5_std
value: -50.697541279640056
- type: nauc_mrr_at_1000_diff1
value: 73.84632320009759
- type: nauc_mrr_at_1000_max
value: 30.50182733610048
- type: nauc_mrr_at_1000_std
value: -44.3021647995251
- type: nauc_mrr_at_100_diff1
value: 73.84480792662302
- type: nauc_mrr_at_100_max
value: 30.50749424571614
- type: nauc_mrr_at_100_std
value: -44.29615086388113
- type: nauc_mrr_at_10_diff1
value: 73.79442772949346
- type: nauc_mrr_at_10_max
value: 30.55724252219984
- type: nauc_mrr_at_10_std
value: -44.50997069462057
- type: nauc_mrr_at_1_diff1
value: 75.23369827945945
- type: nauc_mrr_at_1_max
value: 29.20073967447664
- type: nauc_mrr_at_1_std
value: -43.1920147658285
- type: nauc_mrr_at_20_diff1
value: 73.82731678072307
- type: nauc_mrr_at_20_max
value: 30.566328605497667
- type: nauc_mrr_at_20_std
value: -44.24683607643705
- type: nauc_mrr_at_3_diff1
value: 73.61997576749954
- type: nauc_mrr_at_3_max
value: 30.150393853381917
- type: nauc_mrr_at_3_std
value: -44.96847297506626
- type: nauc_mrr_at_5_diff1
value: 73.69084310616132
- type: nauc_mrr_at_5_max
value: 30.578033703441125
- type: nauc_mrr_at_5_std
value: -44.74920746066566
- type: nauc_ndcg_at_1000_diff1
value: 72.89349862557452
- type: nauc_ndcg_at_1000_max
value: 29.824725190462086
- type: nauc_ndcg_at_1000_std
value: -44.96284395063211
- type: nauc_ndcg_at_100_diff1
value: 72.85212753715273
- type: nauc_ndcg_at_100_max
value: 29.933114207845605
- type: nauc_ndcg_at_100_std
value: -44.944225570663754
- type: nauc_ndcg_at_10_diff1
value: 72.80576740454528
- type: nauc_ndcg_at_10_max
value: 29.16829118320828
- type: nauc_ndcg_at_10_std
value: -48.149473740079614
- type: nauc_ndcg_at_1_diff1
value: 75.00032534968587
- type: nauc_ndcg_at_1_max
value: 29.61849062038547
- type: nauc_ndcg_at_1_std
value: -42.560207043864054
- type: nauc_ndcg_at_20_diff1
value: 72.88440406302502
- type: nauc_ndcg_at_20_max
value: 29.65496676092656
- type: nauc_ndcg_at_20_std
value: -46.21238462167732
- type: nauc_ndcg_at_3_diff1
value: 72.37916962766987
- type: nauc_ndcg_at_3_max
value: 27.125094834547586
- type: nauc_ndcg_at_3_std
value: -48.62942991399391
- type: nauc_ndcg_at_5_diff1
value: 72.57017330527658
- type: nauc_ndcg_at_5_max
value: 28.470485561757254
- type: nauc_ndcg_at_5_std
value: -49.07593345591059
- type: nauc_precision_at_1000_diff1
value: -41.67915575853946
- type: nauc_precision_at_1000_max
value: 1.2012264478568844
- type: nauc_precision_at_1000_std
value: 44.723834559400466
- type: nauc_precision_at_100_diff1
value: -40.45196679236971
- type: nauc_precision_at_100_max
value: 2.3525450401714894
- type: nauc_precision_at_100_std
value: 43.7092529413952
- type: nauc_precision_at_10_diff1
value: -30.256026923068767
- type: nauc_precision_at_10_max
value: 8.313422052132559
- type: nauc_precision_at_10_std
value: 25.929372356449694
- type: nauc_precision_at_1_diff1
value: 75.00032534968587
- type: nauc_precision_at_1_max
value: 29.61849062038547
- type: nauc_precision_at_1_std
value: -42.560207043864054
- type: nauc_precision_at_20_diff1
value: -35.61971069986584
- type: nauc_precision_at_20_max
value: 5.4664303079116765
- type: nauc_precision_at_20_std
value: 34.992352471692826
- type: nauc_precision_at_3_diff1
value: -5.691231842471157
- type: nauc_precision_at_3_max
value: 14.797949087742444
- type: nauc_precision_at_3_std
value: -0.1930317395644928
- type: nauc_precision_at_5_diff1
value: -20.03913781462645
- type: nauc_precision_at_5_max
value: 11.956771408712749
- type: nauc_precision_at_5_std
value: 13.179251389859731
- type: nauc_recall_at_1000_diff1
value: 64.03509042729674
- type: nauc_recall_at_1000_max
value: 40.91691485428493
- type: nauc_recall_at_1000_std
value: 16.12968625875372
- type: nauc_recall_at_100_diff1
value: 63.83116179628575
- type: nauc_recall_at_100_max
value: 43.72908117676382
- type: nauc_recall_at_100_std
value: -20.50966716852155
- type: nauc_recall_at_10_diff1
value: 66.42071960186394
- type: nauc_recall_at_10_max
value: 28.983207818687205
- type: nauc_recall_at_10_std
value: -56.61417798753744
- type: nauc_recall_at_1_diff1
value: 77.29690208952982
- type: nauc_recall_at_1_max
value: 19.839875762282293
- type: nauc_recall_at_1_std
value: -45.355684654708284
- type: nauc_recall_at_20_diff1
value: 66.32360705219874
- type: nauc_recall_at_20_max
value: 33.30698111822631
- type: nauc_recall_at_20_std
value: -43.89233781737452
- type: nauc_recall_at_3_diff1
value: 69.67029394927077
- type: nauc_recall_at_3_max
value: 22.67803039327696
- type: nauc_recall_at_3_std
value: -56.43327209861502
- type: nauc_recall_at_5_diff1
value: 68.05622143936131
- type: nauc_recall_at_5_max
value: 26.67795559040675
- type: nauc_recall_at_5_std
value: -58.158231198510954
- type: ndcg_at_1
value: 76.08
- type: ndcg_at_10
value: 84.114
- type: ndcg_at_100
value: 85.784
- type: ndcg_at_1000
value: 85.992
- type: ndcg_at_20
value: 84.976
- type: ndcg_at_3
value: 80.74799999999999
- type: ndcg_at_5
value: 82.626
- type: precision_at_1
value: 76.08
- type: precision_at_10
value: 12.926000000000002
- type: precision_at_100
value: 1.509
- type: precision_at_1000
value: 0.156
- type: precision_at_20
value: 6.912999999999999
- type: precision_at_3
value: 35.5
- type: precision_at_5
value: 23.541999999999998
- type: recall_at_1
value: 65.848
- type: recall_at_10
value: 92.611
- type: recall_at_100
value: 98.69
- type: recall_at_1000
value: 99.83999999999999
- type: recall_at_20
value: 95.47200000000001
- type: recall_at_3
value: 83.122
- type: recall_at_5
value: 88.23
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS-PL (default)
type: clarin-knext/scidocs-pl
config: default
split: test
revision: 45452b03f05560207ef19149545f168e596c9337
metrics:
- type: main_score
value: 15.379999999999999
- type: map_at_1
value: 3.6029999999999998
- type: map_at_10
value: 8.843
- type: map_at_100
value: 10.433
- type: map_at_1000
value: 10.689
- type: map_at_20
value: 9.597
- type: map_at_3
value: 6.363
- type: map_at_5
value: 7.603
- type: mrr_at_1
value: 17.7
- type: mrr_at_10
value: 26.58900793650793
- type: mrr_at_100
value: 27.699652322890987
- type: mrr_at_1000
value: 27.78065313118353
- type: mrr_at_20
value: 27.215020950411816
- type: mrr_at_3
value: 23.36666666666668
- type: mrr_at_5
value: 25.211666666666666
- type: nauc_map_at_1000_diff1
value: 21.92235143827129
- type: nauc_map_at_1000_max
value: 37.50300940750989
- type: nauc_map_at_1000_std
value: 20.872586122198552
- type: nauc_map_at_100_diff1
value: 21.917408170465833
- type: nauc_map_at_100_max
value: 37.4654466815513
- type: nauc_map_at_100_std
value: 20.621643878648534
- type: nauc_map_at_10_diff1
value: 22.914388723621183
- type: nauc_map_at_10_max
value: 36.468131213468794
- type: nauc_map_at_10_std
value: 16.760980140791492
- type: nauc_map_at_1_diff1
value: 29.00799502838457
- type: nauc_map_at_1_max
value: 26.64926291797503
- type: nauc_map_at_1_std
value: 8.167291261637361
- type: nauc_map_at_20_diff1
value: 22.46580947804047
- type: nauc_map_at_20_max
value: 36.656294842562275
- type: nauc_map_at_20_std
value: 18.099232417722078
- type: nauc_map_at_3_diff1
value: 23.436009032045934
- type: nauc_map_at_3_max
value: 31.325807212280914
- type: nauc_map_at_3_std
value: 9.780905232048852
- type: nauc_map_at_5_diff1
value: 22.891704394665528
- type: nauc_map_at_5_max
value: 35.40584466642894
- type: nauc_map_at_5_std
value: 13.476986099394656
- type: nauc_mrr_at_1000_diff1
value: 25.052937655397866
- type: nauc_mrr_at_1000_max
value: 29.64431912670108
- type: nauc_mrr_at_1000_std
value: 14.549744963988044
- type: nauc_mrr_at_100_diff1
value: 25.070871266969224
- type: nauc_mrr_at_100_max
value: 29.68743604652336
- type: nauc_mrr_at_100_std
value: 14.582010154574432
- type: nauc_mrr_at_10_diff1
value: 24.88881466938897
- type: nauc_mrr_at_10_max
value: 29.488430770768144
- type: nauc_mrr_at_10_std
value: 14.269241073852266
- type: nauc_mrr_at_1_diff1
value: 29.220540327267503
- type: nauc_mrr_at_1_max
value: 26.81908580507911
- type: nauc_mrr_at_1_std
value: 8.00840295809718
- type: nauc_mrr_at_20_diff1
value: 25.067912695721944
- type: nauc_mrr_at_20_max
value: 29.759227563849628
- type: nauc_mrr_at_20_std
value: 14.685076859257357
- type: nauc_mrr_at_3_diff1
value: 24.645848739182696
- type: nauc_mrr_at_3_max
value: 27.73368549660351
- type: nauc_mrr_at_3_std
value: 11.475742805586943
- type: nauc_mrr_at_5_diff1
value: 24.895295760909946
- type: nauc_mrr_at_5_max
value: 29.130755033240423
- type: nauc_mrr_at_5_std
value: 12.955802929145404
- type: nauc_ndcg_at_1000_diff1
value: 20.68434434777729
- type: nauc_ndcg_at_1000_max
value: 37.67055146424174
- type: nauc_ndcg_at_1000_std
value: 29.57493715069776
- type: nauc_ndcg_at_100_diff1
value: 20.396834816492383
- type: nauc_ndcg_at_100_max
value: 37.460575228670514
- type: nauc_ndcg_at_100_std
value: 27.826534756761944
- type: nauc_ndcg_at_10_diff1
value: 22.640844106236027
- type: nauc_ndcg_at_10_max
value: 35.21291764462327
- type: nauc_ndcg_at_10_std
value: 19.53289455984506
- type: nauc_ndcg_at_1_diff1
value: 29.220540327267503
- type: nauc_ndcg_at_1_max
value: 26.81908580507911
- type: nauc_ndcg_at_1_std
value: 8.00840295809718
- type: nauc_ndcg_at_20_diff1
value: 22.117126657768623
- type: nauc_ndcg_at_20_max
value: 35.79395781940806
- type: nauc_ndcg_at_20_std
value: 22.242748346260786
- type: nauc_ndcg_at_3_diff1
value: 23.00596063212187
- type: nauc_ndcg_at_3_max
value: 30.149013627580523
- type: nauc_ndcg_at_3_std
value: 11.07904064662722
- type: nauc_ndcg_at_5_diff1
value: 22.81875419630523
- type: nauc_ndcg_at_5_max
value: 34.24267468356626
- type: nauc_ndcg_at_5_std
value: 15.307780280752088
- type: nauc_precision_at_1000_diff1
value: 9.606677689029972
- type: nauc_precision_at_1000_max
value: 32.74855550489271
- type: nauc_precision_at_1000_std
value: 42.65372585937895
- type: nauc_precision_at_100_diff1
value: 11.528981313529545
- type: nauc_precision_at_100_max
value: 35.642529490132404
- type: nauc_precision_at_100_std
value: 38.146151426052306
- type: nauc_precision_at_10_diff1
value: 18.783957183811836
- type: nauc_precision_at_10_max
value: 36.1982008334257
- type: nauc_precision_at_10_std
value: 25.09349473195891
- type: nauc_precision_at_1_diff1
value: 29.220540327267503
- type: nauc_precision_at_1_max
value: 26.81908580507911
- type: nauc_precision_at_1_std
value: 8.00840295809718
- type: nauc_precision_at_20_diff1
value: 17.458766320828214
- type: nauc_precision_at_20_max
value: 36.000404903025235
- type: nauc_precision_at_20_std
value: 29.1608044138323
- type: nauc_precision_at_3_diff1
value: 20.213669462067166
- type: nauc_precision_at_3_max
value: 31.120650847205912
- type: nauc_precision_at_3_std
value: 12.390972418818118
- type: nauc_precision_at_5_diff1
value: 20.114245715785678
- type: nauc_precision_at_5_max
value: 37.30360111495823
- type: nauc_precision_at_5_std
value: 19.053109037822853
- type: nauc_recall_at_1000_diff1
value: 9.85800049032612
- type: nauc_recall_at_1000_max
value: 32.48319160802687
- type: nauc_recall_at_1000_std
value: 43.79941601741161
- type: nauc_recall_at_100_diff1
value: 11.375255270968337
- type: nauc_recall_at_100_max
value: 35.1868784124497
- type: nauc_recall_at_100_std
value: 38.422680583482666
- type: nauc_recall_at_10_diff1
value: 18.445783123521938
- type: nauc_recall_at_10_max
value: 35.633267936276766
- type: nauc_recall_at_10_std
value: 24.94469506254716
- type: nauc_recall_at_1_diff1
value: 29.00799502838457
- type: nauc_recall_at_1_max
value: 26.64926291797503
- type: nauc_recall_at_1_std
value: 8.167291261637361
- type: nauc_recall_at_20_diff1
value: 17.314906604151936
- type: nauc_recall_at_20_max
value: 35.66067699203996
- type: nauc_recall_at_20_std
value: 29.400137012506082
- type: nauc_recall_at_3_diff1
value: 19.873710875648698
- type: nauc_recall_at_3_max
value: 30.92404718742849
- type: nauc_recall_at_3_std
value: 12.400871018075199
- type: nauc_recall_at_5_diff1
value: 19.869948324233192
- type: nauc_recall_at_5_max
value: 37.06832511687574
- type: nauc_recall_at_5_std
value: 19.0798814966156
- type: ndcg_at_1
value: 17.7
- type: ndcg_at_10
value: 15.379999999999999
- type: ndcg_at_100
value: 22.09
- type: ndcg_at_1000
value: 27.151999999999997
- type: ndcg_at_20
value: 17.576
- type: ndcg_at_3
value: 14.219999999999999
- type: ndcg_at_5
value: 12.579
- type: precision_at_1
value: 17.7
- type: precision_at_10
value: 8.08
- type: precision_at_100
value: 1.7840000000000003
- type: precision_at_1000
value: 0.3
- type: precision_at_20
value: 5.305
- type: precision_at_3
value: 13.167000000000002
- type: precision_at_5
value: 11.06
- type: recall_at_1
value: 3.6029999999999998
- type: recall_at_10
value: 16.413
- type: recall_at_100
value: 36.263
- type: recall_at_1000
value: 61.016999999999996
- type: recall_at_20
value: 21.587999999999997
- type: recall_at_3
value: 8.013
- type: recall_at_5
value: 11.198
- task:
type: Retrieval
dataset:
name: MTEB SciFact-PL (default)
type: clarin-knext/scifact-pl
config: default
split: test
revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e
metrics:
- type: main_score
value: 64.764
- type: map_at_1
value: 49.778
- type: map_at_10
value: 59.88
- type: map_at_100
value: 60.707
- type: map_at_1000
value: 60.729
- type: map_at_20
value: 60.419999999999995
- type: map_at_3
value: 57.45400000000001
- type: map_at_5
value: 58.729
- type: mrr_at_1
value: 52.33333333333333
- type: mrr_at_10
value: 61.29193121693122
- type: mrr_at_100
value: 61.95817765126313
- type: mrr_at_1000
value: 61.97583284368782
- type: mrr_at_20
value: 61.72469949641003
- type: mrr_at_3
value: 59.44444444444444
- type: mrr_at_5
value: 60.494444444444454
- type: nauc_map_at_1000_diff1
value: 62.21235294015774
- type: nauc_map_at_1000_max
value: 48.83996609100249
- type: nauc_map_at_1000_std
value: 5.23892781043174
- type: nauc_map_at_100_diff1
value: 62.20170226789429
- type: nauc_map_at_100_max
value: 48.8391766453537
- type: nauc_map_at_100_std
value: 5.2664077457917715
- type: nauc_map_at_10_diff1
value: 61.961975488329024
- type: nauc_map_at_10_max
value: 48.397109987625186
- type: nauc_map_at_10_std
value: 4.314859710827481
- type: nauc_map_at_1_diff1
value: 65.0865197011516
- type: nauc_map_at_1_max
value: 41.38862781954889
- type: nauc_map_at_1_std
value: -0.9182122632530586
- type: nauc_map_at_20_diff1
value: 61.99173935851292
- type: nauc_map_at_20_max
value: 48.79961814179307
- type: nauc_map_at_20_std
value: 5.262181845825118
- type: nauc_map_at_3_diff1
value: 62.37910539880477
- type: nauc_map_at_3_max
value: 47.13627890977091
- type: nauc_map_at_3_std
value: 2.327897198087264
- type: nauc_map_at_5_diff1
value: 61.60080757149592
- type: nauc_map_at_5_max
value: 47.60052458345962
- type: nauc_map_at_5_std
value: 3.1770196981231047
- type: nauc_mrr_at_1000_diff1
value: 62.86810952814966
- type: nauc_mrr_at_1000_max
value: 52.13248094447774
- type: nauc_mrr_at_1000_std
value: 10.100485746570733
- type: nauc_mrr_at_100_diff1
value: 62.85364829491874
- type: nauc_mrr_at_100_max
value: 52.134528010631854
- type: nauc_mrr_at_100_std
value: 10.120945685447369
- type: nauc_mrr_at_10_diff1
value: 62.65679301829915
- type: nauc_mrr_at_10_max
value: 52.09270719182349
- type: nauc_mrr_at_10_std
value: 9.913834434725441
- type: nauc_mrr_at_1_diff1
value: 66.84108271415636
- type: nauc_mrr_at_1_max
value: 46.67646429855176
- type: nauc_mrr_at_1_std
value: 5.5505252956352304
- type: nauc_mrr_at_20_diff1
value: 62.72473227039611
- type: nauc_mrr_at_20_max
value: 52.13479097802757
- type: nauc_mrr_at_20_std
value: 10.188278833464084
- type: nauc_mrr_at_3_diff1
value: 63.797429185518496
- type: nauc_mrr_at_3_max
value: 52.16486999573481
- type: nauc_mrr_at_3_std
value: 9.094360767062762
- type: nauc_mrr_at_5_diff1
value: 62.592917975475494
- type: nauc_mrr_at_5_max
value: 52.330741486107414
- type: nauc_mrr_at_5_std
value: 9.742175534421389
- type: nauc_ndcg_at_1000_diff1
value: 61.38859337672476
- type: nauc_ndcg_at_1000_max
value: 51.48380058339184
- type: nauc_ndcg_at_1000_std
value: 9.670547660897673
- type: nauc_ndcg_at_100_diff1
value: 61.02438489641434
- type: nauc_ndcg_at_100_max
value: 51.781246646780865
- type: nauc_ndcg_at_100_std
value: 10.592961553245187
- type: nauc_ndcg_at_10_diff1
value: 60.03678353308358
- type: nauc_ndcg_at_10_max
value: 50.70725688848762
- type: nauc_ndcg_at_10_std
value: 7.9472446491016315
- type: nauc_ndcg_at_1_diff1
value: 66.84108271415636
- type: nauc_ndcg_at_1_max
value: 46.67646429855176
- type: nauc_ndcg_at_1_std
value: 5.5505252956352304
- type: nauc_ndcg_at_20_diff1
value: 59.828482718480224
- type: nauc_ndcg_at_20_max
value: 51.45831789601284
- type: nauc_ndcg_at_20_std
value: 10.722673683272049
- type: nauc_ndcg_at_3_diff1
value: 61.68982937524109
- type: nauc_ndcg_at_3_max
value: 49.745326748604775
- type: nauc_ndcg_at_3_std
value: 4.948298621202247
- type: nauc_ndcg_at_5_diff1
value: 59.67396171973207
- type: nauc_ndcg_at_5_max
value: 49.87855139298281
- type: nauc_ndcg_at_5_std
value: 6.08990428055584
- type: nauc_precision_at_1000_diff1
value: -1.594227972036865
- type: nauc_precision_at_1000_max
value: 32.48431723086185
- type: nauc_precision_at_1000_std
value: 53.84748466965268
- type: nauc_precision_at_100_diff1
value: 8.06411455192293
- type: nauc_precision_at_100_max
value: 39.91003601878948
- type: nauc_precision_at_100_std
value: 55.52979711075091
- type: nauc_precision_at_10_diff1
value: 26.610514456014066
- type: nauc_precision_at_10_max
value: 47.09062494321172
- type: nauc_precision_at_10_std
value: 33.91984226498748
- type: nauc_precision_at_1_diff1
value: 66.84108271415636
- type: nauc_precision_at_1_max
value: 46.67646429855176
- type: nauc_precision_at_1_std
value: 5.5505252956352304
- type: nauc_precision_at_20_diff1
value: 16.947688843085583
- type: nauc_precision_at_20_max
value: 45.40488186572008
- type: nauc_precision_at_20_std
value: 48.354421924500905
- type: nauc_precision_at_3_diff1
value: 49.11263981720622
- type: nauc_precision_at_3_max
value: 52.7084625111683
- type: nauc_precision_at_3_std
value: 16.734612173556453
- type: nauc_precision_at_5_diff1
value: 39.06503705015792
- type: nauc_precision_at_5_max
value: 52.21710506893391
- type: nauc_precision_at_5_std
value: 23.350948149460233
- type: nauc_recall_at_1000_diff1
value: 43.1559290382817
- type: nauc_recall_at_1000_max
value: 83.66013071895456
- type: nauc_recall_at_1000_std
value: 86.27450980392177
- type: nauc_recall_at_100_diff1
value: 46.016860850620375
- type: nauc_recall_at_100_max
value: 69.3944888744547
- type: nauc_recall_at_100_std
value: 55.286945696152735
- type: nauc_recall_at_10_diff1
value: 49.65877895350921
- type: nauc_recall_at_10_max
value: 53.02636695700889
- type: nauc_recall_at_10_std
value: 13.967608945823828
- type: nauc_recall_at_1_diff1
value: 65.0865197011516
- type: nauc_recall_at_1_max
value: 41.38862781954889
- type: nauc_recall_at_1_std
value: -0.9182122632530586
- type: nauc_recall_at_20_diff1
value: 43.355308229973524
- type: nauc_recall_at_20_max
value: 57.04187909533764
- type: nauc_recall_at_20_std
value: 33.578720846660524
- type: nauc_recall_at_3_diff1
value: 56.922996057428165
- type: nauc_recall_at_3_max
value: 50.74417041895424
- type: nauc_recall_at_3_std
value: 5.623890124328387
- type: nauc_recall_at_5_diff1
value: 50.55620076865238
- type: nauc_recall_at_5_max
value: 51.3316854622085
- type: nauc_recall_at_5_std
value: 8.995457887269255
- type: ndcg_at_1
value: 52.333
- type: ndcg_at_10
value: 64.764
- type: ndcg_at_100
value: 68.167
- type: ndcg_at_1000
value: 68.816
- type: ndcg_at_20
value: 66.457
- type: ndcg_at_3
value: 60.346
- type: ndcg_at_5
value: 62.365
- type: precision_at_1
value: 52.333
- type: precision_at_10
value: 8.799999999999999
- type: precision_at_100
value: 1.057
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_20
value: 4.8
- type: precision_at_3
value: 23.889
- type: precision_at_5
value: 15.6
- type: recall_at_1
value: 49.778
- type: recall_at_10
value: 78.206
- type: recall_at_100
value: 93.10000000000001
- type: recall_at_1000
value: 98.333
- type: recall_at_20
value: 84.467
- type: recall_at_3
value: 66.367
- type: recall_at_5
value: 71.35000000000001
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID-PL (default)
type: clarin-knext/trec-covid-pl
config: default
split: test
revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd
metrics:
- type: main_score
value: 72.18900000000001
- type: map_at_1
value: 0.214
- type: map_at_10
value: 1.755
- type: map_at_100
value: 9.944
- type: map_at_1000
value: 24.205
- type: map_at_20
value: 3.1510000000000002
- type: map_at_3
value: 0.6
- type: map_at_5
value: 0.9560000000000001
- type: mrr_at_1
value: 82.0
- type: mrr_at_10
value: 89.06666666666666
- type: mrr_at_100
value: 89.06666666666666
- type: mrr_at_1000
value: 89.06666666666666
- type: mrr_at_20
value: 89.06666666666666
- type: mrr_at_3
value: 87.66666666666666
- type: mrr_at_5
value: 89.06666666666666
- type: nauc_map_at_1000_diff1
value: -9.342037623635543
- type: nauc_map_at_1000_max
value: 45.71499810252398
- type: nauc_map_at_1000_std
value: 76.86482845196852
- type: nauc_map_at_100_diff1
value: -6.932395299866198
- type: nauc_map_at_100_max
value: 36.097801891181604
- type: nauc_map_at_100_std
value: 65.6085215411685
- type: nauc_map_at_10_diff1
value: -6.3654843824342775
- type: nauc_map_at_10_max
value: 9.564437521432714
- type: nauc_map_at_10_std
value: 21.8377319336476
- type: nauc_map_at_1_diff1
value: 8.269590874255034
- type: nauc_map_at_1_max
value: 3.482498491294516
- type: nauc_map_at_1_std
value: 8.985226819412189
- type: nauc_map_at_20_diff1
value: -4.971435767877232
- type: nauc_map_at_20_max
value: 22.88801858567121
- type: nauc_map_at_20_std
value: 32.38492618534027
- type: nauc_map_at_3_diff1
value: 1.1615973694623123
- type: nauc_map_at_3_max
value: 1.935417800315643
- type: nauc_map_at_3_std
value: 10.289328305818698
- type: nauc_map_at_5_diff1
value: -2.4675967231444105
- type: nauc_map_at_5_max
value: 2.4611483736622373
- type: nauc_map_at_5_std
value: 15.082324305750811
- type: nauc_mrr_at_1000_diff1
value: 13.098526703499063
- type: nauc_mrr_at_1000_max
value: 56.37362177417431
- type: nauc_mrr_at_1000_std
value: 73.2456769749587
- type: nauc_mrr_at_100_diff1
value: 13.098526703499063
- type: nauc_mrr_at_100_max
value: 56.37362177417431
- type: nauc_mrr_at_100_std
value: 73.2456769749587
- type: nauc_mrr_at_10_diff1
value: 13.098526703499063
- type: nauc_mrr_at_10_max
value: 56.37362177417431
- type: nauc_mrr_at_10_std
value: 73.2456769749587
- type: nauc_mrr_at_1_diff1
value: 12.099350148694809
- type: nauc_mrr_at_1_max
value: 53.75041304108387
- type: nauc_mrr_at_1_std
value: 68.84018063663402
- type: nauc_mrr_at_20_diff1
value: 13.098526703499063
- type: nauc_mrr_at_20_max
value: 56.37362177417431
- type: nauc_mrr_at_20_std
value: 73.2456769749587
- type: nauc_mrr_at_3_diff1
value: 12.173557857011161
- type: nauc_mrr_at_3_max
value: 57.540780562363395
- type: nauc_mrr_at_3_std
value: 75.42098189580211
- type: nauc_mrr_at_5_diff1
value: 13.098526703499063
- type: nauc_mrr_at_5_max
value: 56.37362177417431
- type: nauc_mrr_at_5_std
value: 73.2456769749587
- type: nauc_ndcg_at_1000_diff1
value: -8.951471847310401
- type: nauc_ndcg_at_1000_max
value: 43.86942237288822
- type: nauc_ndcg_at_1000_std
value: 74.61077735148591
- type: nauc_ndcg_at_100_diff1
value: -17.754559361083817
- type: nauc_ndcg_at_100_max
value: 53.97187119773482
- type: nauc_ndcg_at_100_std
value: 80.7944136146514
- type: nauc_ndcg_at_10_diff1
value: -26.637734697836414
- type: nauc_ndcg_at_10_max
value: 47.70102699133149
- type: nauc_ndcg_at_10_std
value: 70.26909560828646
- type: nauc_ndcg_at_1_diff1
value: -1.2250530785563207
- type: nauc_ndcg_at_1_max
value: 46.60509554140131
- type: nauc_ndcg_at_1_std
value: 62.63906581740976
- type: nauc_ndcg_at_20_diff1
value: -22.44286466550908
- type: nauc_ndcg_at_20_max
value: 55.40492058090103
- type: nauc_ndcg_at_20_std
value: 72.11813912145738
- type: nauc_ndcg_at_3_diff1
value: -14.8152721896563
- type: nauc_ndcg_at_3_max
value: 38.952259383027595
- type: nauc_ndcg_at_3_std
value: 59.819750166537766
- type: nauc_ndcg_at_5_diff1
value: -19.150105688904375
- type: nauc_ndcg_at_5_max
value: 42.311180547775315
- type: nauc_ndcg_at_5_std
value: 66.6632229321094
- type: nauc_precision_at_1000_diff1
value: -11.555591477978941
- type: nauc_precision_at_1000_max
value: 43.7311644834851
- type: nauc_precision_at_1000_std
value: 52.10644767999648
- type: nauc_precision_at_100_diff1
value: -16.94803099801117
- type: nauc_precision_at_100_max
value: 54.08281631067633
- type: nauc_precision_at_100_std
value: 82.77237347891331
- type: nauc_precision_at_10_diff1
value: -27.351332814863355
- type: nauc_precision_at_10_max
value: 48.08237549065846
- type: nauc_precision_at_10_std
value: 69.37250843534329
- type: nauc_precision_at_1_diff1
value: 12.099350148694809
- type: nauc_precision_at_1_max
value: 53.75041304108387
- type: nauc_precision_at_1_std
value: 68.84018063663402
- type: nauc_precision_at_20_diff1
value: -18.2422222283388
- type: nauc_precision_at_20_max
value: 59.517328129343696
- type: nauc_precision_at_20_std
value: 72.05149307342747
- type: nauc_precision_at_3_diff1
value: -10.226547543075897
- type: nauc_precision_at_3_max
value: 43.14684818832875
- type: nauc_precision_at_3_std
value: 57.31936467418288
- type: nauc_precision_at_5_diff1
value: -14.28521589468673
- type: nauc_precision_at_5_max
value: 41.633426753962596
- type: nauc_precision_at_5_std
value: 64.94400576804541
- type: nauc_recall_at_1000_diff1
value: -0.9648831207497152
- type: nauc_recall_at_1000_max
value: 31.70832946085005
- type: nauc_recall_at_1000_std
value: 63.21471613968869
- type: nauc_recall_at_100_diff1
value: -1.360254380933586
- type: nauc_recall_at_100_max
value: 25.960597782099605
- type: nauc_recall_at_100_std
value: 51.52757589609674
- type: nauc_recall_at_10_diff1
value: -0.3899439424189566
- type: nauc_recall_at_10_max
value: 5.094341897886072
- type: nauc_recall_at_10_std
value: 11.266045616925698
- type: nauc_recall_at_1_diff1
value: 8.269590874255034
- type: nauc_recall_at_1_max
value: 3.482498491294516
- type: nauc_recall_at_1_std
value: 8.985226819412189
- type: nauc_recall_at_20_diff1
value: 6.4797098359254175
- type: nauc_recall_at_20_max
value: 15.663700985336124
- type: nauc_recall_at_20_std
value: 17.154099587904913
- type: nauc_recall_at_3_diff1
value: 3.7245972450393507
- type: nauc_recall_at_3_max
value: 0.4063857187240345
- type: nauc_recall_at_3_std
value: 6.641948062821941
- type: nauc_recall_at_5_diff1
value: 4.013879477591466
- type: nauc_recall_at_5_max
value: -1.4266586618013566
- type: nauc_recall_at_5_std
value: 7.311601874411205
- type: ndcg_at_1
value: 75.0
- type: ndcg_at_10
value: 72.18900000000001
- type: ndcg_at_100
value: 54.022999999999996
- type: ndcg_at_1000
value: 49.492000000000004
- type: ndcg_at_20
value: 68.51
- type: ndcg_at_3
value: 73.184
- type: ndcg_at_5
value: 72.811
- type: precision_at_1
value: 82.0
- type: precision_at_10
value: 77.4
- type: precision_at_100
value: 55.24
- type: precision_at_1000
value: 21.822
- type: precision_at_20
value: 73.0
- type: precision_at_3
value: 79.333
- type: precision_at_5
value: 79.2
- type: recall_at_1
value: 0.214
- type: recall_at_10
value: 1.9980000000000002
- type: recall_at_100
value: 13.328999999999999
- type: recall_at_1000
value: 47.204
- type: recall_at_20
value: 3.7310000000000003
- type: recall_at_3
value: 0.628
- type: recall_at_5
value: 1.049
- task:
type: MultilabelClassification
dataset:
name: MTEB CEDRClassification (default)
type: ai-forever/cedr-classification
config: default
split: test
revision: c0ba03d058e3e1b2f3fd20518875a4563dd12db4
metrics:
- type: accuracy
value: 47.30605738575983
- type: f1
value: 41.26091043925065
- type: lrap
value: 72.89452709883206
- type: main_score
value: 47.30605738575983
- task:
type: Reranking
dataset:
name: MTEB MIRACLReranking (ru)
type: miracl/mmteb-miracl-reranking
config: ru
split: dev
revision: 6d1962c527217f8927fca80f890f14f36b2802af
metrics:
- type: MAP@1(MIRACL)
value: 20.721999999999998
- type: MAP@10(MIRACL)
value: 33.900999999999996
- type: MAP@100(MIRACL)
value: 36.813
- type: MAP@1000(MIRACL)
value: 36.813
- type: MAP@20(MIRACL)
value: 35.684
- type: MAP@3(MIRACL)
value: 28.141
- type: MAP@5(MIRACL)
value: 31.075000000000003
- type: NDCG@1(MIRACL)
value: 32.799
- type: NDCG@10(MIRACL)
value: 42.065000000000005
- type: NDCG@100(MIRACL)
value: 49.730999999999995
- type: NDCG@1000(MIRACL)
value: 49.730999999999995
- type: NDCG@20(MIRACL)
value: 46.0
- type: NDCG@3(MIRACL)
value: 34.481
- type: NDCG@5(MIRACL)
value: 37.452999999999996
- type: P@1(MIRACL)
value: 32.799
- type: P@10(MIRACL)
value: 11.668000000000001
- type: P@100(MIRACL)
value: 1.9529999999999998
- type: P@1000(MIRACL)
value: 0.19499999999999998
- type: P@20(MIRACL)
value: 7.51
- type: P@3(MIRACL)
value: 20.823
- type: P@5(MIRACL)
value: 16.728
- type: Recall@1(MIRACL)
value: 20.721999999999998
- type: Recall@10(MIRACL)
value: 54.762
- type: Recall@100(MIRACL)
value: 79.952
- type: Recall@1000(MIRACL)
value: 79.952
- type: Recall@20(MIRACL)
value: 66.26100000000001
- type: Recall@3(MIRACL)
value: 34.410000000000004
- type: Recall@5(MIRACL)
value: 42.659000000000006
- type: main_score
value: 42.065000000000005
- type: nAUC_MAP@1000_diff1(MIRACL)
value: 14.33534992502818
- type: nAUC_MAP@1000_max(MIRACL)
value: 12.367998764646115
- type: nAUC_MAP@1000_std(MIRACL)
value: 4.569686002935006
- type: nAUC_MAP@100_diff1(MIRACL)
value: 14.33534992502818
- type: nAUC_MAP@100_max(MIRACL)
value: 12.367998764646115
- type: nAUC_MAP@100_std(MIRACL)
value: 4.569686002935006
- type: nAUC_MAP@10_diff1(MIRACL)
value: 16.920323975680027
- type: nAUC_MAP@10_max(MIRACL)
value: 9.327171297204082
- type: nAUC_MAP@10_std(MIRACL)
value: 3.2039133783079015
- type: nAUC_MAP@1_diff1(MIRACL)
value: 28.698973487482206
- type: nAUC_MAP@1_max(MIRACL)
value: 2.9217687660885034
- type: nAUC_MAP@1_std(MIRACL)
value: -1.1247408800976524
- type: nAUC_MAP@20_diff1(MIRACL)
value: 15.359083081640476
- type: nAUC_MAP@20_max(MIRACL)
value: 11.310494233946345
- type: nAUC_MAP@20_std(MIRACL)
value: 4.4171898386022885
- type: nAUC_MAP@3_diff1(MIRACL)
value: 22.27430591851617
- type: nAUC_MAP@3_max(MIRACL)
value: 6.407438291284658
- type: nAUC_MAP@3_std(MIRACL)
value: 0.9799184530397409
- type: nAUC_MAP@5_diff1(MIRACL)
value: 19.20571689941054
- type: nAUC_MAP@5_max(MIRACL)
value: 7.987468654026893
- type: nAUC_MAP@5_std(MIRACL)
value: 1.8324246565938962
- type: nAUC_NDCG@1000_diff1(MIRACL)
value: 3.7537669018914768
- type: nAUC_NDCG@1000_max(MIRACL)
value: 20.7944707840533
- type: nAUC_NDCG@1000_std(MIRACL)
value: 8.444837055303063
- type: nAUC_NDCG@100_diff1(MIRACL)
value: 3.7537669018914768
- type: nAUC_NDCG@100_max(MIRACL)
value: 20.7944707840533
- type: nAUC_NDCG@100_std(MIRACL)
value: 8.444837055303063
- type: nAUC_NDCG@10_diff1(MIRACL)
value: 10.829575656103888
- type: nAUC_NDCG@10_max(MIRACL)
value: 13.0445496498929
- type: nAUC_NDCG@10_std(MIRACL)
value: 6.050412212625362
- type: nAUC_NDCG@1_diff1(MIRACL)
value: 19.1388712233292
- type: nAUC_NDCG@1_max(MIRACL)
value: 10.871900994781642
- type: nAUC_NDCG@1_std(MIRACL)
value: 3.218568248751811
- type: nAUC_NDCG@20_diff1(MIRACL)
value: 7.093172181746442
- type: nAUC_NDCG@20_max(MIRACL)
value: 16.955238078958836
- type: nAUC_NDCG@20_std(MIRACL)
value: 8.325656379573035
- type: nAUC_NDCG@3_diff1(MIRACL)
value: 17.134437303330802
- type: nAUC_NDCG@3_max(MIRACL)
value: 10.235328822955793
- type: nAUC_NDCG@3_std(MIRACL)
value: 3.2341358691084814
- type: nAUC_NDCG@5_diff1(MIRACL)
value: 14.733664618337636
- type: nAUC_NDCG@5_max(MIRACL)
value: 11.181897412035282
- type: nAUC_NDCG@5_std(MIRACL)
value: 3.642277088791985
- type: nAUC_P@1000_diff1(MIRACL)
value: -26.330038284867573
- type: nAUC_P@1000_max(MIRACL)
value: 28.450694137240458
- type: nAUC_P@1000_std(MIRACL)
value: 9.892993775474912
- type: nAUC_P@100_diff1(MIRACL)
value: -26.330038284867552
- type: nAUC_P@100_max(MIRACL)
value: 28.45069413724051
- type: nAUC_P@100_std(MIRACL)
value: 9.892993775474928
- type: nAUC_P@10_diff1(MIRACL)
value: -17.436937353231112
- type: nAUC_P@10_max(MIRACL)
value: 24.327018012947857
- type: nAUC_P@10_std(MIRACL)
value: 11.78803527706634
- type: nAUC_P@1_diff1(MIRACL)
value: 19.1388712233292
- type: nAUC_P@1_max(MIRACL)
value: 10.871900994781642
- type: nAUC_P@1_std(MIRACL)
value: 3.218568248751811
- type: nAUC_P@20_diff1(MIRACL)
value: -22.947528755272426
- type: nAUC_P@20_max(MIRACL)
value: 27.773093471902538
- type: nAUC_P@20_std(MIRACL)
value: 14.898619107087221
- type: nAUC_P@3_diff1(MIRACL)
value: 1.4100426412400944
- type: nAUC_P@3_max(MIRACL)
value: 17.397472872058845
- type: nAUC_P@3_std(MIRACL)
value: 8.240008229861875
- type: nAUC_P@5_diff1(MIRACL)
value: -7.971349332207021
- type: nAUC_P@5_max(MIRACL)
value: 22.198441167940963
- type: nAUC_P@5_std(MIRACL)
value: 9.00265164460082
- type: nAUC_Recall@1000_diff1(MIRACL)
value: -38.69835271863148
- type: nAUC_Recall@1000_max(MIRACL)
value: 50.9545152809108
- type: nAUC_Recall@1000_std(MIRACL)
value: 20.44270887092116
- type: nAUC_Recall@100_diff1(MIRACL)
value: -38.69835271863148
- type: nAUC_Recall@100_max(MIRACL)
value: 50.9545152809108
- type: nAUC_Recall@100_std(MIRACL)
value: 20.44270887092116
- type: nAUC_Recall@10_diff1(MIRACL)
value: -0.08109036309433801
- type: nAUC_Recall@10_max(MIRACL)
value: 12.696619907773568
- type: nAUC_Recall@10_std(MIRACL)
value: 8.791982704261589
- type: nAUC_Recall@1_diff1(MIRACL)
value: 28.698973487482206
- type: nAUC_Recall@1_max(MIRACL)
value: 2.9217687660885034
- type: nAUC_Recall@1_std(MIRACL)
value: -1.1247408800976524
- type: nAUC_Recall@20_diff1(MIRACL)
value: -13.312171017942623
- type: nAUC_Recall@20_max(MIRACL)
value: 24.19847346821666
- type: nAUC_Recall@20_std(MIRACL)
value: 15.8157702609797
- type: nAUC_Recall@3_diff1(MIRACL)
value: 16.909128321353343
- type: nAUC_Recall@3_max(MIRACL)
value: 6.552122731902991
- type: nAUC_Recall@3_std(MIRACL)
value: 1.9963898223457228
- type: nAUC_Recall@5_diff1(MIRACL)
value: 9.990292655247721
- type: nAUC_Recall@5_max(MIRACL)
value: 9.361722273507574
- type: nAUC_Recall@5_std(MIRACL)
value: 3.270918827854495
- task:
type: MultilabelClassification
dataset:
name: MTEB SensitiveTopicsClassification (default)
type: ai-forever/sensitive-topics-classification
config: default
split: test
revision: 416b34a802308eac30e4192afc0ff99bb8dcc7f2
metrics:
- type: accuracy
value: 30.634765625
- type: f1
value: 32.647559808678665
- type: lrap
value: 45.94319661458259
- type: main_score
value: 30.634765625
- task:
type: STS
dataset:
name: MTEB ATEC (default)
type: C-MTEB/ATEC
config: default
split: test
revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
metrics:
- type: cosine_pearson
value: 47.541497334563296
- type: cosine_spearman
value: 49.06268944206629
- type: euclidean_pearson
value: 51.838926748581635
- type: euclidean_spearman
value: 48.930697157135356
- type: main_score
value: 49.06268944206629
- type: manhattan_pearson
value: 51.835306769406365
- type: manhattan_spearman
value: 48.86135493444834
- type: pearson
value: 47.541497334563296
- type: spearman
value: 49.06268944206629
- task:
type: Classification
dataset:
name: MTEB AllegroReviews (default)
type: PL-MTEB/allegro-reviews
config: default
split: test
revision: b89853e6de927b0e3bfa8ecc0e56fe4e02ceafc6
metrics:
- type: accuracy
value: 49.51292246520874
- type: f1
value: 44.14350234332397
- type: f1_weighted
value: 51.65508998354552
- type: main_score
value: 49.51292246520874
- task:
type: Clustering
dataset:
name: MTEB AlloProfClusteringP2P (default)
type: lyon-nlp/alloprof
config: default
split: test
revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
metrics:
- type: main_score
value: 63.883383458621665
- type: v_measure
value: 63.883383458621665
- type: v_measure_std
value: 2.693666879958465
- type: main_score
value: 46.85924588755251
- type: v_measure
value: 46.85924588755251
- type: v_measure_std
value: 2.1918258880872377
- task:
type: Clustering
dataset:
name: MTEB 8TagsClustering
type: PL-MTEB/8tags-clustering
config: default
split: test
revision: None
metrics:
- type: v_measure
value: 43.65721212452554
- task:
type: Reranking
dataset:
name: MTEB AlloprofReranking (default)
type: lyon-nlp/mteb-fr-reranking-alloprof-s2p
config: default
split: test
revision: e40c8a63ce02da43200eccb5b0846fcaa888f562
metrics:
- type: map
value: 66.39013753839347
- type: mrr
value: 67.68045617786551
- type: main_score
value: 66.39013753839347
- task:
type: Retrieval
dataset:
name: MTEB AlloprofRetrieval (default)
type: lyon-nlp/alloprof
config: default
split: test
revision: fcf295ea64c750f41fadbaa37b9b861558e1bfbd
metrics:
- type: main_score
value: 54.284
- type: map_at_1
value: 37.047000000000004
- type: map_at_10
value: 48.53
- type: map_at_100
value: 49.357
- type: map_at_1000
value: 49.39
- type: map_at_20
value: 49.064
- type: map_at_3
value: 45.675
- type: map_at_5
value: 47.441
- type: mrr_at_1
value: 37.04663212435233
- type: mrr_at_10
value: 48.5300326232969
- type: mrr_at_100
value: 49.35708199037581
- type: mrr_at_1000
value: 49.39005824603193
- type: mrr_at_20
value: 49.06417416464799
- type: mrr_at_3
value: 45.67501439263105
- type: mrr_at_5
value: 47.44099021301103
- type: nauc_map_at_1000_diff1
value: 43.32474221868009
- type: nauc_map_at_1000_max
value: 39.407334029058575
- type: nauc_map_at_1000_std
value: -2.3728154448932606
- type: nauc_map_at_100_diff1
value: 43.32336300929909
- type: nauc_map_at_100_max
value: 39.432174777554835
- type: nauc_map_at_100_std
value: -2.356396922384349
- type: nauc_map_at_10_diff1
value: 43.1606520154482
- type: nauc_map_at_10_max
value: 39.33734650558226
- type: nauc_map_at_10_std
value: -2.5156222475075256
- type: nauc_map_at_1_diff1
value: 46.2178975214499
- type: nauc_map_at_1_max
value: 36.26173199049361
- type: nauc_map_at_1_std
value: -3.0897555582816443
- type: nauc_map_at_20_diff1
value: 43.272980702916456
- type: nauc_map_at_20_max
value: 39.4896977052276
- type: nauc_map_at_20_std
value: -2.3305501742917043
- type: nauc_map_at_3_diff1
value: 43.49525042967079
- type: nauc_map_at_3_max
value: 38.66352501824728
- type: nauc_map_at_3_std
value: -3.202794391620473
- type: nauc_map_at_5_diff1
value: 43.2266692546611
- type: nauc_map_at_5_max
value: 38.77368661115743
- type: nauc_map_at_5_std
value: -3.0897532130127954
- type: nauc_mrr_at_1000_diff1
value: 43.32474221868009
- type: nauc_mrr_at_1000_max
value: 39.407334029058575
- type: nauc_mrr_at_1000_std
value: -2.3728154448932606
- type: nauc_mrr_at_100_diff1
value: 43.32336300929909
- type: nauc_mrr_at_100_max
value: 39.432174777554835
- type: nauc_mrr_at_100_std
value: -2.356396922384349
- type: nauc_mrr_at_10_diff1
value: 43.1606520154482
- type: nauc_mrr_at_10_max
value: 39.33734650558226
- type: nauc_mrr_at_10_std
value: -2.5156222475075256
- type: nauc_mrr_at_1_diff1
value: 46.2178975214499
- type: nauc_mrr_at_1_max
value: 36.26173199049361
- type: nauc_mrr_at_1_std
value: -3.0897555582816443
- type: nauc_mrr_at_20_diff1
value: 43.272980702916456
- type: nauc_mrr_at_20_max
value: 39.4896977052276
- type: nauc_mrr_at_20_std
value: -2.3305501742917043
- type: nauc_mrr_at_3_diff1
value: 43.49525042967079
- type: nauc_mrr_at_3_max
value: 38.66352501824728
- type: nauc_mrr_at_3_std
value: -3.202794391620473
- type: nauc_mrr_at_5_diff1
value: 43.2266692546611
- type: nauc_mrr_at_5_max
value: 38.77368661115743
- type: nauc_mrr_at_5_std
value: -3.0897532130127954
- type: nauc_ndcg_at_1000_diff1
value: 43.01903168202974
- type: nauc_ndcg_at_1000_max
value: 40.75496622942232
- type: nauc_ndcg_at_1000_std
value: -1.3150412981845496
- type: nauc_ndcg_at_100_diff1
value: 42.98016493758145
- type: nauc_ndcg_at_100_max
value: 41.55869635162325
- type: nauc_ndcg_at_100_std
value: -0.5355252976886055
- type: nauc_ndcg_at_10_diff1
value: 42.218755211347506
- type: nauc_ndcg_at_10_max
value: 41.305042275175765
- type: nauc_ndcg_at_10_std
value: -1.4034484444573714
- type: nauc_ndcg_at_1_diff1
value: 46.2178975214499
- type: nauc_ndcg_at_1_max
value: 36.26173199049361
- type: nauc_ndcg_at_1_std
value: -3.0897555582816443
- type: nauc_ndcg_at_20_diff1
value: 42.66574440095576
- type: nauc_ndcg_at_20_max
value: 42.014620115124515
- type: nauc_ndcg_at_20_std
value: -0.5176162553751498
- type: nauc_ndcg_at_3_diff1
value: 42.837450505106055
- type: nauc_ndcg_at_3_max
value: 39.525369733082414
- type: nauc_ndcg_at_3_std
value: -3.1605948245795155
- type: nauc_ndcg_at_5_diff1
value: 42.37951815451173
- type: nauc_ndcg_at_5_max
value: 39.78840132935179
- type: nauc_ndcg_at_5_std
value: -2.936898430768135
- type: nauc_precision_at_1000_diff1
value: 49.69224988612385
- type: nauc_precision_at_1000_max
value: 79.57897547128005
- type: nauc_precision_at_1000_std
value: 45.040371354764645
- type: nauc_precision_at_100_diff1
value: 42.70597486048422
- type: nauc_precision_at_100_max
value: 65.74628759606188
- type: nauc_precision_at_100_std
value: 25.49157745244855
- type: nauc_precision_at_10_diff1
value: 38.565609931689345
- type: nauc_precision_at_10_max
value: 50.0239696180852
- type: nauc_precision_at_10_std
value: 3.976354829503967
- type: nauc_precision_at_1_diff1
value: 46.2178975214499
- type: nauc_precision_at_1_max
value: 36.26173199049361
- type: nauc_precision_at_1_std
value: -3.0897555582816443
- type: nauc_precision_at_20_diff1
value: 40.4134718566864
- type: nauc_precision_at_20_max
value: 57.121778108665374
- type: nauc_precision_at_20_std
value: 11.46021975428544
- type: nauc_precision_at_3_diff1
value: 40.90538379461529
- type: nauc_precision_at_3_max
value: 42.18393248057992
- type: nauc_precision_at_3_std
value: -3.005249943837297
- type: nauc_precision_at_5_diff1
value: 39.60162965860782
- type: nauc_precision_at_5_max
value: 43.28317158174058
- type: nauc_precision_at_5_std
value: -2.3469094487738054
- type: nauc_recall_at_1000_diff1
value: 49.69224988612252
- type: nauc_recall_at_1000_max
value: 79.57897547127862
- type: nauc_recall_at_1000_std
value: 45.04037135476256
- type: nauc_recall_at_100_diff1
value: 42.70597486048432
- type: nauc_recall_at_100_max
value: 65.74628759606213
- type: nauc_recall_at_100_std
value: 25.491577452448727
- type: nauc_recall_at_10_diff1
value: 38.56560993168935
- type: nauc_recall_at_10_max
value: 50.02396961808522
- type: nauc_recall_at_10_std
value: 3.9763548295040314
- type: nauc_recall_at_1_diff1
value: 46.2178975214499
- type: nauc_recall_at_1_max
value: 36.26173199049361
- type: nauc_recall_at_1_std
value: -3.0897555582816443
- type: nauc_recall_at_20_diff1
value: 40.41347185668637
- type: nauc_recall_at_20_max
value: 57.12177810866533
- type: nauc_recall_at_20_std
value: 11.460219754285431
- type: nauc_recall_at_3_diff1
value: 40.90538379461527
- type: nauc_recall_at_3_max
value: 42.18393248057989
- type: nauc_recall_at_3_std
value: -3.005249943837297
- type: nauc_recall_at_5_diff1
value: 39.601629658607784
- type: nauc_recall_at_5_max
value: 43.28317158174053
- type: nauc_recall_at_5_std
value: -2.3469094487738054
- type: ndcg_at_1
value: 37.047000000000004
- type: ndcg_at_10
value: 54.284
- type: ndcg_at_100
value: 58.34
- type: ndcg_at_1000
value: 59.303
- type: ndcg_at_20
value: 56.235
- type: ndcg_at_3
value: 48.503
- type: ndcg_at_5
value: 51.686
- type: precision_at_1
value: 37.047000000000004
- type: precision_at_10
value: 7.237
- type: precision_at_100
value: 0.914
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.005
- type: precision_at_3
value: 18.898
- type: precision_at_5
value: 12.884
- type: recall_at_1
value: 37.047000000000004
- type: recall_at_10
value: 72.366
- type: recall_at_100
value: 91.408
- type: recall_at_1000
value: 99.136
- type: recall_at_20
value: 80.095
- type: recall_at_3
value: 56.693000000000005
- type: recall_at_5
value: 64.42099999999999
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 89.49253731343283
- type: ap
value: 61.88098616359918
- type: ap_weighted
value: 61.88098616359918
- type: f1
value: 84.76516623679144
- type: f1_weighted
value: 89.92745276292968
- type: main_score
value: 89.49253731343283
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (de)
type: mteb/amazon_counterfactual
config: de
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 89.61456102783727
- type: ap
value: 93.11816566733742
- type: ap_weighted
value: 93.11816566733742
- type: f1
value: 88.27635757733722
- type: f1_weighted
value: 89.82581568285453
- type: main_score
value: 89.61456102783727
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification (default)
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 95.3825
- type: ap
value: 93.393033869502
- type: ap_weighted
value: 93.393033869502
- type: f1
value: 95.38109007966307
- type: f1_weighted
value: 95.38109007966305
- type: main_score
value: 95.3825
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 49.768
- type: f1
value: 48.95084821944411
- type: f1_weighted
value: 48.9508482194441
- type: main_score
value: 49.768
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (de)
type: mteb/amazon_reviews_multi
config: de
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.071999999999996
- type: f1
value: 47.24171107487612
- type: f1_weighted
value: 47.24171107487612
- type: main_score
value: 48.071999999999996
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (es)
type: mteb/amazon_reviews_multi
config: es
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.102000000000004
- type: f1
value: 47.27193805278696
- type: f1_weighted
value: 47.27193805278696
- type: main_score
value: 48.102000000000004
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (fr)
type: mteb/amazon_reviews_multi
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.30800000000001
- type: f1
value: 46.41683358017851
- type: f1_weighted
value: 46.41683358017851
- type: main_score
value: 47.30800000000001
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 44.944
- type: f1
value: 44.223824487744395
- type: f1_weighted
value: 44.22382448774439
- type: main_score
value: 44.944
- task:
type: Retrieval
dataset:
name: MTEB ArguAna (default)
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 29.232000000000003
- type: map_at_10
value: 45.117000000000004
- type: map_at_100
value: 45.977000000000004
- type: map_at_1000
value: 45.98
- type: map_at_20
value: 45.815
- type: map_at_3
value: 39.912
- type: map_at_5
value: 42.693
- type: mrr_at_1
value: 29.659000000000002
- type: mrr_at_10
value: 45.253
- type: mrr_at_100
value: 46.125
- type: mrr_at_1000
value: 46.129
- type: mrr_at_20
value: 45.964
- type: mrr_at_3
value: 40.043
- type: mrr_at_5
value: 42.870000000000005
- type: ndcg_at_1
value: 29.232000000000003
- type: ndcg_at_10
value: 54.327999999999996
- type: ndcg_at_100
value: 57.86
- type: ndcg_at_1000
value: 57.935
- type: ndcg_at_20
value: 56.794
- type: ndcg_at_3
value: 43.516
- type: ndcg_at_5
value: 48.512
- type: precision_at_1
value: 29.232000000000003
- type: precision_at_10
value: 8.393
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.676
- type: precision_at_3
value: 17.994
- type: precision_at_5
value: 13.215
- type: recall_at_1
value: 29.232000000000003
- type: recall_at_10
value: 83.926
- type: recall_at_100
value: 99.075
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 93.528
- type: recall_at_3
value: 53.983000000000004
- type: recall_at_5
value: 66.074
- type: main_score
value: 54.327999999999996
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P (default)
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: main_score
value: 46.6636824632419
- type: v_measure
value: 46.6636824632419
- type: v_measure_std
value: 13.817129140714963
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S (default)
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: main_score
value: 39.271141892800024
- type: v_measure
value: 39.271141892800024
- type: v_measure_std
value: 14.276782483454827
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions (default)
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 65.04363277324629
- type: mrr
value: 78.2372598162072
- type: main_score
value: 65.04363277324629
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking (default)
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.83
- type: main_score
value: 30.83
- task:
type: STS
dataset:
name: MTEB BIOSSES (default)
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cosine_pearson
value: 88.80382082011027
- type: cosine_spearman
value: 88.68876782169106
- type: euclidean_pearson
value: 87.00802890147176
- type: euclidean_spearman
value: 87.43211268192712
- type: main_score
value: 88.68876782169106
- type: manhattan_pearson
value: 87.14062537179474
- type: manhattan_spearman
value: 87.59115245033443
- type: pearson
value: 88.80382082011027
- type: spearman
value: 88.68876782169106
- task:
type: STS
dataset:
name: MTEB BQ (default)
type: C-MTEB/BQ
config: default
split: test
revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
metrics:
- type: cosine_pearson
value: 61.588006604878196
- type: cosine_spearman
value: 63.20615427154465
- type: euclidean_pearson
value: 61.818547092516496
- type: euclidean_spearman
value: 63.21558009151778
- type: main_score
value: 63.20615427154465
- type: manhattan_pearson
value: 61.665588158487616
- type: manhattan_spearman
value: 63.051544488238584
- type: pearson
value: 61.588006604878196
- type: spearman
value: 63.20615427154465
- task:
type: Retrieval
dataset:
name: MTEB BSARDRetrieval (default)
type: maastrichtlawtech/bsard
config: default
split: test
revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59
metrics:
- type: main_score
value: 64.414
- type: map_at_1
value: 14.865
- type: map_at_10
value: 21.605
- type: map_at_100
value: 22.762
- type: map_at_1000
value: 22.854
- type: map_at_20
value: 22.259999999999998
- type: map_at_3
value: 20.119999999999997
- type: map_at_5
value: 20.931
- type: mrr_at_1
value: 14.864864864864865
- type: mrr_at_10
value: 21.605176605176606
- type: mrr_at_100
value: 22.7622306460065
- type: mrr_at_1000
value: 22.85383406410312
- type: mrr_at_20
value: 22.259528463088845
- type: mrr_at_3
value: 20.12012012012012
- type: mrr_at_5
value: 20.930930930930934
- type: nauc_map_at_1000_diff1
value: 17.486265968689338
- type: nauc_map_at_1000_max
value: 22.736799291688836
- type: nauc_map_at_1000_std
value: 9.831687441977147
- type: nauc_map_at_100_diff1
value: 17.50754492049086
- type: nauc_map_at_100_max
value: 22.77693662806787
- type: nauc_map_at_100_std
value: 9.853899509675395
- type: nauc_map_at_10_diff1
value: 17.42133968580952
- type: nauc_map_at_10_max
value: 22.45861793882279
- type: nauc_map_at_10_std
value: 8.964888472915938
- type: nauc_map_at_1_diff1
value: 19.433947086968093
- type: nauc_map_at_1_max
value: 24.75657047550517
- type: nauc_map_at_1_std
value: 15.122329157218505
- type: nauc_map_at_20_diff1
value: 17.429856756008785
- type: nauc_map_at_20_max
value: 22.438850987431017
- type: nauc_map_at_20_std
value: 9.172746012213558
- type: nauc_map_at_3_diff1
value: 18.218182689678475
- type: nauc_map_at_3_max
value: 23.57169444088667
- type: nauc_map_at_3_std
value: 10.464473559366356
- type: nauc_map_at_5_diff1
value: 18.6075342519133
- type: nauc_map_at_5_max
value: 23.308845973576673
- type: nauc_map_at_5_std
value: 9.364009996445652
- type: nauc_mrr_at_1000_diff1
value: 17.486265968689338
- type: nauc_mrr_at_1000_max
value: 22.736799291688836
- type: nauc_mrr_at_1000_std
value: 9.831687441977147
- type: nauc_mrr_at_100_diff1
value: 17.50754492049086
- type: nauc_mrr_at_100_max
value: 22.77693662806787
- type: nauc_mrr_at_100_std
value: 9.853899509675395
- type: nauc_mrr_at_10_diff1
value: 17.42133968580952
- type: nauc_mrr_at_10_max
value: 22.45861793882279
- type: nauc_mrr_at_10_std
value: 8.964888472915938
- type: nauc_mrr_at_1_diff1
value: 19.433947086968093
- type: nauc_mrr_at_1_max
value: 24.75657047550517
- type: nauc_mrr_at_1_std
value: 15.122329157218505
- type: nauc_mrr_at_20_diff1
value: 17.429856756008785
- type: nauc_mrr_at_20_max
value: 22.438850987431017
- type: nauc_mrr_at_20_std
value: 9.172746012213558
- type: nauc_mrr_at_3_diff1
value: 18.218182689678475
- type: nauc_mrr_at_3_max
value: 23.57169444088667
- type: nauc_mrr_at_3_std
value: 10.464473559366356
- type: nauc_mrr_at_5_diff1
value: 18.6075342519133
- type: nauc_mrr_at_5_max
value: 23.308845973576673
- type: nauc_mrr_at_5_std
value: 9.364009996445652
- type: nauc_ndcg_at_1000_diff1
value: 16.327871824135745
- type: nauc_ndcg_at_1000_max
value: 23.308241052911495
- type: nauc_ndcg_at_1000_std
value: 11.50905911184097
- type: nauc_ndcg_at_100_diff1
value: 16.676226744692773
- type: nauc_ndcg_at_100_max
value: 24.323253721240974
- type: nauc_ndcg_at_100_std
value: 11.952612443651557
- type: nauc_ndcg_at_10_diff1
value: 16.030325121764594
- type: nauc_ndcg_at_10_max
value: 21.306799242079542
- type: nauc_ndcg_at_10_std
value: 6.63359364302513
- type: nauc_ndcg_at_1_diff1
value: 19.433947086968093
- type: nauc_ndcg_at_1_max
value: 24.75657047550517
- type: nauc_ndcg_at_1_std
value: 15.122329157218505
- type: nauc_ndcg_at_20_diff1
value: 16.013173605999857
- type: nauc_ndcg_at_20_max
value: 21.607217260736576
- type: nauc_ndcg_at_20_std
value: 7.319482417138996
- type: nauc_ndcg_at_3_diff1
value: 17.97958548328493
- type: nauc_ndcg_at_3_max
value: 23.58346522810145
- type: nauc_ndcg_at_3_std
value: 9.392582854708314
- type: nauc_ndcg_at_5_diff1
value: 18.734733324685287
- type: nauc_ndcg_at_5_max
value: 23.273244317623742
- type: nauc_ndcg_at_5_std
value: 7.638611545253834
- type: nauc_precision_at_1000_diff1
value: 7.919843339380295
- type: nauc_precision_at_1000_max
value: 31.575386234270486
- type: nauc_precision_at_1000_std
value: 39.332224386769404
- type: nauc_precision_at_100_diff1
value: 15.018050960000052
- type: nauc_precision_at_100_max
value: 34.98209513759861
- type: nauc_precision_at_100_std
value: 26.970034484359022
- type: nauc_precision_at_10_diff1
value: 12.102191084210922
- type: nauc_precision_at_10_max
value: 18.112541150340675
- type: nauc_precision_at_10_std
value: 0.7358784689406018
- type: nauc_precision_at_1_diff1
value: 19.433947086968093
- type: nauc_precision_at_1_max
value: 24.75657047550517
- type: nauc_precision_at_1_std
value: 15.122329157218505
- type: nauc_precision_at_20_diff1
value: 12.018814361204328
- type: nauc_precision_at_20_max
value: 19.75123746049928
- type: nauc_precision_at_20_std
value: 3.012204650582264
- type: nauc_precision_at_3_diff1
value: 17.41375604940955
- type: nauc_precision_at_3_max
value: 23.699834627021037
- type: nauc_precision_at_3_std
value: 6.793486779050103
- type: nauc_precision_at_5_diff1
value: 19.194631963780257
- type: nauc_precision_at_5_max
value: 23.31708702442155
- type: nauc_precision_at_5_std
value: 3.4591358279667332
- type: nauc_recall_at_1000_diff1
value: 7.919843339380378
- type: nauc_recall_at_1000_max
value: 31.57538623427063
- type: nauc_recall_at_1000_std
value: 39.332224386769546
- type: nauc_recall_at_100_diff1
value: 15.018050960000085
- type: nauc_recall_at_100_max
value: 34.9820951375986
- type: nauc_recall_at_100_std
value: 26.97003448435901
- type: nauc_recall_at_10_diff1
value: 12.102191084210837
- type: nauc_recall_at_10_max
value: 18.112541150340594
- type: nauc_recall_at_10_std
value: 0.7358784689405188
- type: nauc_recall_at_1_diff1
value: 19.433947086968093
- type: nauc_recall_at_1_max
value: 24.75657047550517
- type: nauc_recall_at_1_std
value: 15.122329157218505
- type: nauc_recall_at_20_diff1
value: 12.01881436120429
- type: nauc_recall_at_20_max
value: 19.751237460499222
- type: nauc_recall_at_20_std
value: 3.0122046505822135
- type: nauc_recall_at_3_diff1
value: 17.413756049409503
- type: nauc_recall_at_3_max
value: 23.699834627020998
- type: nauc_recall_at_3_std
value: 6.793486779050083
- type: nauc_recall_at_5_diff1
value: 19.194631963780203
- type: nauc_recall_at_5_max
value: 23.3170870244215
- type: nauc_recall_at_5_std
value: 3.459135827966664
- type: ndcg_at_1
value: 14.865
- type: ndcg_at_10
value: 24.764
- type: ndcg_at_100
value: 30.861
- type: ndcg_at_1000
value: 33.628
- type: ndcg_at_20
value: 27.078000000000003
- type: ndcg_at_3
value: 21.675
- type: ndcg_at_5
value: 23.148
- type: precision_at_1
value: 14.865
- type: precision_at_10
value: 3.4680000000000004
- type: precision_at_100
value: 0.644
- type: precision_at_1000
value: 0.087
- type: precision_at_20
value: 2.185
- type: precision_at_3
value: 8.709
- type: precision_at_5
value: 5.946
- type: recall_at_1
value: 14.865
- type: recall_at_10
value: 34.685
- type: recall_at_100
value: 64.414
- type: recall_at_1000
value: 86.937
- type: recall_at_20
value: 43.694
- type: recall_at_3
value: 26.125999999999998
- type: recall_at_5
value: 29.73
- task:
type: Classification
dataset:
name: MTEB Banking77Classification (default)
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.08116883116882
- type: f1
value: 84.05587055990273
- type: f1_weighted
value: 84.05587055990274
- type: main_score
value: 84.08116883116882
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P (default)
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: main_score
value: 38.1941007822277
- type: v_measure
value: 38.1941007822277
- type: v_measure_std
value: 0.7502113547288178
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S (default)
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: main_score
value: 34.42075599178318
- type: v_measure
value: 34.42075599178318
- type: v_measure_std
value: 0.600256720497283
- task:
type: Clustering
dataset:
name: MTEB BlurbsClusteringP2P (default)
type: slvnwhrl/blurbs-clustering-p2p
config: default
split: test
revision: a2dd5b02a77de3466a3eaa98ae586b5610314496
metrics:
- type: main_score
value: 41.634627363047265
- type: v_measure
value: 41.634627363047265
- type: v_measure_std
value: 9.726923191225307
- task:
type: Clustering
dataset:
name: MTEB BlurbsClusteringS2S (default)
type: slvnwhrl/blurbs-clustering-s2s
config: default
split: test
revision: 22793b6a6465bf00120ad525e38c51210858132c
metrics:
- type: main_score
value: 20.996468295584197
- type: v_measure
value: 20.996468295584197
- type: v_measure_std
value: 9.225766688272197
- task:
type: Classification
dataset:
name: MTEB CBD (default)
type: PL-MTEB/cbd
config: default
split: test
revision: 36ddb419bcffe6a5374c3891957912892916f28d
metrics:
- type: accuracy
value: 69.99
- type: ap
value: 22.57826353116948
- type: ap_weighted
value: 22.57826353116948
- type: f1
value: 59.04574955548393
- type: f1_weighted
value: 74.36235022309789
- type: main_score
value: 69.99
- task:
type: PairClassification
dataset:
name: MTEB CDSC-E (default)
type: PL-MTEB/cdsce-pairclassification
config: default
split: test
revision: 0a3d4aa409b22f80eb22cbf59b492637637b536d
metrics:
- type: cosine_accuracy
value: 88.7
- type: cosine_accuracy_threshold
value: 97.37848043441772
- type: cosine_ap
value: 73.0405088928302
- type: cosine_f1
value: 63.52201257861635
- type: cosine_f1_threshold
value: 96.98888063430786
- type: cosine_precision
value: 78.90625
- type: cosine_recall
value: 53.1578947368421
- type: dot_accuracy
value: 84.89999999999999
- type: dot_accuracy_threshold
value: 43603.09753417969
- type: dot_ap
value: 56.98157569085279
- type: dot_f1
value: 57.606490872210955
- type: dot_f1_threshold
value: 40406.23779296875
- type: dot_precision
value: 46.864686468646866
- type: dot_recall
value: 74.73684210526315
- type: euclidean_accuracy
value: 88.5
- type: euclidean_accuracy_threshold
value: 498.0483055114746
- type: euclidean_ap
value: 72.97328234816734
- type: euclidean_f1
value: 63.722397476340696
- type: euclidean_f1_threshold
value: 508.6186408996582
- type: euclidean_precision
value: 79.52755905511812
- type: euclidean_recall
value: 53.1578947368421
- type: main_score
value: 73.0405088928302
- type: manhattan_accuracy
value: 88.6
- type: manhattan_accuracy_threshold
value: 12233.079528808594
- type: manhattan_ap
value: 72.92148503992615
- type: manhattan_f1
value: 63.69426751592356
- type: manhattan_f1_threshold
value: 12392.754364013672
- type: manhattan_precision
value: 80.64516129032258
- type: manhattan_recall
value: 52.63157894736842
- type: max_accuracy
value: 88.7
- type: max_ap
value: 73.0405088928302
- type: max_f1
value: 63.722397476340696
- type: max_precision
value: 80.64516129032258
- type: max_recall
value: 74.73684210526315
- type: similarity_accuracy
value: 88.7
- type: similarity_accuracy_threshold
value: 97.37848043441772
- type: similarity_ap
value: 73.0405088928302
- type: similarity_f1
value: 63.52201257861635
- type: similarity_f1_threshold
value: 96.98888063430786
- type: similarity_precision
value: 78.90625
- type: similarity_recall
value: 53.1578947368421
- task:
type: STS
dataset:
name: MTEB CDSC-R (default)
type: PL-MTEB/cdscr-sts
config: default
split: test
revision: 1cd6abbb00df7d14be3dbd76a7dcc64b3a79a7cd
metrics:
- type: cosine_pearson
value: 92.97492495289738
- type: cosine_spearman
value: 92.63248098608472
- type: euclidean_pearson
value: 92.04712487782031
- type: euclidean_spearman
value: 92.19679486755008
- type: main_score
value: 92.63248098608472
- type: manhattan_pearson
value: 92.0101187740438
- type: manhattan_spearman
value: 92.20926859332754
- type: pearson
value: 92.97492495289738
- type: spearman
value: 92.63248098608472
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringP2P (default)
type: C-MTEB/CLSClusteringP2P
config: default
split: test
revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
metrics:
- type: main_score
value: 39.96377851800628
- type: v_measure
value: 39.96377851800628
- type: v_measure_std
value: 0.9793033243093288
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringS2S (default)
type: C-MTEB/CLSClusteringS2S
config: default
split: test
revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
metrics:
- type: main_score
value: 38.788850224595784
- type: v_measure
value: 38.788850224595784
- type: v_measure_std
value: 1.0712604145916924
- task:
type: Reranking
dataset:
name: MTEB CMedQAv1
type: C-MTEB/CMedQAv1-reranking
config: default
split: test
revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
metrics:
- type: map
value: 77.95952507806115
- type: mrr
value: 80.8643253968254
- type: main_score
value: 77.95952507806115
- task:
type: Reranking
dataset:
name: MTEB CMedQAv2
type: C-MTEB/CMedQAv2-reranking
config: default
split: test
revision: 23d186750531a14a0357ca22cd92d712fd512ea0
metrics:
- type: map
value: 78.21522500165045
- type: mrr
value: 81.28194444444443
- type: main_score
value: 78.21522500165045
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval (default)
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 33.377
- type: map_at_10
value: 46.371
- type: map_at_100
value: 47.829
- type: map_at_1000
value: 47.94
- type: map_at_20
value: 47.205000000000005
- type: map_at_3
value: 42.782
- type: map_at_5
value: 44.86
- type: mrr_at_1
value: 41.345
- type: mrr_at_10
value: 52.187
- type: mrr_at_100
value: 52.893
- type: mrr_at_1000
value: 52.929
- type: mrr_at_20
value: 52.637
- type: mrr_at_3
value: 49.714000000000006
- type: mrr_at_5
value: 51.373000000000005
- type: ndcg_at_1
value: 41.345
- type: ndcg_at_10
value: 52.946000000000005
- type: ndcg_at_100
value: 57.92699999999999
- type: ndcg_at_1000
value: 59.609
- type: ndcg_at_20
value: 54.900999999999996
- type: ndcg_at_3
value: 48.357
- type: ndcg_at_5
value: 50.739000000000004
- type: precision_at_1
value: 41.345
- type: precision_at_10
value: 10.186
- type: precision_at_100
value: 1.554
- type: precision_at_1000
value: 0.2
- type: precision_at_20
value: 5.959
- type: precision_at_3
value: 23.796
- type: precision_at_5
value: 17.024
- type: recall_at_1
value: 33.377
- type: recall_at_10
value: 65.067
- type: recall_at_100
value: 86.04899999999999
- type: recall_at_1000
value: 96.54899999999999
- type: recall_at_20
value: 72.071
- type: recall_at_3
value: 51.349999999999994
- type: recall_at_5
value: 58.41
- type: main_score
value: 52.946000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval (default)
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 31.097
- type: map_at_10
value: 42.183
- type: map_at_100
value: 43.580999999999996
- type: map_at_1000
value: 43.718
- type: map_at_20
value: 42.921
- type: map_at_3
value: 38.963
- type: map_at_5
value: 40.815
- type: mrr_at_1
value: 39.745000000000005
- type: mrr_at_10
value: 48.736000000000004
- type: mrr_at_100
value: 49.405
- type: mrr_at_1000
value: 49.452
- type: mrr_at_20
value: 49.118
- type: mrr_at_3
value: 46.497
- type: mrr_at_5
value: 47.827999999999996
- type: ndcg_at_1
value: 39.745000000000005
- type: ndcg_at_10
value: 48.248000000000005
- type: ndcg_at_100
value: 52.956
- type: ndcg_at_1000
value: 54.99699999999999
- type: ndcg_at_20
value: 50.01
- type: ndcg_at_3
value: 43.946000000000005
- type: ndcg_at_5
value: 46.038000000000004
- type: precision_at_1
value: 39.745000000000005
- type: precision_at_10
value: 9.229
- type: precision_at_100
value: 1.5070000000000001
- type: precision_at_1000
value: 0.199
- type: precision_at_20
value: 5.489999999999999
- type: precision_at_3
value: 21.38
- type: precision_at_5
value: 15.274
- type: recall_at_1
value: 31.097
- type: recall_at_10
value: 58.617
- type: recall_at_100
value: 78.55199999999999
- type: recall_at_1000
value: 91.13900000000001
- type: recall_at_20
value: 64.92
- type: recall_at_3
value: 45.672000000000004
- type: recall_at_5
value: 51.669
- type: main_score
value: 48.248000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval (default)
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 39.745000000000005
- type: map_at_10
value: 52.063
- type: map_at_100
value: 53.077
- type: map_at_1000
value: 53.13
- type: map_at_20
value: 52.66
- type: map_at_3
value: 48.662
- type: map_at_5
value: 50.507000000000005
- type: mrr_at_1
value: 45.391999999999996
- type: mrr_at_10
value: 55.528
- type: mrr_at_100
value: 56.16100000000001
- type: mrr_at_1000
value: 56.192
- type: mrr_at_20
value: 55.923
- type: mrr_at_3
value: 52.93600000000001
- type: mrr_at_5
value: 54.435
- type: ndcg_at_1
value: 45.391999999999996
- type: ndcg_at_10
value: 58.019
- type: ndcg_at_100
value: 61.936
- type: ndcg_at_1000
value: 63.015
- type: ndcg_at_20
value: 59.691
- type: ndcg_at_3
value: 52.294
- type: ndcg_at_5
value: 55.017
- type: precision_at_1
value: 45.391999999999996
- type: precision_at_10
value: 9.386
- type: precision_at_100
value: 1.232
- type: precision_at_1000
value: 0.136
- type: precision_at_20
value: 5.223
- type: precision_at_3
value: 23.177
- type: precision_at_5
value: 15.9
- type: recall_at_1
value: 39.745000000000005
- type: recall_at_10
value: 72.08099999999999
- type: recall_at_100
value: 88.85300000000001
- type: recall_at_1000
value: 96.569
- type: recall_at_20
value: 78.203
- type: recall_at_3
value: 56.957
- type: recall_at_5
value: 63.63100000000001
- type: main_score
value: 58.019
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval (default)
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 26.651999999999997
- type: map_at_10
value: 35.799
- type: map_at_100
value: 36.846000000000004
- type: map_at_1000
value: 36.931000000000004
- type: map_at_20
value: 36.341
- type: map_at_3
value: 32.999
- type: map_at_5
value: 34.597
- type: mrr_at_1
value: 28.814
- type: mrr_at_10
value: 37.869
- type: mrr_at_100
value: 38.728
- type: mrr_at_1000
value: 38.795
- type: mrr_at_20
value: 38.317
- type: mrr_at_3
value: 35.235
- type: mrr_at_5
value: 36.738
- type: ndcg_at_1
value: 28.814
- type: ndcg_at_10
value: 41.028
- type: ndcg_at_100
value: 46.162
- type: ndcg_at_1000
value: 48.15
- type: ndcg_at_20
value: 42.824
- type: ndcg_at_3
value: 35.621
- type: ndcg_at_5
value: 38.277
- type: precision_at_1
value: 28.814
- type: precision_at_10
value: 6.361999999999999
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_20
value: 3.6159999999999997
- type: precision_at_3
value: 15.140999999999998
- type: precision_at_5
value: 10.712000000000002
- type: recall_at_1
value: 26.651999999999997
- type: recall_at_10
value: 55.038
- type: recall_at_100
value: 78.806
- type: recall_at_1000
value: 93.485
- type: recall_at_20
value: 61.742
- type: recall_at_3
value: 40.682
- type: recall_at_5
value: 46.855000000000004
- type: main_score
value: 41.028
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval (default)
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 17.627000000000002
- type: map_at_10
value: 26.436999999999998
- type: map_at_100
value: 27.85
- type: map_at_1000
value: 27.955999999999996
- type: map_at_20
value: 27.233
- type: map_at_3
value: 23.777
- type: map_at_5
value: 25.122
- type: mrr_at_1
value: 22.387999999999998
- type: mrr_at_10
value: 31.589
- type: mrr_at_100
value: 32.641999999999996
- type: mrr_at_1000
value: 32.696999999999996
- type: mrr_at_20
value: 32.201
- type: mrr_at_3
value: 28.98
- type: mrr_at_5
value: 30.342000000000002
- type: ndcg_at_1
value: 22.387999999999998
- type: ndcg_at_10
value: 32.129999999999995
- type: ndcg_at_100
value: 38.562999999999995
- type: ndcg_at_1000
value: 40.903
- type: ndcg_at_20
value: 34.652
- type: ndcg_at_3
value: 27.26
- type: ndcg_at_5
value: 29.235
- type: precision_at_1
value: 22.387999999999998
- type: precision_at_10
value: 5.970000000000001
- type: precision_at_100
value: 1.068
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_20
value: 3.6999999999999997
- type: precision_at_3
value: 13.267000000000001
- type: precision_at_5
value: 9.403
- type: recall_at_1
value: 17.627000000000002
- type: recall_at_10
value: 44.71
- type: recall_at_100
value: 72.426
- type: recall_at_1000
value: 88.64699999999999
- type: recall_at_20
value: 53.65
- type: recall_at_3
value: 30.989
- type: recall_at_5
value: 36.237
- type: main_score
value: 32.129999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval (default)
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 30.891000000000002
- type: map_at_10
value: 41.519
- type: map_at_100
value: 42.896
- type: map_at_1000
value: 42.992999999999995
- type: map_at_20
value: 42.287
- type: map_at_3
value: 37.822
- type: map_at_5
value: 39.976
- type: mrr_at_1
value: 37.921
- type: mrr_at_10
value: 47.260999999999996
- type: mrr_at_100
value: 48.044
- type: mrr_at_1000
value: 48.08
- type: mrr_at_20
value: 47.699999999999996
- type: mrr_at_3
value: 44.513999999999996
- type: mrr_at_5
value: 46.064
- type: ndcg_at_1
value: 37.921
- type: ndcg_at_10
value: 47.806
- type: ndcg_at_100
value: 53.274
- type: ndcg_at_1000
value: 55.021
- type: ndcg_at_20
value: 49.973
- type: ndcg_at_3
value: 42.046
- type: ndcg_at_5
value: 44.835
- type: precision_at_1
value: 37.921
- type: precision_at_10
value: 8.767999999999999
- type: precision_at_100
value: 1.353
- type: precision_at_1000
value: 0.168
- type: precision_at_20
value: 5.135
- type: precision_at_3
value: 20.051
- type: precision_at_5
value: 14.398
- type: recall_at_1
value: 30.891000000000002
- type: recall_at_10
value: 60.897999999999996
- type: recall_at_100
value: 83.541
- type: recall_at_1000
value: 94.825
- type: recall_at_20
value: 68.356
- type: recall_at_3
value: 44.65
- type: recall_at_5
value: 51.919000000000004
- type: main_score
value: 47.806
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval (default)
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 27.654
- type: map_at_10
value: 38.025999999999996
- type: map_at_100
value: 39.425
- type: map_at_1000
value: 39.528
- type: map_at_20
value: 38.838
- type: map_at_3
value: 34.745
- type: map_at_5
value: 36.537
- type: mrr_at_1
value: 34.018
- type: mrr_at_10
value: 43.314
- type: mrr_at_100
value: 44.283
- type: mrr_at_1000
value: 44.327
- type: mrr_at_20
value: 43.929
- type: mrr_at_3
value: 40.868
- type: mrr_at_5
value: 42.317
- type: ndcg_at_1
value: 34.018
- type: ndcg_at_10
value: 43.887
- type: ndcg_at_100
value: 49.791000000000004
- type: ndcg_at_1000
value: 51.834
- type: ndcg_at_20
value: 46.376
- type: ndcg_at_3
value: 38.769999999999996
- type: ndcg_at_5
value: 41.144
- type: precision_at_1
value: 34.018
- type: precision_at_10
value: 8.001999999999999
- type: precision_at_100
value: 1.2630000000000001
- type: precision_at_1000
value: 0.16
- type: precision_at_20
value: 4.737
- type: precision_at_3
value: 18.417
- type: precision_at_5
value: 13.150999999999998
- type: recall_at_1
value: 27.654
- type: recall_at_10
value: 56.111
- type: recall_at_100
value: 81.136
- type: recall_at_1000
value: 94.788
- type: recall_at_20
value: 65.068
- type: recall_at_3
value: 41.713
- type: recall_at_5
value: 48.106
- type: main_score
value: 43.887
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval (default)
type: CQADupstackRetrieval_is_a_combined_dataset
config: default
split: test
revision: CQADupstackRetrieval_is_a_combined_dataset
metrics:
- type: main_score
value: 42.58858333333333
- type: ndcg_at_10
value: 42.58858333333333
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval (default)
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 24.501
- type: map_at_10
value: 32.814
- type: map_at_100
value: 33.754
- type: map_at_1000
value: 33.859
- type: map_at_20
value: 33.324
- type: map_at_3
value: 30.758000000000003
- type: map_at_5
value: 31.936999999999998
- type: mrr_at_1
value: 27.761000000000003
- type: mrr_at_10
value: 35.662
- type: mrr_at_100
value: 36.443999999999996
- type: mrr_at_1000
value: 36.516999999999996
- type: mrr_at_20
value: 36.085
- type: mrr_at_3
value: 33.742
- type: mrr_at_5
value: 34.931
- type: ndcg_at_1
value: 27.761000000000003
- type: ndcg_at_10
value: 37.208000000000006
- type: ndcg_at_100
value: 41.839
- type: ndcg_at_1000
value: 44.421
- type: ndcg_at_20
value: 38.917
- type: ndcg_at_3
value: 33.544000000000004
- type: ndcg_at_5
value: 35.374
- type: precision_at_1
value: 27.761000000000003
- type: precision_at_10
value: 5.92
- type: precision_at_100
value: 0.899
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 3.4130000000000003
- type: precision_at_3
value: 15.031
- type: precision_at_5
value: 10.306999999999999
- type: recall_at_1
value: 24.501
- type: recall_at_10
value: 47.579
- type: recall_at_100
value: 69.045
- type: recall_at_1000
value: 88.032
- type: recall_at_20
value: 54.125
- type: recall_at_3
value: 37.202
- type: recall_at_5
value: 41.927
- type: main_score
value: 37.208000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval (default)
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 18.29
- type: map_at_10
value: 26.183
- type: map_at_100
value: 27.351999999999997
- type: map_at_1000
value: 27.483999999999998
- type: map_at_20
value: 26.798
- type: map_at_3
value: 23.629
- type: map_at_5
value: 24.937
- type: mrr_at_1
value: 22.299
- type: mrr_at_10
value: 30.189
- type: mrr_at_100
value: 31.098
- type: mrr_at_1000
value: 31.177
- type: mrr_at_20
value: 30.697000000000003
- type: mrr_at_3
value: 27.862
- type: mrr_at_5
value: 29.066
- type: ndcg_at_1
value: 22.299
- type: ndcg_at_10
value: 31.202
- type: ndcg_at_100
value: 36.617
- type: ndcg_at_1000
value: 39.544000000000004
- type: ndcg_at_20
value: 33.177
- type: ndcg_at_3
value: 26.639000000000003
- type: ndcg_at_5
value: 28.526
- type: precision_at_1
value: 22.299
- type: precision_at_10
value: 5.8020000000000005
- type: precision_at_100
value: 1.0070000000000001
- type: precision_at_1000
value: 0.14400000000000002
- type: precision_at_20
value: 3.505
- type: precision_at_3
value: 12.698
- type: precision_at_5
value: 9.174
- type: recall_at_1
value: 18.29
- type: recall_at_10
value: 42.254999999999995
- type: recall_at_100
value: 66.60000000000001
- type: recall_at_1000
value: 87.31400000000001
- type: recall_at_20
value: 49.572
- type: recall_at_3
value: 29.342000000000002
- type: recall_at_5
value: 34.221000000000004
- type: main_score
value: 31.202
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval (default)
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 27.722
- type: map_at_10
value: 37.698
- type: map_at_100
value: 38.899
- type: map_at_1000
value: 38.998
- type: map_at_20
value: 38.381
- type: map_at_3
value: 34.244
- type: map_at_5
value: 36.295
- type: mrr_at_1
value: 32.183
- type: mrr_at_10
value: 41.429
- type: mrr_at_100
value: 42.308
- type: mrr_at_1000
value: 42.358000000000004
- type: mrr_at_20
value: 41.957
- type: mrr_at_3
value: 38.401999999999994
- type: mrr_at_5
value: 40.294999999999995
- type: ndcg_at_1
value: 32.183
- type: ndcg_at_10
value: 43.519000000000005
- type: ndcg_at_100
value: 48.786
- type: ndcg_at_1000
value: 50.861999999999995
- type: ndcg_at_20
value: 45.654
- type: ndcg_at_3
value: 37.521
- type: ndcg_at_5
value: 40.615
- type: precision_at_1
value: 32.183
- type: precision_at_10
value: 7.603
- type: precision_at_100
value: 1.135
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_20
value: 4.408
- type: precision_at_3
value: 17.071
- type: precision_at_5
value: 12.668
- type: recall_at_1
value: 27.722
- type: recall_at_10
value: 57.230000000000004
- type: recall_at_100
value: 79.97999999999999
- type: recall_at_1000
value: 94.217
- type: recall_at_20
value: 64.864
- type: recall_at_3
value: 41.215
- type: recall_at_5
value: 48.774
- type: main_score
value: 43.519000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval (default)
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 25.852999999999998
- type: map_at_10
value: 35.394999999999996
- type: map_at_100
value: 37.291999999999994
- type: map_at_1000
value: 37.495
- type: map_at_20
value: 36.372
- type: map_at_3
value: 32.336
- type: map_at_5
value: 34.159
- type: mrr_at_1
value: 31.818
- type: mrr_at_10
value: 40.677
- type: mrr_at_100
value: 41.728
- type: mrr_at_1000
value: 41.778
- type: mrr_at_20
value: 41.301
- type: mrr_at_3
value: 38.208
- type: mrr_at_5
value: 39.592
- type: ndcg_at_1
value: 31.818
- type: ndcg_at_10
value: 41.559000000000005
- type: ndcg_at_100
value: 48.012
- type: ndcg_at_1000
value: 50.234
- type: ndcg_at_20
value: 44.15
- type: ndcg_at_3
value: 36.918
- type: ndcg_at_5
value: 39.227000000000004
- type: precision_at_1
value: 31.818
- type: precision_at_10
value: 8.043
- type: precision_at_100
value: 1.625
- type: precision_at_1000
value: 0.245
- type: precision_at_20
value: 5.2170000000000005
- type: precision_at_3
value: 17.655
- type: precision_at_5
value: 12.845999999999998
- type: recall_at_1
value: 25.852999999999998
- type: recall_at_10
value: 53.093
- type: recall_at_100
value: 81.05799999999999
- type: recall_at_1000
value: 94.657
- type: recall_at_20
value: 62.748000000000005
- type: recall_at_3
value: 39.300000000000004
- type: recall_at_5
value: 45.754
- type: main_score
value: 41.559000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval (default)
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 19.23
- type: map_at_10
value: 28.128999999999998
- type: map_at_100
value: 29.195
- type: map_at_1000
value: 29.310000000000002
- type: map_at_20
value: 28.713
- type: map_at_3
value: 25.191000000000003
- type: map_at_5
value: 26.69
- type: mrr_at_1
value: 21.257
- type: mrr_at_10
value: 30.253999999999998
- type: mrr_at_100
value: 31.195
- type: mrr_at_1000
value: 31.270999999999997
- type: mrr_at_20
value: 30.747999999999998
- type: mrr_at_3
value: 27.633999999999997
- type: mrr_at_5
value: 28.937
- type: ndcg_at_1
value: 21.257
- type: ndcg_at_10
value: 33.511
- type: ndcg_at_100
value: 38.733000000000004
- type: ndcg_at_1000
value: 41.489
- type: ndcg_at_20
value: 35.476
- type: ndcg_at_3
value: 27.845
- type: ndcg_at_5
value: 30.264999999999997
- type: precision_at_1
value: 21.257
- type: precision_at_10
value: 5.619
- type: precision_at_100
value: 0.893
- type: precision_at_1000
value: 0.124
- type: precision_at_20
value: 3.29
- type: precision_at_3
value: 12.508
- type: precision_at_5
value: 8.946
- type: recall_at_1
value: 19.23
- type: recall_at_10
value: 48.185
- type: recall_at_100
value: 71.932
- type: recall_at_1000
value: 92.587
- type: recall_at_20
value: 55.533
- type: recall_at_3
value: 32.865
- type: recall_at_5
value: 38.577
- type: main_score
value: 33.511
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER (default)
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 19.594
- type: map_at_10
value: 32.519
- type: map_at_100
value: 34.1
- type: map_at_1000
value: 34.263
- type: map_at_20
value: 33.353
- type: map_at_3
value: 27.898
- type: map_at_5
value: 30.524
- type: mrr_at_1
value: 46.515
- type: mrr_at_10
value: 56.958
- type: mrr_at_100
value: 57.54899999999999
- type: mrr_at_1000
value: 57.574999999999996
- type: mrr_at_20
value: 57.315000000000005
- type: mrr_at_3
value: 54.852999999999994
- type: mrr_at_5
value: 56.153
- type: ndcg_at_1
value: 46.515
- type: ndcg_at_10
value: 42.363
- type: ndcg_at_100
value: 48.233
- type: ndcg_at_1000
value: 50.993
- type: ndcg_at_20
value: 44.533
- type: ndcg_at_3
value: 37.297000000000004
- type: ndcg_at_5
value: 38.911
- type: precision_at_1
value: 46.515
- type: precision_at_10
value: 12.520999999999999
- type: precision_at_100
value: 1.8980000000000001
- type: precision_at_1000
value: 0.242
- type: precision_at_20
value: 7.212000000000001
- type: precision_at_3
value: 27.752
- type: precision_at_5
value: 20.391000000000002
- type: recall_at_1
value: 19.594
- type: recall_at_10
value: 46.539
- type: recall_at_100
value: 66.782
- type: recall_at_1000
value: 82.049
- type: recall_at_20
value: 52.611
- type: recall_at_3
value: 32.528
- type: recall_at_5
value: 38.933
- type: main_score
value: 42.363
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval (default)
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: main_score
value: 35.927
- type: map_at_1
value: 20.144000000000002
- type: map_at_10
value: 29.94
- type: map_at_100
value: 31.630000000000003
- type: map_at_1000
value: 31.778000000000002
- type: map_at_20
value: 30.798
- type: map_at_3
value: 26.534999999999997
- type: map_at_5
value: 28.33
- type: mrr_at_1
value: 31.23280820205051
- type: mrr_at_10
value: 38.66781179421835
- type: mrr_at_100
value: 39.656936166081785
- type: mrr_at_1000
value: 39.724602893117414
- type: mrr_at_20
value: 39.21272461558451
- type: mrr_at_3
value: 36.30907726931729
- type: mrr_at_5
value: 37.59814953738436
- type: nauc_map_at_1000_diff1
value: 44.5755334437146
- type: nauc_map_at_1000_max
value: 40.726916781400746
- type: nauc_map_at_1000_std
value: -19.591835061497367
- type: nauc_map_at_100_diff1
value: 44.54542899921038
- type: nauc_map_at_100_max
value: 40.68305902532837
- type: nauc_map_at_100_std
value: -19.658902089283487
- type: nauc_map_at_10_diff1
value: 44.56110529630953
- type: nauc_map_at_10_max
value: 39.89826167846008
- type: nauc_map_at_10_std
value: -20.62910633667902
- type: nauc_map_at_1_diff1
value: 50.82120107004449
- type: nauc_map_at_1_max
value: 33.208851367861584
- type: nauc_map_at_1_std
value: -20.29409730258174
- type: nauc_map_at_20_diff1
value: 44.51171242433788
- type: nauc_map_at_20_max
value: 40.30431132782945
- type: nauc_map_at_20_std
value: -20.290524142792417
- type: nauc_map_at_3_diff1
value: 45.80394138665133
- type: nauc_map_at_3_max
value: 37.766191281426956
- type: nauc_map_at_3_std
value: -21.223601997333876
- type: nauc_map_at_5_diff1
value: 45.00457218474283
- type: nauc_map_at_5_max
value: 38.901044576388365
- type: nauc_map_at_5_std
value: -20.893069613941634
- type: nauc_mrr_at_1000_diff1
value: 50.09855359231429
- type: nauc_mrr_at_1000_max
value: 46.481000170008826
- type: nauc_mrr_at_1000_std
value: -16.053461377096102
- type: nauc_mrr_at_100_diff1
value: 50.08205026347746
- type: nauc_mrr_at_100_max
value: 46.47262126963331
- type: nauc_mrr_at_100_std
value: -16.049112778748693
- type: nauc_mrr_at_10_diff1
value: 50.02363239081706
- type: nauc_mrr_at_10_max
value: 46.39287859062042
- type: nauc_mrr_at_10_std
value: -16.280866744769657
- type: nauc_mrr_at_1_diff1
value: 55.692503735317445
- type: nauc_mrr_at_1_max
value: 47.334834529801014
- type: nauc_mrr_at_1_std
value: -16.985483585693512
- type: nauc_mrr_at_20_diff1
value: 50.07725225722074
- type: nauc_mrr_at_20_max
value: 46.47279295070193
- type: nauc_mrr_at_20_std
value: -16.15168364678318
- type: nauc_mrr_at_3_diff1
value: 51.18685337274134
- type: nauc_mrr_at_3_max
value: 46.7286365021621
- type: nauc_mrr_at_3_std
value: -16.708451287313718
- type: nauc_mrr_at_5_diff1
value: 50.46777237893576
- type: nauc_mrr_at_5_max
value: 46.5352076502249
- type: nauc_mrr_at_5_std
value: -16.557413659905034
- type: nauc_ndcg_at_1000_diff1
value: 43.974299434438066
- type: nauc_ndcg_at_1000_max
value: 43.44628675071857
- type: nauc_ndcg_at_1000_std
value: -15.3495102005021
- type: nauc_ndcg_at_100_diff1
value: 43.336365081508504
- type: nauc_ndcg_at_100_max
value: 43.11345604460776
- type: nauc_ndcg_at_100_std
value: -15.571128070860615
- type: nauc_ndcg_at_10_diff1
value: 43.41266214720136
- type: nauc_ndcg_at_10_max
value: 41.519676787851914
- type: nauc_ndcg_at_10_std
value: -19.217175017223568
- type: nauc_ndcg_at_1_diff1
value: 55.692503735317445
- type: nauc_ndcg_at_1_max
value: 47.334834529801014
- type: nauc_ndcg_at_1_std
value: -16.985483585693512
- type: nauc_ndcg_at_20_diff1
value: 43.351653862834496
- type: nauc_ndcg_at_20_max
value: 42.11608469750499
- type: nauc_ndcg_at_20_std
value: -18.485363540641664
- type: nauc_ndcg_at_3_diff1
value: 45.64193888236677
- type: nauc_ndcg_at_3_max
value: 42.497135099009995
- type: nauc_ndcg_at_3_std
value: -18.764012041130094
- type: nauc_ndcg_at_5_diff1
value: 44.523392133895186
- type: nauc_ndcg_at_5_max
value: 41.564242030096345
- type: nauc_ndcg_at_5_std
value: -19.31080790984941
- type: nauc_precision_at_1000_diff1
value: 6.383464615714393
- type: nauc_precision_at_1000_max
value: 27.439930931284657
- type: nauc_precision_at_1000_std
value: 19.070716188143034
- type: nauc_precision_at_100_diff1
value: 12.599136754501284
- type: nauc_precision_at_100_max
value: 35.886310962337795
- type: nauc_precision_at_100_std
value: 14.06587592659196
- type: nauc_precision_at_10_diff1
value: 25.388891173150206
- type: nauc_precision_at_10_max
value: 46.10269270777384
- type: nauc_precision_at_10_std
value: -5.993803607158499
- type: nauc_precision_at_1_diff1
value: 55.692503735317445
- type: nauc_precision_at_1_max
value: 47.334834529801014
- type: nauc_precision_at_1_std
value: -16.985483585693512
- type: nauc_precision_at_20_diff1
value: 20.984013463099707
- type: nauc_precision_at_20_max
value: 42.9471854616888
- type: nauc_precision_at_20_std
value: -0.8045549929346024
- type: nauc_precision_at_3_diff1
value: 36.191850547148356
- type: nauc_precision_at_3_max
value: 48.09923832376049
- type: nauc_precision_at_3_std
value: -13.159407051271321
- type: nauc_precision_at_5_diff1
value: 31.04967966700407
- type: nauc_precision_at_5_max
value: 47.62867673349624
- type: nauc_precision_at_5_std
value: -10.345790325137353
- type: nauc_recall_at_1000_diff1
value: 11.03436839065707
- type: nauc_recall_at_1000_max
value: 42.32265076651575
- type: nauc_recall_at_1000_std
value: 30.478521053399206
- type: nauc_recall_at_100_diff1
value: 24.788349084510806
- type: nauc_recall_at_100_max
value: 36.72097184821956
- type: nauc_recall_at_100_std
value: -0.2241144179522076
- type: nauc_recall_at_10_diff1
value: 31.613053567704885
- type: nauc_recall_at_10_max
value: 34.4597322828833
- type: nauc_recall_at_10_std
value: -18.00022912690819
- type: nauc_recall_at_1_diff1
value: 50.82120107004449
- type: nauc_recall_at_1_max
value: 33.208851367861584
- type: nauc_recall_at_1_std
value: -20.29409730258174
- type: nauc_recall_at_20_diff1
value: 30.277002670708384
- type: nauc_recall_at_20_max
value: 35.212475675060375
- type: nauc_recall_at_20_std
value: -15.822788854733687
- type: nauc_recall_at_3_diff1
value: 38.87844958322257
- type: nauc_recall_at_3_max
value: 34.66914910044104
- type: nauc_recall_at_3_std
value: -20.234707300209127
- type: nauc_recall_at_5_diff1
value: 35.551139991687776
- type: nauc_recall_at_5_max
value: 34.61009958820695
- type: nauc_recall_at_5_std
value: -19.519180149293444
- type: ndcg_at_1
value: 31.233
- type: ndcg_at_10
value: 35.927
- type: ndcg_at_100
value: 43.037
- type: ndcg_at_1000
value: 45.900999999999996
- type: ndcg_at_20
value: 38.39
- type: ndcg_at_3
value: 31.366
- type: ndcg_at_5
value: 33.108
- type: precision_at_1
value: 31.233
- type: precision_at_10
value: 8.15
- type: precision_at_100
value: 1.402
- type: precision_at_1000
value: 0.17700000000000002
- type: precision_at_20
value: 4.91
- type: precision_at_3
value: 17.871000000000002
- type: precision_at_5
value: 12.948
- type: recall_at_1
value: 20.144000000000002
- type: recall_at_10
value: 44.985
- type: recall_at_100
value: 74.866
- type: recall_at_1000
value: 94.477
- type: recall_at_20
value: 53.37
- type: recall_at_3
value: 31.141000000000002
- type: recall_at_5
value: 36.721
- task:
type: PairClassification
dataset:
name: MTEB Cmnli (default)
type: C-MTEB/CMNLI
config: default
split: validation
revision: None
metrics:
- type: cos_sim_accuracy
value: 71.25676488274203
- type: cos_sim_accuracy_threshold
value: 78.11152935028076
- type: cos_sim_ap
value: 79.10444825556077
- type: cos_sim_f1
value: 74.10750923266312
- type: cos_sim_f1_threshold
value: 75.2312421798706
- type: cos_sim_precision
value: 66.02083714129044
- type: cos_sim_recall
value: 84.45171849427169
- type: dot_accuracy
value: 68.11785929043896
- type: dot_accuracy_threshold
value: 34783.23974609375
- type: dot_ap
value: 75.80201827987712
- type: dot_f1
value: 72.31670990679349
- type: dot_f1_threshold
value: 31978.036499023438
- type: dot_precision
value: 61.386623164763456
- type: dot_recall
value: 87.98223053542202
- type: euclidean_accuracy
value: 71.41310883944678
- type: euclidean_accuracy_threshold
value: 1374.9353408813477
- type: euclidean_ap
value: 79.23359768836457
- type: euclidean_f1
value: 74.38512297540491
- type: euclidean_f1_threshold
value: 1512.6035690307617
- type: euclidean_precision
value: 64.97816593886463
- type: euclidean_recall
value: 86.97685293429974
- type: manhattan_accuracy
value: 71.32892363199038
- type: manhattan_accuracy_threshold
value: 33340.49072265625
- type: manhattan_ap
value: 79.11973684118587
- type: manhattan_f1
value: 74.29401993355481
- type: manhattan_f1_threshold
value: 36012.52746582031
- type: manhattan_precision
value: 66.81605975723622
- type: manhattan_recall
value: 83.65676876315175
- type: max_accuracy
value: 71.41310883944678
- type: max_ap
value: 79.23359768836457
- type: max_f1
value: 74.38512297540491
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval (default)
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: main_score
value: 78.917
- type: map_at_1
value: 67.281
- type: map_at_10
value: 75.262
- type: map_at_100
value: 75.60900000000001
- type: map_at_1000
value: 75.618
- type: map_at_20
value: 75.50200000000001
- type: map_at_3
value: 73.455
- type: map_at_5
value: 74.657
- type: mrr_at_1
value: 67.43940990516333
- type: mrr_at_10
value: 75.27367989696756
- type: mrr_at_100
value: 75.62029353306437
- type: mrr_at_1000
value: 75.62934741874726
- type: mrr_at_20
value: 75.51356607409173
- type: mrr_at_3
value: 73.5159817351598
- type: mrr_at_5
value: 74.73832103969093
- type: nauc_map_at_1000_diff1
value: 77.26666391867634
- type: nauc_map_at_1000_max
value: 49.928541012203496
- type: nauc_map_at_1000_std
value: -40.494469470474456
- type: nauc_map_at_100_diff1
value: 77.26087423162396
- type: nauc_map_at_100_max
value: 49.944275615664424
- type: nauc_map_at_100_std
value: -40.48299992715398
- type: nauc_map_at_10_diff1
value: 76.97400113500906
- type: nauc_map_at_10_max
value: 49.84177029115674
- type: nauc_map_at_10_std
value: -40.829250876511445
- type: nauc_map_at_1_diff1
value: 81.44050620630395
- type: nauc_map_at_1_max
value: 48.97711944070578
- type: nauc_map_at_1_std
value: -38.963689457570254
- type: nauc_map_at_20_diff1
value: 77.21791353089375
- type: nauc_map_at_20_max
value: 49.958206759079424
- type: nauc_map_at_20_std
value: -40.53067571658996
- type: nauc_map_at_3_diff1
value: 77.3555925208868
- type: nauc_map_at_3_max
value: 49.32158146451256
- type: nauc_map_at_3_std
value: -41.93552426981978
- type: nauc_map_at_5_diff1
value: 77.07099950431504
- type: nauc_map_at_5_max
value: 49.54190504495002
- type: nauc_map_at_5_std
value: -41.814968130918096
- type: nauc_mrr_at_1000_diff1
value: 77.31388774540477
- type: nauc_mrr_at_1000_max
value: 49.96779699175759
- type: nauc_mrr_at_1000_std
value: -40.43739645160277
- type: nauc_mrr_at_100_diff1
value: 77.30817786449413
- type: nauc_mrr_at_100_max
value: 49.982514428937655
- type: nauc_mrr_at_100_std
value: -40.42876582797744
- type: nauc_mrr_at_10_diff1
value: 77.02048060465756
- type: nauc_mrr_at_10_max
value: 49.87937207270602
- type: nauc_mrr_at_10_std
value: -40.77596560333177
- type: nauc_mrr_at_1_diff1
value: 81.27219599516599
- type: nauc_mrr_at_1_max
value: 49.3083394026327
- type: nauc_mrr_at_1_std
value: -38.31023037552026
- type: nauc_mrr_at_20_diff1
value: 77.26497089316055
- type: nauc_mrr_at_20_max
value: 49.996257597621415
- type: nauc_mrr_at_20_std
value: -40.476723608868014
- type: nauc_mrr_at_3_diff1
value: 77.38971294099257
- type: nauc_mrr_at_3_max
value: 49.38110328987404
- type: nauc_mrr_at_3_std
value: -41.7118646715979
- type: nauc_mrr_at_5_diff1
value: 77.08286142519952
- type: nauc_mrr_at_5_max
value: 49.655249374588685
- type: nauc_mrr_at_5_std
value: -41.48173039989406
- type: nauc_ndcg_at_1000_diff1
value: 76.47399204021758
- type: nauc_ndcg_at_1000_max
value: 50.55770139961048
- type: nauc_ndcg_at_1000_std
value: -39.55650430279072
- type: nauc_ndcg_at_100_diff1
value: 76.29355616618253
- type: nauc_ndcg_at_100_max
value: 51.003608112592936
- type: nauc_ndcg_at_100_std
value: -39.24769744605206
- type: nauc_ndcg_at_10_diff1
value: 74.88697528447634
- type: nauc_ndcg_at_10_max
value: 50.398416372815234
- type: nauc_ndcg_at_10_std
value: -40.76526585772833
- type: nauc_ndcg_at_1_diff1
value: 81.27219599516599
- type: nauc_ndcg_at_1_max
value: 49.3083394026327
- type: nauc_ndcg_at_1_std
value: -38.31023037552026
- type: nauc_ndcg_at_20_diff1
value: 75.85463512091866
- type: nauc_ndcg_at_20_max
value: 50.97338683654334
- type: nauc_ndcg_at_20_std
value: -39.353128774903404
- type: nauc_ndcg_at_3_diff1
value: 75.94015726123543
- type: nauc_ndcg_at_3_max
value: 49.22194251063148
- type: nauc_ndcg_at_3_std
value: -43.040457030630435
- type: nauc_ndcg_at_5_diff1
value: 75.19166189770303
- type: nauc_ndcg_at_5_max
value: 49.65696229797189
- type: nauc_ndcg_at_5_std
value: -42.81534909184424
- type: nauc_precision_at_1000_diff1
value: -14.830901395815788
- type: nauc_precision_at_1000_max
value: 19.686297136854623
- type: nauc_precision_at_1000_std
value: 61.19310360166978
- type: nauc_precision_at_100_diff1
value: 20.55469986751769
- type: nauc_precision_at_100_max
value: 50.78431835075583
- type: nauc_precision_at_100_std
value: 31.54986568374813
- type: nauc_precision_at_10_diff1
value: 45.991938532558656
- type: nauc_precision_at_10_max
value: 46.386318595630385
- type: nauc_precision_at_10_std
value: -23.463011435224608
- type: nauc_precision_at_1_diff1
value: 81.27219599516599
- type: nauc_precision_at_1_max
value: 49.3083394026327
- type: nauc_precision_at_1_std
value: -38.31023037552026
- type: nauc_precision_at_20_diff1
value: 41.53180472410822
- type: nauc_precision_at_20_max
value: 49.89800247204318
- type: nauc_precision_at_20_std
value: -2.4192847331537095
- type: nauc_precision_at_3_diff1
value: 67.37504651209993
- type: nauc_precision_at_3_max
value: 47.893537208629496
- type: nauc_precision_at_3_std
value: -43.2362212382819
- type: nauc_precision_at_5_diff1
value: 60.03438883791718
- type: nauc_precision_at_5_max
value: 48.29770502354206
- type: nauc_precision_at_5_std
value: -40.39588448271546
- type: nauc_recall_at_1000_diff1
value: 71.04741174480844
- type: nauc_recall_at_1000_max
value: 93.19056506596002
- type: nauc_recall_at_1000_std
value: 62.96994797650912
- type: nauc_recall_at_100_diff1
value: 65.00418176852641
- type: nauc_recall_at_100_max
value: 85.27352708427193
- type: nauc_recall_at_100_std
value: 2.8812005546518886
- type: nauc_recall_at_10_diff1
value: 61.263254794998865
- type: nauc_recall_at_10_max
value: 54.17618329507141
- type: nauc_recall_at_10_std
value: -39.80603966142593
- type: nauc_recall_at_1_diff1
value: 81.44050620630395
- type: nauc_recall_at_1_max
value: 48.97711944070578
- type: nauc_recall_at_1_std
value: -38.963689457570254
- type: nauc_recall_at_20_diff1
value: 64.42106091745396
- type: nauc_recall_at_20_max
value: 63.10796640821887
- type: nauc_recall_at_20_std
value: -22.60117424572222
- type: nauc_recall_at_3_diff1
value: 70.66311436592945
- type: nauc_recall_at_3_max
value: 48.69498944323469
- type: nauc_recall_at_3_std
value: -47.37847524874532
- type: nauc_recall_at_5_diff1
value: 66.12701111728848
- type: nauc_recall_at_5_max
value: 49.91763957934711
- type: nauc_recall_at_5_std
value: -48.173252920584126
- type: ndcg_at_1
value: 67.43900000000001
- type: ndcg_at_10
value: 78.917
- type: ndcg_at_100
value: 80.53399999999999
- type: ndcg_at_1000
value: 80.768
- type: ndcg_at_20
value: 79.813
- type: ndcg_at_3
value: 75.37
- type: ndcg_at_5
value: 77.551
- type: precision_at_1
value: 67.43900000000001
- type: precision_at_10
value: 9.115
- type: precision_at_100
value: 0.985
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.737
- type: precision_at_3
value: 27.081
- type: precision_at_5
value: 17.345
- type: recall_at_1
value: 67.281
- type: recall_at_10
value: 90.2
- type: recall_at_100
value: 97.576
- type: recall_at_1000
value: 99.368
- type: recall_at_20
value: 93.783
- type: recall_at_3
value: 80.822
- type: recall_at_5
value: 86.091
- task:
type: Retrieval
dataset:
name: MTEB DBPedia (default)
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.041
- type: map_at_10
value: 18.662
- type: map_at_100
value: 26.054
- type: map_at_1000
value: 27.769
- type: map_at_20
value: 21.499
- type: map_at_3
value: 13.628000000000002
- type: map_at_5
value: 15.617
- type: mrr_at_1
value: 67.25
- type: mrr_at_10
value: 74.673
- type: mrr_at_100
value: 75.022
- type: mrr_at_1000
value: 75.031
- type: mrr_at_20
value: 74.895
- type: mrr_at_3
value: 73.042
- type: mrr_at_5
value: 74.179
- type: ndcg_at_1
value: 55.75
- type: ndcg_at_10
value: 41.004000000000005
- type: ndcg_at_100
value: 44.912
- type: ndcg_at_1000
value: 51.946000000000005
- type: ndcg_at_20
value: 40.195
- type: ndcg_at_3
value: 45.803
- type: ndcg_at_5
value: 42.976
- type: precision_at_1
value: 67.25
- type: precision_at_10
value: 31.874999999999996
- type: precision_at_100
value: 10.37
- type: precision_at_1000
value: 2.1430000000000002
- type: precision_at_20
value: 24.275
- type: precision_at_3
value: 48.417
- type: precision_at_5
value: 40.2
- type: recall_at_1
value: 9.041
- type: recall_at_10
value: 23.592
- type: recall_at_100
value: 49.476
- type: recall_at_1000
value: 71.677
- type: recall_at_20
value: 30.153000000000002
- type: recall_at_3
value: 14.777000000000001
- type: recall_at_5
value: 17.829
- type: main_score
value: 41.004000000000005
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval (default)
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: main_score
value: 83.134
- type: map_at_1
value: 23.907999999999998
- type: map_at_10
value: 74.566
- type: map_at_100
value: 77.706
- type: map_at_1000
value: 77.762
- type: map_at_20
value: 76.943
- type: map_at_3
value: 50.971999999999994
- type: map_at_5
value: 64.429
- type: mrr_at_1
value: 84.8
- type: mrr_at_10
value: 89.73218253968246
- type: mrr_at_100
value: 89.82853630655774
- type: mrr_at_1000
value: 89.83170411703153
- type: mrr_at_20
value: 89.79582030091501
- type: mrr_at_3
value: 89.32499999999992
- type: mrr_at_5
value: 89.58749999999992
- type: nauc_map_at_1000_diff1
value: -2.2736020650163717
- type: nauc_map_at_1000_max
value: 45.3937519555142
- type: nauc_map_at_1000_std
value: 10.824778228268581
- type: nauc_map_at_100_diff1
value: -2.2662939752750066
- type: nauc_map_at_100_max
value: 45.423960626031366
- type: nauc_map_at_100_std
value: 10.804239351738717
- type: nauc_map_at_10_diff1
value: 0.9395752585654343
- type: nauc_map_at_10_max
value: 42.53814836940551
- type: nauc_map_at_10_std
value: 0.7199313235265218
- type: nauc_map_at_1_diff1
value: 45.19415865267676
- type: nauc_map_at_1_max
value: -1.7261947382471912
- type: nauc_map_at_1_std
value: -32.16144291613605
- type: nauc_map_at_20_diff1
value: -1.884514152147472
- type: nauc_map_at_20_max
value: 44.830401115927174
- type: nauc_map_at_20_std
value: 8.118530414377219
- type: nauc_map_at_3_diff1
value: 25.678881127059967
- type: nauc_map_at_3_max
value: 12.191400431839758
- type: nauc_map_at_3_std
value: -27.201740587642327
- type: nauc_map_at_5_diff1
value: 13.227128780829572
- type: nauc_map_at_5_max
value: 26.978282739708977
- type: nauc_map_at_5_std
value: -17.555610348070584
- type: nauc_mrr_at_1000_diff1
value: 21.073512437502178
- type: nauc_mrr_at_1000_max
value: 64.9680257861005
- type: nauc_mrr_at_1000_std
value: 19.626288754404293
- type: nauc_mrr_at_100_diff1
value: 21.074637426957732
- type: nauc_mrr_at_100_max
value: 64.97612675661915
- type: nauc_mrr_at_100_std
value: 19.649504127800878
- type: nauc_mrr_at_10_diff1
value: 21.12003267626651
- type: nauc_mrr_at_10_max
value: 65.24362289059766
- type: nauc_mrr_at_10_std
value: 19.92351276180984
- type: nauc_mrr_at_1_diff1
value: 22.711430629147635
- type: nauc_mrr_at_1_max
value: 58.4059429497403
- type: nauc_mrr_at_1_std
value: 11.967886722567973
- type: nauc_mrr_at_20_diff1
value: 20.98220830510272
- type: nauc_mrr_at_20_max
value: 65.05737535197835
- type: nauc_mrr_at_20_std
value: 19.66672900782771
- type: nauc_mrr_at_3_diff1
value: 20.924796220048528
- type: nauc_mrr_at_3_max
value: 65.71388669932584
- type: nauc_mrr_at_3_std
value: 20.05912197134477
- type: nauc_mrr_at_5_diff1
value: 20.61978649468208
- type: nauc_mrr_at_5_max
value: 65.50709154526211
- type: nauc_mrr_at_5_std
value: 20.241434276181838
- type: nauc_ndcg_at_1000_diff1
value: 0.25363171946133656
- type: nauc_ndcg_at_1000_max
value: 54.12840465309885
- type: nauc_ndcg_at_1000_std
value: 20.749184325412546
- type: nauc_ndcg_at_100_diff1
value: 0.15649430250272792
- type: nauc_ndcg_at_100_max
value: 54.47995322413234
- type: nauc_ndcg_at_100_std
value: 21.266786634233267
- type: nauc_ndcg_at_10_diff1
value: 0.14579250840386346
- type: nauc_ndcg_at_10_max
value: 49.8643037948353
- type: nauc_ndcg_at_10_std
value: 12.960701643914216
- type: nauc_ndcg_at_1_diff1
value: 22.711430629147635
- type: nauc_ndcg_at_1_max
value: 58.4059429497403
- type: nauc_ndcg_at_1_std
value: 11.967886722567973
- type: nauc_ndcg_at_20_diff1
value: -0.6701559981776763
- type: nauc_ndcg_at_20_max
value: 52.95443437012488
- type: nauc_ndcg_at_20_std
value: 16.708883972005758
- type: nauc_ndcg_at_3_diff1
value: -0.19084922341962388
- type: nauc_ndcg_at_3_max
value: 46.2110230886874
- type: nauc_ndcg_at_3_std
value: 13.363250229683038
- type: nauc_ndcg_at_5_diff1
value: 0.9840019268192548
- type: nauc_ndcg_at_5_max
value: 43.56594891798146
- type: nauc_ndcg_at_5_std
value: 8.577017104088146
- type: nauc_precision_at_1000_diff1
value: -30.779179091501145
- type: nauc_precision_at_1000_max
value: 16.056094258615673
- type: nauc_precision_at_1000_std
value: 49.96303902363283
- type: nauc_precision_at_100_diff1
value: -31.583236638899585
- type: nauc_precision_at_100_max
value: 19.16571713603373
- type: nauc_precision_at_100_std
value: 51.870647903980036
- type: nauc_precision_at_10_diff1
value: -35.62134572732597
- type: nauc_precision_at_10_max
value: 31.6935186494612
- type: nauc_precision_at_10_std
value: 46.68659723766723
- type: nauc_precision_at_1_diff1
value: 22.711430629147635
- type: nauc_precision_at_1_max
value: 58.4059429497403
- type: nauc_precision_at_1_std
value: 11.967886722567973
- type: nauc_precision_at_20_diff1
value: -33.875460046920495
- type: nauc_precision_at_20_max
value: 24.188420133566442
- type: nauc_precision_at_20_std
value: 50.02387762958483
- type: nauc_precision_at_3_diff1
value: -28.875998450906827
- type: nauc_precision_at_3_max
value: 44.77058831167941
- type: nauc_precision_at_3_std
value: 31.77993710437207
- type: nauc_precision_at_5_diff1
value: -34.92525440306491
- type: nauc_precision_at_5_max
value: 39.855219917077086
- type: nauc_precision_at_5_std
value: 37.95432046169299
- type: nauc_recall_at_1000_diff1
value: -14.293309371874733
- type: nauc_recall_at_1000_max
value: 59.06948692482579
- type: nauc_recall_at_1000_std
value: 62.586254868312686
- type: nauc_recall_at_100_diff1
value: -4.344100947212704
- type: nauc_recall_at_100_max
value: 58.42120421043602
- type: nauc_recall_at_100_std
value: 46.48562009316997
- type: nauc_recall_at_10_diff1
value: 0.04948662912161709
- type: nauc_recall_at_10_max
value: 42.42809687119093
- type: nauc_recall_at_10_std
value: 0.6892504250411409
- type: nauc_recall_at_1_diff1
value: 45.19415865267676
- type: nauc_recall_at_1_max
value: -1.7261947382471912
- type: nauc_recall_at_1_std
value: -32.16144291613605
- type: nauc_recall_at_20_diff1
value: -7.634587864605111
- type: nauc_recall_at_20_max
value: 49.21327187174134
- type: nauc_recall_at_20_std
value: 16.408481068336346
- type: nauc_recall_at_3_diff1
value: 24.72546591038644
- type: nauc_recall_at_3_max
value: 6.620763400972902
- type: nauc_recall_at_3_std
value: -29.994703323331684
- type: nauc_recall_at_5_diff1
value: 12.65527364845842
- type: nauc_recall_at_5_max
value: 20.400121385794694
- type: nauc_recall_at_5_std
value: -22.34284568447213
- type: ndcg_at_1
value: 84.8
- type: ndcg_at_10
value: 83.134
- type: ndcg_at_100
value: 86.628
- type: ndcg_at_1000
value: 87.151
- type: ndcg_at_20
value: 85.092
- type: ndcg_at_3
value: 81.228
- type: ndcg_at_5
value: 80.2
- type: precision_at_1
value: 84.8
- type: precision_at_10
value: 40.394999999999996
- type: precision_at_100
value: 4.745
- type: precision_at_1000
value: 0.488
- type: precision_at_20
value: 22.245
- type: precision_at_3
value: 73.25
- type: precision_at_5
value: 61.86000000000001
- type: recall_at_1
value: 23.907999999999998
- type: recall_at_10
value: 85.346
- type: recall_at_100
value: 96.515
- type: recall_at_1000
value: 99.156
- type: recall_at_20
value: 91.377
- type: recall_at_3
value: 54.135
- type: recall_at_5
value: 70.488
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval (default)
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: main_score
value: 60.887
- type: map_at_1
value: 46.6
- type: map_at_10
value: 56.035000000000004
- type: map_at_100
value: 56.741
- type: map_at_1000
value: 56.764
- type: map_at_20
value: 56.513999999999996
- type: map_at_3
value: 53.733
- type: map_at_5
value: 54.913000000000004
- type: mrr_at_1
value: 46.6
- type: mrr_at_10
value: 56.034523809523776
- type: mrr_at_100
value: 56.74056360434383
- type: mrr_at_1000
value: 56.76373487222486
- type: mrr_at_20
value: 56.51374873879128
- type: mrr_at_3
value: 53.73333333333328
- type: mrr_at_5
value: 54.91333333333327
- type: nauc_map_at_1000_diff1
value: 65.13546939953387
- type: nauc_map_at_1000_max
value: 43.358890946774494
- type: nauc_map_at_1000_std
value: -9.973282105235036
- type: nauc_map_at_100_diff1
value: 65.12449309472493
- type: nauc_map_at_100_max
value: 43.377100882923145
- type: nauc_map_at_100_std
value: -9.971781228240555
- type: nauc_map_at_10_diff1
value: 64.83020018537475
- type: nauc_map_at_10_max
value: 43.25969482323034
- type: nauc_map_at_10_std
value: -10.120272176001547
- type: nauc_map_at_1_diff1
value: 69.58727592100516
- type: nauc_map_at_1_max
value: 38.236494689522026
- type: nauc_map_at_1_std
value: -14.833390831689597
- type: nauc_map_at_20_diff1
value: 65.01159809914586
- type: nauc_map_at_20_max
value: 43.33440319829618
- type: nauc_map_at_20_std
value: -10.039958228659726
- type: nauc_map_at_3_diff1
value: 65.2396323885909
- type: nauc_map_at_3_max
value: 42.26904017378952
- type: nauc_map_at_3_std
value: -11.793017036934044
- type: nauc_map_at_5_diff1
value: 64.96397227898036
- type: nauc_map_at_5_max
value: 43.231333789145424
- type: nauc_map_at_5_std
value: -10.349933732151372
- type: nauc_mrr_at_1000_diff1
value: 65.13546939953387
- type: nauc_mrr_at_1000_max
value: 43.358890946774494
- type: nauc_mrr_at_1000_std
value: -9.973282105235036
- type: nauc_mrr_at_100_diff1
value: 65.12449309472493
- type: nauc_mrr_at_100_max
value: 43.377100882923145
- type: nauc_mrr_at_100_std
value: -9.971781228240555
- type: nauc_mrr_at_10_diff1
value: 64.83020018537475
- type: nauc_mrr_at_10_max
value: 43.25969482323034
- type: nauc_mrr_at_10_std
value: -10.120272176001547
- type: nauc_mrr_at_1_diff1
value: 69.58727592100516
- type: nauc_mrr_at_1_max
value: 38.236494689522026
- type: nauc_mrr_at_1_std
value: -14.833390831689597
- type: nauc_mrr_at_20_diff1
value: 65.01159809914586
- type: nauc_mrr_at_20_max
value: 43.33440319829618
- type: nauc_mrr_at_20_std
value: -10.039958228659726
- type: nauc_mrr_at_3_diff1
value: 65.2396323885909
- type: nauc_mrr_at_3_max
value: 42.26904017378952
- type: nauc_mrr_at_3_std
value: -11.793017036934044
- type: nauc_mrr_at_5_diff1
value: 64.96397227898036
- type: nauc_mrr_at_5_max
value: 43.231333789145424
- type: nauc_mrr_at_5_std
value: -10.349933732151372
- type: nauc_ndcg_at_1000_diff1
value: 64.26802655199876
- type: nauc_ndcg_at_1000_max
value: 45.854310744745185
- type: nauc_ndcg_at_1000_std
value: -6.184417305204082
- type: nauc_ndcg_at_100_diff1
value: 63.99268329609827
- type: nauc_ndcg_at_100_max
value: 46.31270128748375
- type: nauc_ndcg_at_100_std
value: -6.1393433180558965
- type: nauc_ndcg_at_10_diff1
value: 62.6735104141137
- type: nauc_ndcg_at_10_max
value: 45.54954799462398
- type: nauc_ndcg_at_10_std
value: -7.348851199024871
- type: nauc_ndcg_at_1_diff1
value: 69.58727592100516
- type: nauc_ndcg_at_1_max
value: 38.236494689522026
- type: nauc_ndcg_at_1_std
value: -14.833390831689597
- type: nauc_ndcg_at_20_diff1
value: 63.25899651677274
- type: nauc_ndcg_at_20_max
value: 45.952196968886014
- type: nauc_ndcg_at_20_std
value: -6.807607465125713
- type: nauc_ndcg_at_3_diff1
value: 63.65618337476822
- type: nauc_ndcg_at_3_max
value: 43.507890965228945
- type: nauc_ndcg_at_3_std
value: -10.73845622217601
- type: nauc_ndcg_at_5_diff1
value: 63.079162432921855
- type: nauc_ndcg_at_5_max
value: 45.38303443868148
- type: nauc_ndcg_at_5_std
value: -8.063657824835534
- type: nauc_precision_at_1000_diff1
value: 63.01459977930557
- type: nauc_precision_at_1000_max
value: 92.4253034547151
- type: nauc_precision_at_1000_std
value: 84.4845513963158
- type: nauc_precision_at_100_diff1
value: 57.17217119405878
- type: nauc_precision_at_100_max
value: 80.70049725316484
- type: nauc_precision_at_100_std
value: 41.78392287147403
- type: nauc_precision_at_10_diff1
value: 53.115665404390725
- type: nauc_precision_at_10_max
value: 55.73825657341263
- type: nauc_precision_at_10_std
value: 5.406226305013257
- type: nauc_precision_at_1_diff1
value: 69.58727592100516
- type: nauc_precision_at_1_max
value: 38.236494689522026
- type: nauc_precision_at_1_std
value: -14.833390831689597
- type: nauc_precision_at_20_diff1
value: 53.77730697622828
- type: nauc_precision_at_20_max
value: 61.88170819253054
- type: nauc_precision_at_20_std
value: 13.678730470003856
- type: nauc_precision_at_3_diff1
value: 58.580196992291455
- type: nauc_precision_at_3_max
value: 47.404834585376626
- type: nauc_precision_at_3_std
value: -7.374978769024051
- type: nauc_precision_at_5_diff1
value: 56.44564652606437
- type: nauc_precision_at_5_max
value: 53.08973975162324
- type: nauc_precision_at_5_std
value: 0.22762700141423803
- type: nauc_recall_at_1000_diff1
value: 63.01459977930565
- type: nauc_recall_at_1000_max
value: 92.42530345471532
- type: nauc_recall_at_1000_std
value: 84.48455139631602
- type: nauc_recall_at_100_diff1
value: 57.17217119405904
- type: nauc_recall_at_100_max
value: 80.70049725316468
- type: nauc_recall_at_100_std
value: 41.783922871474275
- type: nauc_recall_at_10_diff1
value: 53.11566540439087
- type: nauc_recall_at_10_max
value: 55.738256573412656
- type: nauc_recall_at_10_std
value: 5.406226305013377
- type: nauc_recall_at_1_diff1
value: 69.58727592100516
- type: nauc_recall_at_1_max
value: 38.236494689522026
- type: nauc_recall_at_1_std
value: -14.833390831689597
- type: nauc_recall_at_20_diff1
value: 53.77730697622846
- type: nauc_recall_at_20_max
value: 61.881708192530525
- type: nauc_recall_at_20_std
value: 13.678730470003947
- type: nauc_recall_at_3_diff1
value: 58.5801969922914
- type: nauc_recall_at_3_max
value: 47.40483458537654
- type: nauc_recall_at_3_std
value: -7.37497876902413
- type: nauc_recall_at_5_diff1
value: 56.445646526064394
- type: nauc_recall_at_5_max
value: 53.08973975162332
- type: nauc_recall_at_5_std
value: 0.22762700141428024
- type: ndcg_at_1
value: 46.6
- type: ndcg_at_10
value: 60.887
- type: ndcg_at_100
value: 64.18199999999999
- type: ndcg_at_1000
value: 64.726
- type: ndcg_at_20
value: 62.614999999999995
- type: ndcg_at_3
value: 56.038
- type: ndcg_at_5
value: 58.150999999999996
- type: precision_at_1
value: 46.6
- type: precision_at_10
value: 7.630000000000001
- type: precision_at_100
value: 0.914
- type: precision_at_1000
value: 0.096
- type: precision_at_20
value: 4.154999999999999
- type: precision_at_3
value: 20.9
- type: precision_at_5
value: 13.56
- type: recall_at_1
value: 46.6
- type: recall_at_10
value: 76.3
- type: recall_at_100
value: 91.4
- type: recall_at_1000
value: 95.6
- type: recall_at_20
value: 83.1
- type: recall_at_3
value: 62.7
- type: recall_at_5
value: 67.80000000000001
- task:
type: Classification
dataset:
name: MTEB EmotionClassification (default)
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 73.29999999999998
- type: f1
value: 67.71473706580302
- type: f1_weighted
value: 74.83537255312045
- type: main_score
value: 73.29999999999998
- task:
type: Retrieval
dataset:
name: MTEB FEVER (default)
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 78.371
- type: map_at_10
value: 85.762
- type: map_at_100
value: 85.954
- type: map_at_1000
value: 85.966
- type: map_at_20
value: 85.887
- type: map_at_3
value: 84.854
- type: map_at_5
value: 85.408
- type: mrr_at_1
value: 84.443
- type: mrr_at_10
value: 90.432
- type: mrr_at_100
value: 90.483
- type: mrr_at_1000
value: 90.484
- type: mrr_at_20
value: 90.473
- type: mrr_at_3
value: 89.89399999999999
- type: mrr_at_5
value: 90.244
- type: ndcg_at_1
value: 84.443
- type: ndcg_at_10
value: 89.05499999999999
- type: ndcg_at_100
value: 89.68
- type: ndcg_at_1000
value: 89.87899999999999
- type: ndcg_at_20
value: 89.381
- type: ndcg_at_3
value: 87.73100000000001
- type: ndcg_at_5
value: 88.425
- type: precision_at_1
value: 84.443
- type: precision_at_10
value: 10.520999999999999
- type: precision_at_100
value: 1.103
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_20
value: 5.362
- type: precision_at_3
value: 33.198
- type: precision_at_5
value: 20.441000000000003
- type: recall_at_1
value: 78.371
- type: recall_at_10
value: 94.594
- type: recall_at_100
value: 96.97099999999999
- type: recall_at_1000
value: 98.18
- type: recall_at_20
value: 95.707
- type: recall_at_3
value: 90.853
- type: recall_at_5
value: 92.74799999999999
- type: main_score
value: 89.05499999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018 (default)
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 23.810000000000002
- type: map_at_10
value: 39.051
- type: map_at_100
value: 41.231
- type: map_at_1000
value: 41.376000000000005
- type: map_at_20
value: 40.227000000000004
- type: map_at_3
value: 33.915
- type: map_at_5
value: 36.459
- type: mrr_at_1
value: 48.148
- type: mrr_at_10
value: 55.765
- type: mrr_at_100
value: 56.495
- type: mrr_at_1000
value: 56.525999999999996
- type: mrr_at_20
value: 56.213
- type: mrr_at_3
value: 53.086
- type: mrr_at_5
value: 54.513999999999996
- type: ndcg_at_1
value: 48.148
- type: ndcg_at_10
value: 47.349999999999994
- type: ndcg_at_100
value: 54.61899999999999
- type: ndcg_at_1000
value: 56.830000000000005
- type: ndcg_at_20
value: 50.143
- type: ndcg_at_3
value: 43.108000000000004
- type: ndcg_at_5
value: 44.023
- type: precision_at_1
value: 48.148
- type: precision_at_10
value: 13.441
- type: precision_at_100
value: 2.085
- type: precision_at_1000
value: 0.248
- type: precision_at_20
value: 7.870000000000001
- type: precision_at_3
value: 28.909000000000002
- type: precision_at_5
value: 20.957
- type: recall_at_1
value: 23.810000000000002
- type: recall_at_10
value: 54.303000000000004
- type: recall_at_100
value: 81.363
- type: recall_at_1000
value: 94.391
- type: recall_at_20
value: 63.056999999999995
- type: recall_at_3
value: 38.098
- type: recall_at_5
value: 44.414
- type: main_score
value: 47.349999999999994
- task:
type: Classification
dataset:
name: MTEB GeoreviewClassification (default)
type: ai-forever/georeview-classification
config: default
split: test
revision: 3765c0d1de6b7d264bc459433c45e5a75513839c
metrics:
- type: accuracy
value: 48.0126953125
- type: f1
value: 47.65764016160488
- type: f1_weighted
value: 47.65701659482088
- type: main_score
value: 48.0126953125
- task:
type: Clustering
dataset:
name: MTEB GeoreviewClusteringP2P (default)
type: ai-forever/georeview-clustering-p2p
config: default
split: test
revision: 97a313c8fc85b47f13f33e7e9a95c1ad888c7fec
metrics:
- type: main_score
value: 73.62357853672266
- type: v_measure
value: 73.62357853672266
- type: v_measure_std
value: 0.5942247545535766
- task:
type: Retrieval
dataset:
name: MTEB GerDaLIR (default)
type: jinaai/ger_da_lir
config: default
split: test
revision: 0bb47f1d73827e96964edb84dfe552f62f4fd5eb
metrics:
- type: main_score
value: 16.227
- type: map_at_1
value: 8.082
- type: map_at_10
value: 12.959999999999999
- type: map_at_100
value: 13.923
- type: map_at_1000
value: 14.030999999999999
- type: map_at_20
value: 13.453000000000001
- type: map_at_3
value: 11.018
- type: map_at_5
value: 12.056000000000001
- type: mrr_at_1
value: 8.993332249146203
- type: mrr_at_10
value: 13.994013092850247
- type: mrr_at_100
value: 14.913737673149308
- type: mrr_at_1000
value: 15.00843809934407
- type: mrr_at_20
value: 14.470268462334007
- type: mrr_at_3
value: 12.000596302921846
- type: mrr_at_5
value: 13.070689000921561
- type: nauc_map_at_1000_diff1
value: 28.559639584013286
- type: nauc_map_at_1000_max
value: 25.533800126086714
- type: nauc_map_at_1000_std
value: 9.826551026628666
- type: nauc_map_at_100_diff1
value: 28.544724499331696
- type: nauc_map_at_100_max
value: 25.46734324526386
- type: nauc_map_at_100_std
value: 9.739314481785591
- type: nauc_map_at_10_diff1
value: 28.77447517718118
- type: nauc_map_at_10_max
value: 24.7431615237795
- type: nauc_map_at_10_std
value: 8.349878188033646
- type: nauc_map_at_1_diff1
value: 37.405452629895514
- type: nauc_map_at_1_max
value: 24.444208978394023
- type: nauc_map_at_1_std
value: 4.043820373810528
- type: nauc_map_at_20_diff1
value: 28.69764217789062
- type: nauc_map_at_20_max
value: 25.111848355996496
- type: nauc_map_at_20_std
value: 9.034829905305918
- type: nauc_map_at_3_diff1
value: 30.89053285076882
- type: nauc_map_at_3_max
value: 24.862886115911152
- type: nauc_map_at_3_std
value: 6.654260832396586
- type: nauc_map_at_5_diff1
value: 29.230629676604263
- type: nauc_map_at_5_max
value: 24.374302288018583
- type: nauc_map_at_5_std
value: 7.341846952319046
- type: nauc_mrr_at_1000_diff1
value: 28.086147932781426
- type: nauc_mrr_at_1000_max
value: 25.98698528264653
- type: nauc_mrr_at_1000_std
value: 9.917554348624545
- type: nauc_mrr_at_100_diff1
value: 28.069163279791336
- type: nauc_mrr_at_100_max
value: 25.949440010886804
- type: nauc_mrr_at_100_std
value: 9.874340979732578
- type: nauc_mrr_at_10_diff1
value: 28.239920869530046
- type: nauc_mrr_at_10_max
value: 25.351271409498576
- type: nauc_mrr_at_10_std
value: 8.669862759875162
- type: nauc_mrr_at_1_diff1
value: 35.96543040207856
- type: nauc_mrr_at_1_max
value: 25.488936487231967
- type: nauc_mrr_at_1_std
value: 4.76439131038345
- type: nauc_mrr_at_20_diff1
value: 28.18865871284607
- type: nauc_mrr_at_20_max
value: 25.67121763344746
- type: nauc_mrr_at_20_std
value: 9.297910707519472
- type: nauc_mrr_at_3_diff1
value: 30.166714199740717
- type: nauc_mrr_at_3_max
value: 25.541792491964877
- type: nauc_mrr_at_3_std
value: 7.083090296398472
- type: nauc_mrr_at_5_diff1
value: 28.68475284656478
- type: nauc_mrr_at_5_max
value: 24.994071363482835
- type: nauc_mrr_at_5_std
value: 7.687507254902365
- type: nauc_ndcg_at_1000_diff1
value: 25.292792613586467
- type: nauc_ndcg_at_1000_max
value: 29.211905289377178
- type: nauc_ndcg_at_1000_std
value: 18.088867467320355
- type: nauc_ndcg_at_100_diff1
value: 25.026905011089152
- type: nauc_ndcg_at_100_max
value: 27.98822281254431
- type: nauc_ndcg_at_100_std
value: 16.69456904301902
- type: nauc_ndcg_at_10_diff1
value: 25.972279051109503
- type: nauc_ndcg_at_10_max
value: 24.86486482734957
- type: nauc_ndcg_at_10_std
value: 10.398605822106353
- type: nauc_ndcg_at_1_diff1
value: 36.134710485184826
- type: nauc_ndcg_at_1_max
value: 25.384572790326025
- type: nauc_ndcg_at_1_std
value: 4.591863033771824
- type: nauc_ndcg_at_20_diff1
value: 25.850033660205536
- type: nauc_ndcg_at_20_max
value: 25.944243193140515
- type: nauc_ndcg_at_20_std
value: 12.392409721204892
- type: nauc_ndcg_at_3_diff1
value: 29.1966056380018
- type: nauc_ndcg_at_3_max
value: 24.978843156259913
- type: nauc_ndcg_at_3_std
value: 7.353914459205087
- type: nauc_ndcg_at_5_diff1
value: 26.795315295756282
- type: nauc_ndcg_at_5_max
value: 24.1196789150412
- type: nauc_ndcg_at_5_std
value: 8.311970988265172
- type: nauc_precision_at_1000_diff1
value: 9.128270550217984
- type: nauc_precision_at_1000_max
value: 35.79286915973607
- type: nauc_precision_at_1000_std
value: 39.15669472887154
- type: nauc_precision_at_100_diff1
value: 14.770289799034384
- type: nauc_precision_at_100_max
value: 34.58262232264337
- type: nauc_precision_at_100_std
value: 34.101148102981384
- type: nauc_precision_at_10_diff1
value: 19.899104673118178
- type: nauc_precision_at_10_max
value: 26.636940338985625
- type: nauc_precision_at_10_std
value: 15.73871357255849
- type: nauc_precision_at_1_diff1
value: 36.134710485184826
- type: nauc_precision_at_1_max
value: 25.384572790326025
- type: nauc_precision_at_1_std
value: 4.591863033771824
- type: nauc_precision_at_20_diff1
value: 19.423457975148942
- type: nauc_precision_at_20_max
value: 29.58123490878582
- type: nauc_precision_at_20_std
value: 20.847850110821618
- type: nauc_precision_at_3_diff1
value: 24.986416623492918
- type: nauc_precision_at_3_max
value: 25.973548400472975
- type: nauc_precision_at_3_std
value: 9.486410455972823
- type: nauc_precision_at_5_diff1
value: 21.237741424923332
- type: nauc_precision_at_5_max
value: 24.647141028200164
- type: nauc_precision_at_5_std
value: 11.102785032334147
- type: nauc_recall_at_1000_diff1
value: 15.999714888817829
- type: nauc_recall_at_1000_max
value: 44.34701908906545
- type: nauc_recall_at_1000_std
value: 51.13471291594717
- type: nauc_recall_at_100_diff1
value: 17.401714890483706
- type: nauc_recall_at_100_max
value: 33.39042631654808
- type: nauc_recall_at_100_std
value: 33.944446168451584
- type: nauc_recall_at_10_diff1
value: 20.30036232399894
- type: nauc_recall_at_10_max
value: 24.006718284396786
- type: nauc_recall_at_10_std
value: 14.049375108518669
- type: nauc_recall_at_1_diff1
value: 37.405452629895514
- type: nauc_recall_at_1_max
value: 24.444208978394023
- type: nauc_recall_at_1_std
value: 4.043820373810528
- type: nauc_recall_at_20_diff1
value: 20.23582802609045
- type: nauc_recall_at_20_max
value: 26.408063410785243
- type: nauc_recall_at_20_std
value: 18.617479515468112
- type: nauc_recall_at_3_diff1
value: 25.53221830103098
- type: nauc_recall_at_3_max
value: 24.283712329152678
- type: nauc_recall_at_3_std
value: 8.428947805841867
- type: nauc_recall_at_5_diff1
value: 21.741499601020823
- type: nauc_recall_at_5_max
value: 22.754924586295296
- type: nauc_recall_at_5_std
value: 9.966736688169814
- type: ndcg_at_1
value: 8.977
- type: ndcg_at_10
value: 16.227
- type: ndcg_at_100
value: 21.417
- type: ndcg_at_1000
value: 24.451
- type: ndcg_at_20
value: 17.982
- type: ndcg_at_3
value: 12.206999999999999
- type: ndcg_at_5
value: 14.059
- type: precision_at_1
value: 8.977
- type: precision_at_10
value: 2.933
- type: precision_at_100
value: 0.59
- type: precision_at_1000
value: 0.087
- type: precision_at_20
value: 1.8599999999999999
- type: precision_at_3
value: 5.550999999999999
- type: precision_at_5
value: 4.340999999999999
- type: recall_at_1
value: 8.082
- type: recall_at_10
value: 25.52
- type: recall_at_100
value: 50.32
- type: recall_at_1000
value: 74.021
- type: recall_at_20
value: 32.229
- type: recall_at_3
value: 14.66
- type: recall_at_5
value: 19.062
- task:
type: Retrieval
dataset:
name: MTEB GermanDPR (default)
type: deepset/germandpr
config: default
split: test
revision: 5129d02422a66be600ac89cd3e8531b4f97d347d
metrics:
- type: main_score
value: 82.422
- type: map_at_1
value: 64.39
- type: map_at_10
value: 77.273
- type: map_at_100
value: 77.375
- type: map_at_1000
value: 77.376
- type: map_at_20
value: 77.351
- type: map_at_3
value: 75.46300000000001
- type: map_at_5
value: 76.878
- type: mrr_at_1
value: 64.19512195121952
- type: mrr_at_10
value: 77.15842044134736
- type: mrr_at_100
value: 77.2604854308704
- type: mrr_at_1000
value: 77.26087882190109
- type: mrr_at_20
value: 77.23572154560611
- type: mrr_at_3
value: 75.34959349593504
- type: mrr_at_5
value: 76.76422764227652
- type: nauc_map_at_1000_diff1
value: 49.73135253389972
- type: nauc_map_at_1000_max
value: 8.665570717396145
- type: nauc_map_at_1000_std
value: -25.920927572114522
- type: nauc_map_at_100_diff1
value: 49.729170775336605
- type: nauc_map_at_100_max
value: 8.66717979705074
- type: nauc_map_at_100_std
value: -25.918338868918596
- type: nauc_map_at_10_diff1
value: 49.708681691445925
- type: nauc_map_at_10_max
value: 8.830640635692113
- type: nauc_map_at_10_std
value: -25.843238986304858
- type: nauc_map_at_1_diff1
value: 51.750022350988914
- type: nauc_map_at_1_max
value: 3.599863010364626
- type: nauc_map_at_1_std
value: -27.670122127567314
- type: nauc_map_at_20_diff1
value: 49.72609185887161
- type: nauc_map_at_20_max
value: 8.766556053409218
- type: nauc_map_at_20_std
value: -25.85975887517904
- type: nauc_map_at_3_diff1
value: 49.328512536255595
- type: nauc_map_at_3_max
value: 9.475682028996795
- type: nauc_map_at_3_std
value: -26.277349632171017
- type: nauc_map_at_5_diff1
value: 49.42801822186142
- type: nauc_map_at_5_max
value: 8.788822474357252
- type: nauc_map_at_5_std
value: -25.959260882028573
- type: nauc_mrr_at_1000_diff1
value: 50.13038598302397
- type: nauc_mrr_at_1000_max
value: 8.734338637484832
- type: nauc_mrr_at_1000_std
value: -26.653343549855908
- type: nauc_mrr_at_100_diff1
value: 50.12820392111392
- type: nauc_mrr_at_100_max
value: 8.735940503917966
- type: nauc_mrr_at_100_std
value: -26.65074918231251
- type: nauc_mrr_at_10_diff1
value: 50.10567888458267
- type: nauc_mrr_at_10_max
value: 8.898451291748575
- type: nauc_mrr_at_10_std
value: -26.572046921975655
- type: nauc_mrr_at_1_diff1
value: 52.22769994409465
- type: nauc_mrr_at_1_max
value: 3.6490820146062015
- type: nauc_mrr_at_1_std
value: -28.535100562320498
- type: nauc_mrr_at_20_diff1
value: 50.12462222100699
- type: nauc_mrr_at_20_max
value: 8.83487018268756
- type: nauc_mrr_at_20_std
value: -26.591437036958332
- type: nauc_mrr_at_3_diff1
value: 49.6987353700016
- type: nauc_mrr_at_3_max
value: 9.531003760756258
- type: nauc_mrr_at_3_std
value: -26.949799063124818
- type: nauc_mrr_at_5_diff1
value: 49.823881656376585
- type: nauc_mrr_at_5_max
value: 8.850404667985085
- type: nauc_mrr_at_5_std
value: -26.680008966088582
- type: nauc_ndcg_at_1000_diff1
value: 49.41721203361181
- type: nauc_ndcg_at_1000_max
value: 9.41093067609825
- type: nauc_ndcg_at_1000_std
value: -25.499543637737567
- type: nauc_ndcg_at_100_diff1
value: 49.32810419509252
- type: nauc_ndcg_at_100_max
value: 9.476216458766897
- type: nauc_ndcg_at_100_std
value: -25.393856250990414
- type: nauc_ndcg_at_10_diff1
value: 49.181984436623694
- type: nauc_ndcg_at_10_max
value: 10.65234732763274
- type: nauc_ndcg_at_10_std
value: -24.737669349012297
- type: nauc_ndcg_at_1_diff1
value: 51.750022350988914
- type: nauc_ndcg_at_1_max
value: 3.599863010364626
- type: nauc_ndcg_at_1_std
value: -27.670122127567314
- type: nauc_ndcg_at_20_diff1
value: 49.275394594995056
- type: nauc_ndcg_at_20_max
value: 10.402059796651923
- type: nauc_ndcg_at_20_std
value: -24.82329915806705
- type: nauc_ndcg_at_3_diff1
value: 48.22614352152889
- type: nauc_ndcg_at_3_max
value: 11.67464280791404
- type: nauc_ndcg_at_3_std
value: -25.867824868234095
- type: nauc_ndcg_at_5_diff1
value: 48.35583502987241
- type: nauc_ndcg_at_5_max
value: 10.494278750448451
- type: nauc_ndcg_at_5_std
value: -25.11599634172764
- type: nauc_precision_at_1000_diff1
value: .nan
- type: nauc_precision_at_1000_max
value: .nan
- type: nauc_precision_at_1000_std
value: .nan
- type: nauc_precision_at_100_diff1
value: -56.39478136433852
- type: nauc_precision_at_100_max
value: 86.93518577529493
- type: nauc_precision_at_100_std
value: 100.0
- type: nauc_precision_at_10_diff1
value: 38.662829729133094
- type: nauc_precision_at_10_max
value: 56.38018435740605
- type: nauc_precision_at_10_std
value: 6.288091897081105
- type: nauc_precision_at_1_diff1
value: 51.750022350988914
- type: nauc_precision_at_1_max
value: 3.599863010364626
- type: nauc_precision_at_1_std
value: -27.670122127567314
- type: nauc_precision_at_20_diff1
value: 34.739153182429085
- type: nauc_precision_at_20_max
value: 84.86908403000989
- type: nauc_precision_at_20_std
value: 29.156199421219455
- type: nauc_precision_at_3_diff1
value: 42.09287362529135
- type: nauc_precision_at_3_max
value: 23.629152759287074
- type: nauc_precision_at_3_std
value: -23.721376911302492
- type: nauc_precision_at_5_diff1
value: 36.03866171924644
- type: nauc_precision_at_5_max
value: 29.166173558775327
- type: nauc_precision_at_5_std
value: -15.096374563068448
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: -56.39478136433541
- type: nauc_recall_at_100_max
value: 86.93518577528111
- type: nauc_recall_at_100_std
value: 100.0
- type: nauc_recall_at_10_diff1
value: 38.66282972913384
- type: nauc_recall_at_10_max
value: 56.3801843574071
- type: nauc_recall_at_10_std
value: 6.288091897082639
- type: nauc_recall_at_1_diff1
value: 51.750022350988914
- type: nauc_recall_at_1_max
value: 3.599863010364626
- type: nauc_recall_at_1_std
value: -27.670122127567314
- type: nauc_recall_at_20_diff1
value: 34.7391531824321
- type: nauc_recall_at_20_max
value: 84.86908403001016
- type: nauc_recall_at_20_std
value: 29.156199421220748
- type: nauc_recall_at_3_diff1
value: 42.09287362529107
- type: nauc_recall_at_3_max
value: 23.629152759286946
- type: nauc_recall_at_3_std
value: -23.72137691130291
- type: nauc_recall_at_5_diff1
value: 36.0386617192469
- type: nauc_recall_at_5_max
value: 29.1661735587759
- type: nauc_recall_at_5_std
value: -15.09637456306774
- type: ndcg_at_1
value: 64.39
- type: ndcg_at_10
value: 82.422
- type: ndcg_at_100
value: 82.86099999999999
- type: ndcg_at_1000
value: 82.87299999999999
- type: ndcg_at_20
value: 82.67999999999999
- type: ndcg_at_3
value: 78.967
- type: ndcg_at_5
value: 81.50699999999999
- type: precision_at_1
value: 64.39
- type: precision_at_10
value: 9.795
- type: precision_at_100
value: 0.9990000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.946
- type: precision_at_3
value: 29.691000000000003
- type: precision_at_5
value: 19.044
- type: recall_at_1
value: 64.39
- type: recall_at_10
value: 97.951
- type: recall_at_100
value: 99.902
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 98.92699999999999
- type: recall_at_3
value: 89.07300000000001
- type: recall_at_5
value: 95.22
- task:
type: Retrieval
dataset:
name: MTEB GermanQuAD-Retrieval (default)
type: mteb/germanquad-retrieval
config: default
split: test
revision: f5c87ae5a2e7a5106606314eef45255f03151bb3
metrics:
- type: main_score
value: 94.15532365396247
- type: map_at_1
value: 90.789
- type: map_at_10
value: 94.24
- type: map_at_100
value: 94.283
- type: map_at_1000
value: 94.284
- type: map_at_20
value: 94.272
- type: map_at_3
value: 93.913
- type: map_at_5
value: 94.155
- type: mrr_at_1
value: 90.78947368421053
- type: mrr_at_10
value: 94.23987411056376
- type: mrr_at_100
value: 94.28320936825
- type: mrr_at_1000
value: 94.28350209115848
- type: mrr_at_20
value: 94.271919092559
- type: mrr_at_3
value: 93.91258318209313
- type: mrr_at_5
value: 94.15532365396247
- type: nauc_map_at_1000_diff1
value: 89.29089310650436
- type: nauc_map_at_1000_max
value: 73.83868784032414
- type: nauc_map_at_1000_std
value: -11.635778561889989
- type: nauc_map_at_100_diff1
value: 89.29077225707755
- type: nauc_map_at_100_max
value: 73.84002740580378
- type: nauc_map_at_100_std
value: -11.644096256165092
- type: nauc_map_at_10_diff1
value: 89.29117612292366
- type: nauc_map_at_10_max
value: 73.97487984981221
- type: nauc_map_at_10_std
value: -11.35191794373827
- type: nauc_map_at_1_diff1
value: 89.35436544117584
- type: nauc_map_at_1_max
value: 70.35936815057701
- type: nauc_map_at_1_std
value: -13.598996360976903
- type: nauc_map_at_20_diff1
value: 89.2530394052653
- type: nauc_map_at_20_max
value: 73.83537529419839
- type: nauc_map_at_20_std
value: -11.628272822028478
- type: nauc_map_at_3_diff1
value: 89.375111893546
- type: nauc_map_at_3_max
value: 74.78900366026112
- type: nauc_map_at_3_std
value: -12.720905253503274
- type: nauc_map_at_5_diff1
value: 89.35358300820893
- type: nauc_map_at_5_max
value: 74.31996219723239
- type: nauc_map_at_5_std
value: -10.768642638210867
- type: nauc_mrr_at_1000_diff1
value: 89.29089310650436
- type: nauc_mrr_at_1000_max
value: 73.83868784032414
- type: nauc_mrr_at_1000_std
value: -11.635778561889989
- type: nauc_mrr_at_100_diff1
value: 89.29077225707755
- type: nauc_mrr_at_100_max
value: 73.84002740580378
- type: nauc_mrr_at_100_std
value: -11.644096256165092
- type: nauc_mrr_at_10_diff1
value: 89.29117612292366
- type: nauc_mrr_at_10_max
value: 73.97487984981221
- type: nauc_mrr_at_10_std
value: -11.35191794373827
- type: nauc_mrr_at_1_diff1
value: 89.35436544117584
- type: nauc_mrr_at_1_max
value: 70.35936815057701
- type: nauc_mrr_at_1_std
value: -13.598996360976903
- type: nauc_mrr_at_20_diff1
value: 89.2530394052653
- type: nauc_mrr_at_20_max
value: 73.83537529419839
- type: nauc_mrr_at_20_std
value: -11.628272822028478
- type: nauc_mrr_at_3_diff1
value: 89.375111893546
- type: nauc_mrr_at_3_max
value: 74.78900366026112
- type: nauc_mrr_at_3_std
value: -12.720905253503274
- type: nauc_mrr_at_5_diff1
value: 89.35358300820893
- type: nauc_mrr_at_5_max
value: 74.31996219723239
- type: nauc_mrr_at_5_std
value: -10.768642638210867
- type: nauc_ndcg_at_1000_diff1
value: 89.27620775856863
- type: nauc_ndcg_at_1000_max
value: 74.2985757362615
- type: nauc_ndcg_at_1000_std
value: -11.236142819703023
- type: nauc_ndcg_at_100_diff1
value: 89.27284787540731
- type: nauc_ndcg_at_100_max
value: 74.33539303365968
- type: nauc_ndcg_at_100_std
value: -11.469413615851936
- type: nauc_ndcg_at_10_diff1
value: 89.21496710661724
- type: nauc_ndcg_at_10_max
value: 75.02035398490516
- type: nauc_ndcg_at_10_std
value: -9.903255803665814
- type: nauc_ndcg_at_1_diff1
value: 89.35436544117584
- type: nauc_ndcg_at_1_max
value: 70.35936815057701
- type: nauc_ndcg_at_1_std
value: -13.598996360976903
- type: nauc_ndcg_at_20_diff1
value: 89.03561289544179
- type: nauc_ndcg_at_20_max
value: 74.4006766600049
- type: nauc_ndcg_at_20_std
value: -11.129237862587743
- type: nauc_ndcg_at_3_diff1
value: 89.46540193201693
- type: nauc_ndcg_at_3_max
value: 76.87093548368378
- type: nauc_ndcg_at_3_std
value: -12.484902872086767
- type: nauc_ndcg_at_5_diff1
value: 89.39924941584766
- type: nauc_ndcg_at_5_max
value: 75.96975269092722
- type: nauc_ndcg_at_5_std
value: -8.180295581144833
- type: nauc_precision_at_1000_diff1
value: 100.0
- type: nauc_precision_at_1000_max
value: 100.0
- type: nauc_precision_at_1000_std
value: 100.0
- type: nauc_precision_at_100_diff1
value: 86.93074003795302
- type: nauc_precision_at_100_max
value: 100.0
- type: nauc_precision_at_100_std
value: -174.07785375176616
- type: nauc_precision_at_10_diff1
value: 87.43064119412082
- type: nauc_precision_at_10_max
value: 90.60785783417448
- type: nauc_precision_at_10_std
value: 15.378710059645906
- type: nauc_precision_at_1_diff1
value: 89.35436544117584
- type: nauc_precision_at_1_max
value: 70.35936815057701
- type: nauc_precision_at_1_std
value: -13.598996360976903
- type: nauc_precision_at_20_diff1
value: 78.78206037685919
- type: nauc_precision_at_20_max
value: 82.52264166455923
- type: nauc_precision_at_20_std
value: -5.95806599216658
- type: nauc_precision_at_3_diff1
value: 90.12709256456401
- type: nauc_precision_at_3_max
value: 90.72678805838154
- type: nauc_precision_at_3_std
value: -11.047599315631993
- type: nauc_precision_at_5_diff1
value: 89.9066873566561
- type: nauc_precision_at_5_max
value: 93.51571626543664
- type: nauc_precision_at_5_std
value: 22.632403279126162
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 86.93074003793416
- type: nauc_recall_at_100_max
value: 100.0
- type: nauc_recall_at_100_std
value: -174.07785375175723
- type: nauc_recall_at_10_diff1
value: 87.43064119411991
- type: nauc_recall_at_10_max
value: 90.60785783417579
- type: nauc_recall_at_10_std
value: 15.378710059643607
- type: nauc_recall_at_1_diff1
value: 89.35436544117584
- type: nauc_recall_at_1_max
value: 70.35936815057701
- type: nauc_recall_at_1_std
value: -13.598996360976903
- type: nauc_recall_at_20_diff1
value: 78.78206037685645
- type: nauc_recall_at_20_max
value: 82.52264166455791
- type: nauc_recall_at_20_std
value: -5.958065992168697
- type: nauc_recall_at_3_diff1
value: 90.12709256456463
- type: nauc_recall_at_3_max
value: 90.7267880583832
- type: nauc_recall_at_3_std
value: -11.047599315631881
- type: nauc_recall_at_5_diff1
value: 89.90668735665676
- type: nauc_recall_at_5_max
value: 93.51571626543753
- type: nauc_recall_at_5_std
value: 22.632403279126112
- type: ndcg_at_1
value: 90.789
- type: ndcg_at_10
value: 95.46
- type: ndcg_at_100
value: 95.652
- type: ndcg_at_1000
value: 95.659
- type: ndcg_at_20
value: 95.575
- type: ndcg_at_3
value: 94.82000000000001
- type: ndcg_at_5
value: 95.26400000000001
- type: precision_at_1
value: 90.789
- type: precision_at_10
value: 9.908999999999999
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.977
- type: precision_at_3
value: 32.471
- type: precision_at_5
value: 19.701
- type: recall_at_1
value: 90.789
- type: recall_at_10
value: 99.093
- type: recall_at_100
value: 99.955
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 99.546
- type: recall_at_3
value: 97.414
- type: recall_at_5
value: 98.503
- task:
type: STS
dataset:
name: MTEB GermanSTSBenchmark (default)
type: jinaai/german-STSbenchmark
config: default
split: test
revision: e36907544d44c3a247898ed81540310442329e20
metrics:
- type: cosine_pearson
value: 86.55319003300265
- type: cosine_spearman
value: 87.50267373081324
- type: euclidean_pearson
value: 87.41630636501863
- type: euclidean_spearman
value: 88.02170803409365
- type: main_score
value: 87.50267373081324
- type: manhattan_pearson
value: 87.33703179056744
- type: manhattan_spearman
value: 87.99192826922514
- type: pearson
value: 86.55319003300265
- type: spearman
value: 87.50267373081324
- task:
type: Clustering
dataset:
name: MTEB HALClusteringS2S (default)
type: lyon-nlp/clustering-hal-s2s
config: default
split: test
revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915
metrics:
- type: main_score
value: 27.477557517301303
- type: v_measure
value: 27.477557517301303
- type: v_measure_std
value: 3.3525736581861336
- task:
type: Classification
dataset:
name: MTEB HeadlineClassification (default)
type: ai-forever/headline-classification
config: default
split: test
revision: 2fe05ee6b5832cda29f2ef7aaad7b7fe6a3609eb
metrics:
- type: accuracy
value: 75.0830078125
- type: f1
value: 75.08863209267814
- type: f1_weighted
value: 75.08895979060917
- type: main_score
value: 75.0830078125
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA (default)
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 38.143
- type: map_at_10
value: 55.916999999999994
- type: map_at_100
value: 56.706
- type: map_at_1000
value: 56.77100000000001
- type: map_at_20
value: 56.367
- type: map_at_3
value: 53.111
- type: map_at_5
value: 54.839000000000006
- type: mrr_at_1
value: 76.286
- type: mrr_at_10
value: 81.879
- type: mrr_at_100
value: 82.09100000000001
- type: mrr_at_1000
value: 82.101
- type: mrr_at_20
value: 82.01
- type: mrr_at_3
value: 80.972
- type: mrr_at_5
value: 81.537
- type: ndcg_at_1
value: 76.286
- type: ndcg_at_10
value: 64.673
- type: ndcg_at_100
value: 67.527
- type: ndcg_at_1000
value: 68.857
- type: ndcg_at_20
value: 65.822
- type: ndcg_at_3
value: 60.616
- type: ndcg_at_5
value: 62.827999999999996
- type: precision_at_1
value: 76.286
- type: precision_at_10
value: 13.196
- type: precision_at_100
value: 1.544
- type: precision_at_1000
value: 0.172
- type: precision_at_20
value: 6.968000000000001
- type: precision_at_3
value: 37.992
- type: precision_at_5
value: 24.54
- type: recall_at_1
value: 38.143
- type: recall_at_10
value: 65.982
- type: recall_at_100
value: 77.225
- type: recall_at_1000
value: 86.077
- type: recall_at_20
value: 69.68299999999999
- type: recall_at_3
value: 56.989000000000004
- type: recall_at_5
value: 61.35
- type: main_score
value: 64.673
- task:
type: Classification
dataset:
name: MTEB IFlyTek (default)
type: C-MTEB/IFlyTek-classification
config: default
split: validation
revision: 421605374b29664c5fc098418fe20ada9bd55f8a
metrics:
- type: accuracy
value: 41.67756829549827
- type: f1
value: 33.929325579581636
- type: f1_weighted
value: 43.03952025643197
- type: main_score
value: 41.67756829549827
- task:
type: Classification
dataset:
name: MTEB ImdbClassification (default)
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 91.90440000000001
- type: ap
value: 88.78663714603425
- type: ap_weighted
value: 88.78663714603425
- type: f1
value: 91.89564361975891
- type: f1_weighted
value: 91.89564361975891
- type: main_score
value: 91.90440000000001
- task:
type: Classification
dataset:
name: MTEB InappropriatenessClassification (default)
type: ai-forever/inappropriateness-classification
config: default
split: test
revision: 601651fdc45ef243751676e62dd7a19f491c0285
metrics:
- type: accuracy
value: 61.0498046875
- type: ap
value: 57.04240566648215
- type: ap_weighted
value: 57.04240566648215
- type: f1
value: 60.867630038606954
- type: f1_weighted
value: 60.867630038606954
- type: main_score
value: 61.0498046875
- task:
type: Classification
dataset:
name: MTEB JDReview (default)
type: C-MTEB/JDReview-classification
config: default
split: test
revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
metrics:
- type: accuracy
value: 83.50844277673546
- type: ap
value: 48.46732380712268
- type: ap_weighted
value: 48.46732380712268
- type: f1
value: 77.43967451387445
- type: f1_weighted
value: 84.78462929014114
- type: main_score
value: 83.50844277673546
- task:
type: Classification
dataset:
name: MTEB KinopoiskClassification (default)
type: ai-forever/kinopoisk-sentiment-classification
config: default
split: test
revision: 5911f26666ac11af46cb9c6849d0dc80a378af24
metrics:
- type: accuracy
value: 62.393333333333324
- type: f1
value: 61.35940129568015
- type: f1_weighted
value: 61.35940129568015
- type: main_score
value: 62.393333333333324
- task:
type: STS
dataset:
name: MTEB LCQMC (default)
type: C-MTEB/LCQMC
config: default
split: test
revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
metrics:
- type: cosine_pearson
value: 67.74375505907872
- type: cosine_spearman
value: 75.94582231399434
- type: euclidean_pearson
value: 74.52501692443582
- type: euclidean_spearman
value: 75.88428434746646
- type: main_score
value: 75.94582231399434
- type: manhattan_pearson
value: 74.55015441749529
- type: manhattan_spearman
value: 75.83288262176175
- type: pearson
value: 67.74375505907872
- type: spearman
value: 75.94582231399434
- task:
type: Retrieval
dataset:
name: MTEB LEMBNarrativeQARetrieval (default)
type: dwzhu/LongEmbed
config: default
split: test
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: map_at_1
value: 23.093
- type: map_at_10
value: 30.227999999999998
- type: map_at_100
value: 31.423000000000002
- type: map_at_1000
value: 31.533
- type: map_at_20
value: 30.835
- type: map_at_3
value: 27.983999999999998
- type: map_at_5
value: 29.253
- type: mrr_at_1
value: 23.093
- type: mrr_at_10
value: 30.227999999999998
- type: mrr_at_100
value: 31.423000000000002
- type: mrr_at_1000
value: 31.533
- type: mrr_at_20
value: 30.835
- type: mrr_at_3
value: 27.983999999999998
- type: mrr_at_5
value: 29.253
- type: ndcg_at_1
value: 23.093
- type: ndcg_at_10
value: 34.297
- type: ndcg_at_100
value: 41.049
- type: ndcg_at_1000
value: 43.566
- type: ndcg_at_20
value: 36.52
- type: ndcg_at_3
value: 29.629
- type: ndcg_at_5
value: 31.926
- type: precision_at_1
value: 23.093
- type: precision_at_10
value: 4.735
- type: precision_at_100
value: 0.8109999999999999
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 2.8080000000000003
- type: precision_at_3
value: 11.468
- type: precision_at_5
value: 8.001
- type: recall_at_1
value: 23.093
- type: recall_at_10
value: 47.354
- type: recall_at_100
value: 81.147
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 56.16799999999999
- type: recall_at_3
value: 34.405
- type: recall_at_5
value: 40.004
- type: main_score
value: 34.297
- type: map_at_1
value: 24.361
- type: map_at_10
value: 33.641
- type: map_at_100
value: 35.104
- type: map_at_1000
value: 35.127
- type: map_at_20
value: 34.388999999999996
- type: map_at_3
value: 30.255
- type: map_at_5
value: 32.079
- type: mrr_at_1
value: 24.361
- type: mrr_at_10
value: 33.641
- type: mrr_at_100
value: 35.104
- type: mrr_at_1000
value: 35.127
- type: mrr_at_20
value: 34.388999999999996
- type: mrr_at_3
value: 30.255
- type: mrr_at_5
value: 32.079
- type: ndcg_at_1
value: 24.361
- type: ndcg_at_10
value: 39.337
- type: ndcg_at_100
value: 47.384
- type: ndcg_at_1000
value: 47.75
- type: ndcg_at_20
value: 42.077999999999996
- type: ndcg_at_3
value: 32.235
- type: ndcg_at_5
value: 35.524
- type: precision_at_1
value: 24.361
- type: precision_at_10
value: 5.783
- type: precision_at_100
value: 0.975
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 3.435
- type: precision_at_3
value: 12.661
- type: precision_at_5
value: 9.193999999999999
- type: recall_at_1
value: 24.361
- type: recall_at_10
value: 57.826
- type: recall_at_100
value: 97.51100000000001
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 68.697
- type: recall_at_3
value: 37.983
- type: recall_at_5
value: 45.972
- type: main_score
value: 39.337
- type: map_at_1
value: 53.667
- type: map_at_10
value: 61.719
- type: map_at_100
value: 62.471
- type: map_at_1000
value: 62.492000000000004
- type: map_at_20
value: 62.153000000000006
- type: map_at_3
value: 59.167
- type: map_at_5
value: 60.95
- type: mrr_at_1
value: 53.667
- type: mrr_at_10
value: 61.719
- type: mrr_at_100
value: 62.471
- type: mrr_at_1000
value: 62.492000000000004
- type: mrr_at_20
value: 62.153000000000006
- type: mrr_at_3
value: 59.167
- type: mrr_at_5
value: 60.95
- type: ndcg_at_1
value: 53.667
- type: ndcg_at_10
value: 66.018
- type: ndcg_at_100
value: 69.726
- type: ndcg_at_1000
value: 70.143
- type: ndcg_at_20
value: 67.61399999999999
- type: ndcg_at_3
value: 60.924
- type: ndcg_at_5
value: 64.10900000000001
- type: precision_at_1
value: 53.667
- type: precision_at_10
value: 7.9670000000000005
- type: precision_at_100
value: 0.97
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.3
- type: precision_at_3
value: 22.0
- type: precision_at_5
value: 14.732999999999999
- type: recall_at_1
value: 53.667
- type: recall_at_10
value: 79.667
- type: recall_at_100
value: 97.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 86.0
- type: recall_at_3
value: 66.0
- type: recall_at_5
value: 73.667
- type: main_score
value: 66.018
- task:
type: Retrieval
dataset:
name: MTEB LEMBNeedleRetrieval (default)
type: dwzhu/LongEmbed
config: default
split: test_256
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: map_at_1
value: 64.0
- type: map_at_10
value: 77.083
- type: map_at_100
value: 77.265
- type: map_at_1000
value: 77.265
- type: map_at_20
value: 77.265
- type: map_at_3
value: 76.333
- type: map_at_5
value: 76.833
- type: mrr_at_1
value: 64.0
- type: mrr_at_10
value: 77.083
- type: mrr_at_100
value: 77.265
- type: mrr_at_1000
value: 77.265
- type: mrr_at_20
value: 77.265
- type: mrr_at_3
value: 76.333
- type: mrr_at_5
value: 76.833
- type: ndcg_at_1
value: 64.0
- type: ndcg_at_10
value: 82.325
- type: ndcg_at_100
value: 82.883
- type: ndcg_at_1000
value: 82.883
- type: ndcg_at_20
value: 82.883
- type: ndcg_at_3
value: 80.833
- type: ndcg_at_5
value: 81.694
- type: precision_at_1
value: 64.0
- type: precision_at_10
value: 9.8
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 5.0
- type: precision_at_3
value: 31.333
- type: precision_at_5
value: 19.2
- type: recall_at_1
value: 64.0
- type: recall_at_10
value: 98.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 100.0
- type: recall_at_3
value: 94.0
- type: recall_at_5
value: 96.0
- type: main_score
value: 64.0
- type: map_at_1
value: 100.0
- type: map_at_10
value: 100.0
- type: map_at_100
value: 100.0
- type: map_at_1000
value: 100.0
- type: map_at_20
value: 100.0
- type: map_at_3
value: 100.0
- type: map_at_5
value: 100.0
- type: mrr_at_1
value: 100.0
- type: mrr_at_10
value: 100.0
- type: mrr_at_100
value: 100.0
- type: mrr_at_1000
value: 100.0
- type: mrr_at_20
value: 100.0
- type: mrr_at_3
value: 100.0
- type: mrr_at_5
value: 100.0
- type: ndcg_at_1
value: 100.0
- type: ndcg_at_10
value: 100.0
- type: ndcg_at_100
value: 100.0
- type: ndcg_at_1000
value: 100.0
- type: ndcg_at_20
value: 100.0
- type: ndcg_at_3
value: 100.0
- type: ndcg_at_5
value: 100.0
- type: precision_at_1
value: 100.0
- type: precision_at_10
value: 10.0
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 5.0
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 20.0
- type: recall_at_1
value: 100.0
- type: recall_at_10
value: 100.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 100.0
- type: recall_at_3
value: 100.0
- type: recall_at_5
value: 100.0
- type: main_score
value: 100.0
- task:
type: Retrieval
dataset:
name: MTEB LEMBSummScreenFDRetrieval (default)
type: dwzhu/LongEmbed
config: default
split: validation
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: map_at_1
value: 84.821
- type: map_at_10
value: 90.11200000000001
- type: map_at_100
value: 90.158
- type: map_at_1000
value: 90.158
- type: map_at_20
value: 90.137
- type: map_at_3
value: 89.385
- type: map_at_5
value: 89.876
- type: mrr_at_1
value: 84.821
- type: mrr_at_10
value: 90.11200000000001
- type: mrr_at_100
value: 90.158
- type: mrr_at_1000
value: 90.158
- type: mrr_at_20
value: 90.137
- type: mrr_at_3
value: 89.385
- type: mrr_at_5
value: 89.876
- type: ndcg_at_1
value: 84.821
- type: ndcg_at_10
value: 92.334
- type: ndcg_at_100
value: 92.535
- type: ndcg_at_1000
value: 92.535
- type: ndcg_at_20
value: 92.414
- type: ndcg_at_3
value: 90.887
- type: ndcg_at_5
value: 91.758
- type: precision_at_1
value: 84.821
- type: precision_at_10
value: 9.911
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.97
- type: precision_at_3
value: 31.746000000000002
- type: precision_at_5
value: 19.464000000000002
- type: recall_at_1
value: 84.821
- type: recall_at_10
value: 99.107
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 99.405
- type: recall_at_3
value: 95.238
- type: recall_at_5
value: 97.321
- type: main_score
value: 92.334
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (deu-deu)
type: facebook/mlqa
config: deu-deu
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 67.548
- type: map_at_1
value: 56.559000000000005
- type: map_at_10
value: 63.867
- type: map_at_100
value: 64.429
- type: map_at_1000
value: 64.457
- type: map_at_20
value: 64.215
- type: map_at_3
value: 62.109
- type: map_at_5
value: 63.101
- type: mrr_at_1
value: 56.56990915134057
- type: mrr_at_10
value: 63.86820789324668
- type: mrr_at_100
value: 64.42973602152581
- type: mrr_at_1000
value: 64.45818598090155
- type: mrr_at_20
value: 64.2163052263868
- type: mrr_at_3
value: 62.10946155550634
- type: mrr_at_5
value: 63.10104143585199
- type: nauc_map_at_1000_diff1
value: 73.78440163370111
- type: nauc_map_at_1000_max
value: 66.37875518052162
- type: nauc_map_at_1000_std
value: -17.063915098135396
- type: nauc_map_at_100_diff1
value: 73.77180802985815
- type: nauc_map_at_100_max
value: 66.38365998362033
- type: nauc_map_at_100_std
value: -17.053345109661972
- type: nauc_map_at_10_diff1
value: 73.70041876696037
- type: nauc_map_at_10_max
value: 66.33213342705997
- type: nauc_map_at_10_std
value: -17.40657791273925
- type: nauc_map_at_1_diff1
value: 76.8784374396948
- type: nauc_map_at_1_max
value: 64.07170606935357
- type: nauc_map_at_1_std
value: -18.464213686790654
- type: nauc_map_at_20_diff1
value: 73.72371377231813
- type: nauc_map_at_20_max
value: 66.42108121059451
- type: nauc_map_at_20_std
value: -17.05384923889036
- type: nauc_map_at_3_diff1
value: 74.08287018839246
- type: nauc_map_at_3_max
value: 66.42422337760333
- type: nauc_map_at_3_std
value: -17.79503404131652
- type: nauc_map_at_5_diff1
value: 73.9294779027339
- type: nauc_map_at_5_max
value: 66.51752041065726
- type: nauc_map_at_5_std
value: -17.67309805113804
- type: nauc_mrr_at_1000_diff1
value: 73.78389736923545
- type: nauc_mrr_at_1000_max
value: 66.37929720858341
- type: nauc_mrr_at_1000_std
value: -17.058591711291278
- type: nauc_mrr_at_100_diff1
value: 73.77126451253136
- type: nauc_mrr_at_100_max
value: 66.38405917246607
- type: nauc_mrr_at_100_std
value: -17.047251035212863
- type: nauc_mrr_at_10_diff1
value: 73.69960470665124
- type: nauc_mrr_at_10_max
value: 66.33265194210313
- type: nauc_mrr_at_10_std
value: -17.399659076827998
- type: nauc_mrr_at_1_diff1
value: 76.8689850260726
- type: nauc_mrr_at_1_max
value: 64.09858188287487
- type: nauc_mrr_at_1_std
value: -18.46064784201847
- type: nauc_mrr_at_20_diff1
value: 73.72312682063128
- type: nauc_mrr_at_20_max
value: 66.42181932858745
- type: nauc_mrr_at_20_std
value: -17.04690257511092
- type: nauc_mrr_at_3_diff1
value: 74.08287018839246
- type: nauc_mrr_at_3_max
value: 66.42422337760333
- type: nauc_mrr_at_3_std
value: -17.79503404131652
- type: nauc_mrr_at_5_diff1
value: 73.9294779027339
- type: nauc_mrr_at_5_max
value: 66.51752041065726
- type: nauc_mrr_at_5_std
value: -17.67309805113804
- type: nauc_ndcg_at_1000_diff1
value: 72.97825548342801
- type: nauc_ndcg_at_1000_max
value: 66.96275437178257
- type: nauc_ndcg_at_1000_std
value: -15.611902299641587
- type: nauc_ndcg_at_100_diff1
value: 72.58724738936613
- type: nauc_ndcg_at_100_max
value: 67.16774012704182
- type: nauc_ndcg_at_100_std
value: -14.945088654796812
- type: nauc_ndcg_at_10_diff1
value: 72.16253640477947
- type: nauc_ndcg_at_10_max
value: 67.01746849484621
- type: nauc_ndcg_at_10_std
value: -16.46102507270809
- type: nauc_ndcg_at_1_diff1
value: 76.8689850260726
- type: nauc_ndcg_at_1_max
value: 64.09858188287487
- type: nauc_ndcg_at_1_std
value: -18.46064784201847
- type: nauc_ndcg_at_20_diff1
value: 72.19995325129975
- type: nauc_ndcg_at_20_max
value: 67.39639713797962
- type: nauc_ndcg_at_20_std
value: -15.091689370748531
- type: nauc_ndcg_at_3_diff1
value: 73.13123604206514
- type: nauc_ndcg_at_3_max
value: 67.23123167871547
- type: nauc_ndcg_at_3_std
value: -17.492755234009156
- type: nauc_ndcg_at_5_diff1
value: 72.8154718929895
- type: nauc_ndcg_at_5_max
value: 67.44578008373777
- type: nauc_ndcg_at_5_std
value: -17.251840358751362
- type: nauc_precision_at_1000_diff1
value: 47.89748325983604
- type: nauc_precision_at_1000_max
value: 70.47466197804906
- type: nauc_precision_at_1000_std
value: 72.66193512114775
- type: nauc_precision_at_100_diff1
value: 59.493743734005356
- type: nauc_precision_at_100_max
value: 74.02140147220713
- type: nauc_precision_at_100_std
value: 17.26664098026236
- type: nauc_precision_at_10_diff1
value: 64.94415011040277
- type: nauc_precision_at_10_max
value: 69.6963814950747
- type: nauc_precision_at_10_std
value: -11.663043657012954
- type: nauc_precision_at_1_diff1
value: 76.8689850260726
- type: nauc_precision_at_1_max
value: 64.09858188287487
- type: nauc_precision_at_1_std
value: -18.46064784201847
- type: nauc_precision_at_20_diff1
value: 63.145886909986416
- type: nauc_precision_at_20_max
value: 72.95708033630744
- type: nauc_precision_at_20_std
value: -1.5039593629280323
- type: nauc_precision_at_3_diff1
value: 69.88902201644449
- type: nauc_precision_at_3_max
value: 69.80499971089935
- type: nauc_precision_at_3_std
value: -16.444680766676647
- type: nauc_precision_at_5_diff1
value: 68.60869967062919
- type: nauc_precision_at_5_max
value: 70.75998207564281
- type: nauc_precision_at_5_std
value: -15.62613396998262
- type: nauc_recall_at_1000_diff1
value: 62.6646436338833
- type: nauc_recall_at_1000_max
value: 86.17801636476078
- type: nauc_recall_at_1000_std
value: 71.84718775540334
- type: nauc_recall_at_100_diff1
value: 61.110492191439505
- type: nauc_recall_at_100_max
value: 75.45730686603042
- type: nauc_recall_at_100_std
value: 16.202465011589428
- type: nauc_recall_at_10_diff1
value: 65.1522196516815
- type: nauc_recall_at_10_max
value: 69.7626435962161
- type: nauc_recall_at_10_std
value: -11.801178474770449
- type: nauc_recall_at_1_diff1
value: 76.8784374396948
- type: nauc_recall_at_1_max
value: 64.07170606935357
- type: nauc_recall_at_1_std
value: -18.464213686790654
- type: nauc_recall_at_20_diff1
value: 63.40332739504143
- type: nauc_recall_at_20_max
value: 73.04113661090965
- type: nauc_recall_at_20_std
value: -1.6609741140266947
- type: nauc_recall_at_3_diff1
value: 70.03728086098866
- type: nauc_recall_at_3_max
value: 69.85953774320521
- type: nauc_recall_at_3_std
value: -16.482993123411706
- type: nauc_recall_at_5_diff1
value: 68.77396121765933
- type: nauc_recall_at_5_max
value: 70.8231205493519
- type: nauc_recall_at_5_std
value: -15.668037770700863
- type: ndcg_at_1
value: 56.57
- type: ndcg_at_10
value: 67.548
- type: ndcg_at_100
value: 70.421
- type: ndcg_at_1000
value: 71.198
- type: ndcg_at_20
value: 68.829
- type: ndcg_at_3
value: 63.88700000000001
- type: ndcg_at_5
value: 65.689
- type: precision_at_1
value: 56.57
- type: precision_at_10
value: 7.922
- type: precision_at_100
value: 0.9299999999999999
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.216
- type: precision_at_3
value: 23.015
- type: precision_at_5
value: 14.691
- type: recall_at_1
value: 56.559000000000005
- type: recall_at_10
value: 79.182
- type: recall_at_100
value: 92.946
- type: recall_at_1000
value: 99.092
- type: recall_at_20
value: 84.27900000000001
- type: recall_at_3
value: 69.023
- type: recall_at_5
value: 73.432
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (deu-spa)
type: facebook/mlqa
config: deu-spa
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 70.645
- type: map_at_1
value: 58.423
- type: map_at_10
value: 66.613
- type: map_at_100
value: 67.14099999999999
- type: map_at_1000
value: 67.161
- type: map_at_20
value: 66.965
- type: map_at_3
value: 64.714
- type: map_at_5
value: 65.835
- type: mrr_at_1
value: 58.4225352112676
- type: mrr_at_10
value: 66.61321260898735
- type: mrr_at_100
value: 67.13991570812132
- type: mrr_at_1000
value: 67.1598532168174
- type: mrr_at_20
value: 66.96384710024888
- type: mrr_at_3
value: 64.71361502347425
- type: mrr_at_5
value: 65.83474178403769
- type: nauc_map_at_1000_diff1
value: 73.9485117118935
- type: nauc_map_at_1000_max
value: 65.74479869396299
- type: nauc_map_at_1000_std
value: -20.300269749495563
- type: nauc_map_at_100_diff1
value: 73.93900406302829
- type: nauc_map_at_100_max
value: 65.75508449194885
- type: nauc_map_at_100_std
value: -20.265330791570175
- type: nauc_map_at_10_diff1
value: 73.84863233472605
- type: nauc_map_at_10_max
value: 65.89377317378211
- type: nauc_map_at_10_std
value: -20.404123131964695
- type: nauc_map_at_1_diff1
value: 76.73627284218519
- type: nauc_map_at_1_max
value: 62.94957512510876
- type: nauc_map_at_1_std
value: -20.99649749330682
- type: nauc_map_at_20_diff1
value: 73.88712006109598
- type: nauc_map_at_20_max
value: 65.82057018162664
- type: nauc_map_at_20_std
value: -20.269476512431915
- type: nauc_map_at_3_diff1
value: 74.21419190161502
- type: nauc_map_at_3_max
value: 65.64993368062119
- type: nauc_map_at_3_std
value: -21.34641749007071
- type: nauc_map_at_5_diff1
value: 74.0119419385777
- type: nauc_map_at_5_max
value: 65.69809416369732
- type: nauc_map_at_5_std
value: -21.16901556082261
- type: nauc_mrr_at_1000_diff1
value: 73.94915184134923
- type: nauc_mrr_at_1000_max
value: 65.74522469633418
- type: nauc_mrr_at_1000_std
value: -20.303028367132246
- type: nauc_mrr_at_100_diff1
value: 73.93964394728808
- type: nauc_mrr_at_100_max
value: 65.75550992323707
- type: nauc_mrr_at_100_std
value: -20.26808820438918
- type: nauc_mrr_at_10_diff1
value: 73.84863233472605
- type: nauc_mrr_at_10_max
value: 65.89377317378211
- type: nauc_mrr_at_10_std
value: -20.404123131964695
- type: nauc_mrr_at_1_diff1
value: 76.73627284218519
- type: nauc_mrr_at_1_max
value: 62.94957512510876
- type: nauc_mrr_at_1_std
value: -20.99649749330682
- type: nauc_mrr_at_20_diff1
value: 73.88775721128745
- type: nauc_mrr_at_20_max
value: 65.820991355628
- type: nauc_mrr_at_20_std
value: -20.272216587019734
- type: nauc_mrr_at_3_diff1
value: 74.21419190161502
- type: nauc_mrr_at_3_max
value: 65.64993368062119
- type: nauc_mrr_at_3_std
value: -21.34641749007071
- type: nauc_mrr_at_5_diff1
value: 74.0119419385777
- type: nauc_mrr_at_5_max
value: 65.69809416369732
- type: nauc_mrr_at_5_std
value: -21.16901556082261
- type: nauc_ndcg_at_1000_diff1
value: 73.29396365944277
- type: nauc_ndcg_at_1000_max
value: 66.44879592109541
- type: nauc_ndcg_at_1000_std
value: -19.285991058788195
- type: nauc_ndcg_at_100_diff1
value: 73.0159172721162
- type: nauc_ndcg_at_100_max
value: 66.76216389231388
- type: nauc_ndcg_at_100_std
value: -18.27931368094887
- type: nauc_ndcg_at_10_diff1
value: 72.42096650774693
- type: nauc_ndcg_at_10_max
value: 67.48592688463306
- type: nauc_ndcg_at_10_std
value: -18.91453756077581
- type: nauc_ndcg_at_1_diff1
value: 76.73627284218519
- type: nauc_ndcg_at_1_max
value: 62.94957512510876
- type: nauc_ndcg_at_1_std
value: -20.99649749330682
- type: nauc_ndcg_at_20_diff1
value: 72.53699362385684
- type: nauc_ndcg_at_20_max
value: 67.22763976357872
- type: nauc_ndcg_at_20_std
value: -18.299910635008338
- type: nauc_ndcg_at_3_diff1
value: 73.3698453761989
- type: nauc_ndcg_at_3_max
value: 66.71056987289383
- type: nauc_ndcg_at_3_std
value: -21.405154376652803
- type: nauc_ndcg_at_5_diff1
value: 72.9491030712935
- type: nauc_ndcg_at_5_max
value: 66.85786103137077
- type: nauc_ndcg_at_5_std
value: -21.04005053344073
- type: nauc_precision_at_1000_diff1
value: 17.02462370967451
- type: nauc_precision_at_1000_max
value: 48.03260752496052
- type: nauc_precision_at_1000_std
value: 87.56077915079334
- type: nauc_precision_at_100_diff1
value: 58.590352501194985
- type: nauc_precision_at_100_max
value: 78.2649015433222
- type: nauc_precision_at_100_std
value: 28.05030453158992
- type: nauc_precision_at_10_diff1
value: 64.89497928764766
- type: nauc_precision_at_10_max
value: 75.93257124951242
- type: nauc_precision_at_10_std
value: -9.825306994117462
- type: nauc_precision_at_1_diff1
value: 76.73627284218519
- type: nauc_precision_at_1_max
value: 62.94957512510876
- type: nauc_precision_at_1_std
value: -20.99649749330682
- type: nauc_precision_at_20_diff1
value: 62.11366204321558
- type: nauc_precision_at_20_max
value: 75.9571427846493
- type: nauc_precision_at_20_std
value: -0.94585212808191
- type: nauc_precision_at_3_diff1
value: 70.52940972112398
- type: nauc_precision_at_3_max
value: 70.3402053170779
- type: nauc_precision_at_3_std
value: -21.579778424241304
- type: nauc_precision_at_5_diff1
value: 68.78962580223575
- type: nauc_precision_at_5_max
value: 71.41410894398376
- type: nauc_precision_at_5_std
value: -20.415603405161956
- type: nauc_recall_at_1000_diff1
value: 55.88625447348128
- type: nauc_recall_at_1000_max
value: 100.0
- type: nauc_recall_at_1000_std
value: 100.0
- type: nauc_recall_at_100_diff1
value: 61.17942268389525
- type: nauc_recall_at_100_max
value: 81.12207841563487
- type: nauc_recall_at_100_std
value: 27.141215257528113
- type: nauc_recall_at_10_diff1
value: 64.8949792876478
- type: nauc_recall_at_10_max
value: 75.93257124951249
- type: nauc_recall_at_10_std
value: -9.825306994117323
- type: nauc_recall_at_1_diff1
value: 76.73627284218519
- type: nauc_recall_at_1_max
value: 62.94957512510876
- type: nauc_recall_at_1_std
value: -20.99649749330682
- type: nauc_recall_at_20_diff1
value: 63.07808719241162
- type: nauc_recall_at_20_max
value: 76.96808746317542
- type: nauc_recall_at_20_std
value: -1.5235053258631275
- type: nauc_recall_at_3_diff1
value: 70.52940972112405
- type: nauc_recall_at_3_max
value: 70.3402053170779
- type: nauc_recall_at_3_std
value: -21.57977842424124
- type: nauc_recall_at_5_diff1
value: 68.78962580223575
- type: nauc_recall_at_5_max
value: 71.41410894398392
- type: nauc_recall_at_5_std
value: -20.415603405161793
- type: ndcg_at_1
value: 58.423
- type: ndcg_at_10
value: 70.645
- type: ndcg_at_100
value: 73.277
- type: ndcg_at_1000
value: 73.785
- type: ndcg_at_20
value: 71.918
- type: ndcg_at_3
value: 66.679
- type: ndcg_at_5
value: 68.72200000000001
- type: precision_at_1
value: 58.423
- type: precision_at_10
value: 8.338
- type: precision_at_100
value: 0.959
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.423
- type: precision_at_3
value: 24.113
- type: precision_at_5
value: 15.47
- type: recall_at_1
value: 58.423
- type: recall_at_10
value: 83.38
- type: recall_at_100
value: 95.887
- type: recall_at_1000
value: 99.831
- type: recall_at_20
value: 88.39399999999999
- type: recall_at_3
value: 72.33800000000001
- type: recall_at_5
value: 77.352
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (deu-eng)
type: facebook/mlqa
config: deu-eng
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 67.067
- type: map_at_1
value: 55.861000000000004
- type: map_at_10
value: 63.42100000000001
- type: map_at_100
value: 64.03
- type: map_at_1000
value: 64.05999999999999
- type: map_at_20
value: 63.819
- type: map_at_3
value: 61.773
- type: map_at_5
value: 62.736999999999995
- type: mrr_at_1
value: 55.88300465322402
- type: mrr_at_10
value: 63.43111082973707
- type: mrr_at_100
value: 64.03962373590272
- type: mrr_at_1000
value: 64.0698259866376
- type: mrr_at_20
value: 63.82871766489112
- type: mrr_at_3
value: 61.78447448112865
- type: mrr_at_5
value: 62.74835659945346
- type: nauc_map_at_1000_diff1
value: 74.58505763417352
- type: nauc_map_at_1000_max
value: 66.26060764852198
- type: nauc_map_at_1000_std
value: -16.896178230873897
- type: nauc_map_at_100_diff1
value: 74.57057487892857
- type: nauc_map_at_100_max
value: 66.26600433283826
- type: nauc_map_at_100_std
value: -16.87596113104189
- type: nauc_map_at_10_diff1
value: 74.53453636322749
- type: nauc_map_at_10_max
value: 66.27501737773804
- type: nauc_map_at_10_std
value: -17.178743257781775
- type: nauc_map_at_1_diff1
value: 77.63067209375254
- type: nauc_map_at_1_max
value: 64.17718675702672
- type: nauc_map_at_1_std
value: -17.639521106853717
- type: nauc_map_at_20_diff1
value: 74.52007402431164
- type: nauc_map_at_20_max
value: 66.28276291359268
- type: nauc_map_at_20_std
value: -16.939292897754758
- type: nauc_map_at_3_diff1
value: 74.79187974631951
- type: nauc_map_at_3_max
value: 66.23256568210611
- type: nauc_map_at_3_std
value: -17.894889918934112
- type: nauc_map_at_5_diff1
value: 74.63011328882517
- type: nauc_map_at_5_max
value: 66.35411054978499
- type: nauc_map_at_5_std
value: -17.50140342194211
- type: nauc_mrr_at_1000_diff1
value: 74.57520089771667
- type: nauc_mrr_at_1000_max
value: 66.27270912845914
- type: nauc_mrr_at_1000_std
value: -16.84012675362397
- type: nauc_mrr_at_100_diff1
value: 74.56070964572156
- type: nauc_mrr_at_100_max
value: 66.2780701126926
- type: nauc_mrr_at_100_std
value: -16.820035083069865
- type: nauc_mrr_at_10_diff1
value: 74.52455978435117
- type: nauc_mrr_at_10_max
value: 66.28697244023137
- type: nauc_mrr_at_10_std
value: -17.122477723330523
- type: nauc_mrr_at_1_diff1
value: 77.60643512422061
- type: nauc_mrr_at_1_max
value: 64.21736966061896
- type: nauc_mrr_at_1_std
value: -17.56627338275146
- type: nauc_mrr_at_20_diff1
value: 74.5099814266373
- type: nauc_mrr_at_20_max
value: 66.29485560556576
- type: nauc_mrr_at_20_std
value: -16.882350027335306
- type: nauc_mrr_at_3_diff1
value: 74.78132817375507
- type: nauc_mrr_at_3_max
value: 66.24761860047623
- type: nauc_mrr_at_3_std
value: -17.833128575678998
- type: nauc_mrr_at_5_diff1
value: 74.6193031207433
- type: nauc_mrr_at_5_max
value: 66.36951764432901
- type: nauc_mrr_at_5_std
value: -17.438203106324227
- type: nauc_ndcg_at_1000_diff1
value: 73.79386161629151
- type: nauc_ndcg_at_1000_max
value: 66.84013038018082
- type: nauc_ndcg_at_1000_std
value: -15.387358822700667
- type: nauc_ndcg_at_100_diff1
value: 73.36132885277745
- type: nauc_ndcg_at_100_max
value: 67.04416926901568
- type: nauc_ndcg_at_100_std
value: -14.503256942521972
- type: nauc_ndcg_at_10_diff1
value: 73.11847332785027
- type: nauc_ndcg_at_10_max
value: 67.02149621303091
- type: nauc_ndcg_at_10_std
value: -16.142234662067782
- type: nauc_ndcg_at_1_diff1
value: 77.60643512422061
- type: nauc_ndcg_at_1_max
value: 64.21736966061896
- type: nauc_ndcg_at_1_std
value: -17.56627338275146
- type: nauc_ndcg_at_20_diff1
value: 72.97961452569768
- type: nauc_ndcg_at_20_max
value: 67.12369127081152
- type: nauc_ndcg_at_20_std
value: -15.11921773223936
- type: nauc_ndcg_at_3_diff1
value: 73.77769312598772
- type: nauc_ndcg_at_3_max
value: 66.94438755852309
- type: nauc_ndcg_at_3_std
value: -17.75960443830741
- type: nauc_ndcg_at_5_diff1
value: 73.43991209562891
- type: nauc_ndcg_at_5_max
value: 67.21682951737418
- type: nauc_ndcg_at_5_std
value: -17.013510008231805
- type: nauc_precision_at_1000_diff1
value: 51.30633281948362
- type: nauc_precision_at_1000_max
value: 76.78675288883846
- type: nauc_precision_at_1000_std
value: 71.70041985304397
- type: nauc_precision_at_100_diff1
value: 59.86656455853326
- type: nauc_precision_at_100_max
value: 74.41958422732161
- type: nauc_precision_at_100_std
value: 22.098920296069124
- type: nauc_precision_at_10_diff1
value: 66.4696166928741
- type: nauc_precision_at_10_max
value: 69.88463108697104
- type: nauc_precision_at_10_std
value: -10.707950954702742
- type: nauc_precision_at_1_diff1
value: 77.60643512422061
- type: nauc_precision_at_1_max
value: 64.21736966061896
- type: nauc_precision_at_1_std
value: -17.56627338275146
- type: nauc_precision_at_20_diff1
value: 63.45094585276983
- type: nauc_precision_at_20_max
value: 71.57741245347195
- type: nauc_precision_at_20_std
value: -2.2211545419051744
- type: nauc_precision_at_3_diff1
value: 70.28060818081384
- type: nauc_precision_at_3_max
value: 69.22652927816439
- type: nauc_precision_at_3_std
value: -17.158576243559434
- type: nauc_precision_at_5_diff1
value: 68.90765418427162
- type: nauc_precision_at_5_max
value: 70.32585273389111
- type: nauc_precision_at_5_std
value: -14.950363729664524
- type: nauc_recall_at_1000_diff1
value: 65.11255117927331
- type: nauc_recall_at_1000_max
value: 88.35641213283338
- type: nauc_recall_at_1000_std
value: 69.89792573640547
- type: nauc_recall_at_100_diff1
value: 61.46376457272238
- type: nauc_recall_at_100_max
value: 75.48265142243015
- type: nauc_recall_at_100_std
value: 21.223182712042178
- type: nauc_recall_at_10_diff1
value: 66.89353375308997
- type: nauc_recall_at_10_max
value: 70.06655416883785
- type: nauc_recall_at_10_std
value: -11.100871879439435
- type: nauc_recall_at_1_diff1
value: 77.63067209375254
- type: nauc_recall_at_1_max
value: 64.17718675702672
- type: nauc_recall_at_1_std
value: -17.639521106853717
- type: nauc_recall_at_20_diff1
value: 63.98532276331878
- type: nauc_recall_at_20_max
value: 71.81562599791899
- type: nauc_recall_at_20_std
value: -2.696537977147695
- type: nauc_recall_at_3_diff1
value: 70.4507655865698
- type: nauc_recall_at_3_max
value: 69.25705030141037
- type: nauc_recall_at_3_std
value: -17.299948348202836
- type: nauc_recall_at_5_diff1
value: 69.09152857901888
- type: nauc_recall_at_5_max
value: 70.35609636026405
- type: nauc_recall_at_5_std
value: -15.105012139255896
- type: ndcg_at_1
value: 55.883
- type: ndcg_at_10
value: 67.067
- type: ndcg_at_100
value: 70.07
- type: ndcg_at_1000
value: 70.875
- type: ndcg_at_20
value: 68.498
- type: ndcg_at_3
value: 63.666
- type: ndcg_at_5
value: 65.40599999999999
- type: precision_at_1
value: 55.883
- type: precision_at_10
value: 7.8549999999999995
- type: precision_at_100
value: 0.928
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.2090000000000005
- type: precision_at_3
value: 23.052
- type: precision_at_5
value: 14.677999999999999
- type: recall_at_1
value: 55.861000000000004
- type: recall_at_10
value: 78.495
- type: recall_at_100
value: 92.688
- type: recall_at_1000
value: 99.02499999999999
- type: recall_at_20
value: 84.124
- type: recall_at_3
value: 69.123
- type: recall_at_5
value: 73.355
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (spa-deu)
type: facebook/mlqa
config: spa-deu
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 73.90299999999999
- type: map_at_1
value: 61.236000000000004
- type: map_at_10
value: 69.88799999999999
- type: map_at_100
value: 70.319
- type: map_at_1000
value: 70.341
- type: map_at_20
value: 70.16799999999999
- type: map_at_3
value: 68.104
- type: map_at_5
value: 69.164
- type: mrr_at_1
value: 61.2739571589628
- type: mrr_at_10
value: 69.92589162684993
- type: mrr_at_100
value: 70.35245455509234
- type: mrr_at_1000
value: 70.37438351396742
- type: mrr_at_20
value: 70.20247469915404
- type: mrr_at_3
value: 68.14167606163099
- type: mrr_at_5
value: 69.20142803457354
- type: nauc_map_at_1000_diff1
value: 74.70416754842327
- type: nauc_map_at_1000_max
value: 65.86915994583384
- type: nauc_map_at_1000_std
value: -19.04437483534443
- type: nauc_map_at_100_diff1
value: 74.70011798058674
- type: nauc_map_at_100_max
value: 65.88507779167188
- type: nauc_map_at_100_std
value: -19.018670970643786
- type: nauc_map_at_10_diff1
value: 74.6362126804427
- type: nauc_map_at_10_max
value: 66.05733054427198
- type: nauc_map_at_10_std
value: -19.034317737897354
- type: nauc_map_at_1_diff1
value: 77.24970536833601
- type: nauc_map_at_1_max
value: 62.07820573048406
- type: nauc_map_at_1_std
value: -20.917086586335078
- type: nauc_map_at_20_diff1
value: 74.64113920401083
- type: nauc_map_at_20_max
value: 65.89991740166793
- type: nauc_map_at_20_std
value: -19.09987515041243
- type: nauc_map_at_3_diff1
value: 74.6518162332119
- type: nauc_map_at_3_max
value: 66.10312348194024
- type: nauc_map_at_3_std
value: -18.95881457716116
- type: nauc_map_at_5_diff1
value: 74.55141020670321
- type: nauc_map_at_5_max
value: 65.94345752979342
- type: nauc_map_at_5_std
value: -19.453976877992304
- type: nauc_mrr_at_1000_diff1
value: 74.64458488344088
- type: nauc_mrr_at_1000_max
value: 65.84575328456057
- type: nauc_mrr_at_1000_std
value: -18.901614615119904
- type: nauc_mrr_at_100_diff1
value: 74.64058497924627
- type: nauc_mrr_at_100_max
value: 65.86170461767928
- type: nauc_mrr_at_100_std
value: -18.87601697091505
- type: nauc_mrr_at_10_diff1
value: 74.57266634464752
- type: nauc_mrr_at_10_max
value: 66.03331587645152
- type: nauc_mrr_at_10_std
value: -18.87888060105393
- type: nauc_mrr_at_1_diff1
value: 77.19578272647183
- type: nauc_mrr_at_1_max
value: 62.05252035478773
- type: nauc_mrr_at_1_std
value: -20.790530940625267
- type: nauc_mrr_at_20_diff1
value: 74.5808171250021
- type: nauc_mrr_at_20_max
value: 65.87643606587798
- type: nauc_mrr_at_20_std
value: -18.95476583474199
- type: nauc_mrr_at_3_diff1
value: 74.5917053289191
- type: nauc_mrr_at_3_max
value: 66.08044079438714
- type: nauc_mrr_at_3_std
value: -18.81168463163586
- type: nauc_mrr_at_5_diff1
value: 74.48934579694608
- type: nauc_mrr_at_5_max
value: 65.91993162383771
- type: nauc_mrr_at_5_std
value: -19.302710791338797
- type: nauc_ndcg_at_1000_diff1
value: 74.20191283992186
- type: nauc_ndcg_at_1000_max
value: 66.60831175771229
- type: nauc_ndcg_at_1000_std
value: -18.175208725175484
- type: nauc_ndcg_at_100_diff1
value: 74.07713451642955
- type: nauc_ndcg_at_100_max
value: 67.02028626335476
- type: nauc_ndcg_at_100_std
value: -17.36560972181693
- type: nauc_ndcg_at_10_diff1
value: 73.63235521598476
- type: nauc_ndcg_at_10_max
value: 67.8118473312638
- type: nauc_ndcg_at_10_std
value: -17.647560577355915
- type: nauc_ndcg_at_1_diff1
value: 77.19578272647183
- type: nauc_ndcg_at_1_max
value: 62.05252035478773
- type: nauc_ndcg_at_1_std
value: -20.790530940625267
- type: nauc_ndcg_at_20_diff1
value: 73.65300308228291
- type: nauc_ndcg_at_20_max
value: 67.18353402731985
- type: nauc_ndcg_at_20_std
value: -17.9240756389792
- type: nauc_ndcg_at_3_diff1
value: 73.73764900202292
- type: nauc_ndcg_at_3_max
value: 67.60840957876889
- type: nauc_ndcg_at_3_std
value: -17.962667543518933
- type: nauc_ndcg_at_5_diff1
value: 73.49040500302092
- type: nauc_ndcg_at_5_max
value: 67.41251918514402
- type: nauc_ndcg_at_5_std
value: -18.851877225955523
- type: nauc_precision_at_1000_diff1
value: -18.652906102973922
- type: nauc_precision_at_1000_max
value: 2.1701672475574885
- type: nauc_precision_at_1000_std
value: 61.713411950188835
- type: nauc_precision_at_100_diff1
value: 62.37565302288498
- type: nauc_precision_at_100_max
value: 76.96921843049006
- type: nauc_precision_at_100_std
value: 19.152009040219678
- type: nauc_precision_at_10_diff1
value: 68.14047344105212
- type: nauc_precision_at_10_max
value: 77.7177273849099
- type: nauc_precision_at_10_std
value: -9.124325941493698
- type: nauc_precision_at_1_diff1
value: 77.19578272647183
- type: nauc_precision_at_1_max
value: 62.05252035478773
- type: nauc_precision_at_1_std
value: -20.790530940625267
- type: nauc_precision_at_20_diff1
value: 65.38487456362745
- type: nauc_precision_at_20_max
value: 74.61122933443669
- type: nauc_precision_at_20_std
value: -8.129775929648341
- type: nauc_precision_at_3_diff1
value: 70.45937744142297
- type: nauc_precision_at_3_max
value: 73.03004233073901
- type: nauc_precision_at_3_std
value: -14.246554579025158
- type: nauc_precision_at_5_diff1
value: 69.02821772428955
- type: nauc_precision_at_5_max
value: 73.52949774726446
- type: nauc_precision_at_5_std
value: -16.355747231517757
- type: nauc_recall_at_1000_diff1
value: 35.804192824985755
- type: nauc_recall_at_1000_max
value: 61.367785756485894
- type: nauc_recall_at_1000_std
value: 54.01380822466869
- type: nauc_recall_at_100_diff1
value: 67.96210883597479
- type: nauc_recall_at_100_max
value: 82.38124823732169
- type: nauc_recall_at_100_std
value: 16.814922595309966
- type: nauc_recall_at_10_diff1
value: 68.21964459634341
- type: nauc_recall_at_10_max
value: 77.68301934858845
- type: nauc_recall_at_10_std
value: -9.430792913885066
- type: nauc_recall_at_1_diff1
value: 77.24970536833601
- type: nauc_recall_at_1_max
value: 62.07820573048406
- type: nauc_recall_at_1_std
value: -20.917086586335078
- type: nauc_recall_at_20_diff1
value: 66.60569906579487
- type: nauc_recall_at_20_max
value: 75.66163186604354
- type: nauc_recall_at_20_std
value: -9.09826205489828
- type: nauc_recall_at_3_diff1
value: 70.52323701841641
- type: nauc_recall_at_3_max
value: 73.03478107411232
- type: nauc_recall_at_3_std
value: -14.432325989967962
- type: nauc_recall_at_5_diff1
value: 69.08521261524373
- type: nauc_recall_at_5_max
value: 73.51150270382094
- type: nauc_recall_at_5_std
value: -16.569387503524368
- type: ndcg_at_1
value: 61.273999999999994
- type: ndcg_at_10
value: 73.90299999999999
- type: ndcg_at_100
value: 75.983
- type: ndcg_at_1000
value: 76.488
- type: ndcg_at_20
value: 74.921
- type: ndcg_at_3
value: 70.277
- type: ndcg_at_5
value: 72.172
- type: precision_at_1
value: 61.273999999999994
- type: precision_at_10
value: 8.641
- type: precision_at_100
value: 0.962
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.524
- type: precision_at_3
value: 25.517
- type: precision_at_5
value: 16.223000000000003
- type: recall_at_1
value: 61.236000000000004
- type: recall_at_10
value: 86.37700000000001
- type: recall_at_100
value: 96.054
- type: recall_at_1000
value: 99.887
- type: recall_at_20
value: 90.398
- type: recall_at_3
value: 76.51299999999999
- type: recall_at_5
value: 81.07900000000001
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (spa-spa)
type: facebook/mlqa
config: spa-spa
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 68.632
- type: map_at_1
value: 57.046
- type: map_at_10
value: 64.869
- type: map_at_100
value: 65.384
- type: map_at_1000
value: 65.413
- type: map_at_20
value: 65.185
- type: map_at_3
value: 63.178
- type: map_at_5
value: 64.12
- type: mrr_at_1
value: 57.05579889544848
- type: mrr_at_10
value: 64.8806425382317
- type: mrr_at_100
value: 65.39469233244084
- type: mrr_at_1000
value: 65.42342199403159
- type: mrr_at_20
value: 65.19634815919534
- type: mrr_at_3
value: 63.18796419729591
- type: mrr_at_5
value: 64.13159398209874
- type: nauc_map_at_1000_diff1
value: 73.23803038674018
- type: nauc_map_at_1000_max
value: 67.44156201421714
- type: nauc_map_at_1000_std
value: -8.60143026450049
- type: nauc_map_at_100_diff1
value: 73.22575613034235
- type: nauc_map_at_100_max
value: 67.44735143420195
- type: nauc_map_at_100_std
value: -8.576905069492895
- type: nauc_map_at_10_diff1
value: 73.11950129610865
- type: nauc_map_at_10_max
value: 67.45107232305055
- type: nauc_map_at_10_std
value: -8.799837857015392
- type: nauc_map_at_1_diff1
value: 76.18354072047988
- type: nauc_map_at_1_max
value: 65.03342186728786
- type: nauc_map_at_1_std
value: -10.867650288695796
- type: nauc_map_at_20_diff1
value: 73.21570748770948
- type: nauc_map_at_20_max
value: 67.50340321088724
- type: nauc_map_at_20_std
value: -8.594057184944676
- type: nauc_map_at_3_diff1
value: 73.17239276163892
- type: nauc_map_at_3_max
value: 67.06319504819103
- type: nauc_map_at_3_std
value: -9.883216310270528
- type: nauc_map_at_5_diff1
value: 73.11913507367727
- type: nauc_map_at_5_max
value: 67.27497019567078
- type: nauc_map_at_5_std
value: -9.497714822103118
- type: nauc_mrr_at_1000_diff1
value: 73.22971233311306
- type: nauc_mrr_at_1000_max
value: 67.42977229057223
- type: nauc_mrr_at_1000_std
value: -8.550068702273297
- type: nauc_mrr_at_100_diff1
value: 73.21744467317815
- type: nauc_mrr_at_100_max
value: 67.43557491068093
- type: nauc_mrr_at_100_std
value: -8.52559275190607
- type: nauc_mrr_at_10_diff1
value: 73.11075619726137
- type: nauc_mrr_at_10_max
value: 67.43889760205286
- type: nauc_mrr_at_10_std
value: -8.74617232559183
- type: nauc_mrr_at_1_diff1
value: 76.17529975949547
- type: nauc_mrr_at_1_max
value: 65.02401127001608
- type: nauc_mrr_at_1_std
value: -10.817814457633952
- type: nauc_mrr_at_20_diff1
value: 73.20689275225138
- type: nauc_mrr_at_20_max
value: 67.49111752272192
- type: nauc_mrr_at_20_std
value: -8.539827528410353
- type: nauc_mrr_at_3_diff1
value: 73.16291729623958
- type: nauc_mrr_at_3_max
value: 67.05300993427998
- type: nauc_mrr_at_3_std
value: -9.827915885680811
- type: nauc_mrr_at_5_diff1
value: 73.11055686484109
- type: nauc_mrr_at_5_max
value: 67.26299851089122
- type: nauc_mrr_at_5_std
value: -9.445190276650903
- type: nauc_ndcg_at_1000_diff1
value: 72.58833638407177
- type: nauc_ndcg_at_1000_max
value: 68.10447506371374
- type: nauc_ndcg_at_1000_std
value: -6.910306241546282
- type: nauc_ndcg_at_100_diff1
value: 72.24524849631476
- type: nauc_ndcg_at_100_max
value: 68.30659210081238
- type: nauc_ndcg_at_100_std
value: -6.04305364268931
- type: nauc_ndcg_at_10_diff1
value: 71.87363502582961
- type: nauc_ndcg_at_10_max
value: 68.5010009653693
- type: nauc_ndcg_at_10_std
value: -7.021281296450588
- type: nauc_ndcg_at_1_diff1
value: 76.17529975949547
- type: nauc_ndcg_at_1_max
value: 65.02401127001608
- type: nauc_ndcg_at_1_std
value: -10.817814457633952
- type: nauc_ndcg_at_20_diff1
value: 72.21241010439327
- type: nauc_ndcg_at_20_max
value: 68.71743274030551
- type: nauc_ndcg_at_20_std
value: -6.186629577195946
- type: nauc_ndcg_at_3_diff1
value: 72.08204674794459
- type: nauc_ndcg_at_3_max
value: 67.5958365046156
- type: nauc_ndcg_at_3_std
value: -9.576418336610345
- type: nauc_ndcg_at_5_diff1
value: 71.93179095844508
- type: nauc_ndcg_at_5_max
value: 68.01914639754217
- type: nauc_ndcg_at_5_std
value: -8.833768332910777
- type: nauc_precision_at_1000_diff1
value: 63.0051360227489
- type: nauc_precision_at_1000_max
value: 79.93532442313229
- type: nauc_precision_at_1000_std
value: 52.869517607133254
- type: nauc_precision_at_100_diff1
value: 62.43301501857154
- type: nauc_precision_at_100_max
value: 75.57280416668183
- type: nauc_precision_at_100_std
value: 26.758300486132747
- type: nauc_precision_at_10_diff1
value: 66.29806375971134
- type: nauc_precision_at_10_max
value: 73.40301413754797
- type: nauc_precision_at_10_std
value: 1.9858547295235462
- type: nauc_precision_at_1_diff1
value: 76.17529975949547
- type: nauc_precision_at_1_max
value: 65.02401127001608
- type: nauc_precision_at_1_std
value: -10.817814457633952
- type: nauc_precision_at_20_diff1
value: 67.05111836051105
- type: nauc_precision_at_20_max
value: 76.09783190824155
- type: nauc_precision_at_20_std
value: 9.906010659515564
- type: nauc_precision_at_3_diff1
value: 68.44186679250453
- type: nauc_precision_at_3_max
value: 69.30301351119388
- type: nauc_precision_at_3_std
value: -8.566522518882348
- type: nauc_precision_at_5_diff1
value: 67.51737199297388
- type: nauc_precision_at_5_max
value: 70.75887601590472
- type: nauc_precision_at_5_std
value: -6.278983102710238
- type: nauc_recall_at_1000_diff1
value: 65.12360093170948
- type: nauc_recall_at_1000_max
value: 82.60209843191132
- type: nauc_recall_at_1000_std
value: 51.740179583368636
- type: nauc_recall_at_100_diff1
value: 62.82007697326819
- type: nauc_recall_at_100_max
value: 76.04844844677562
- type: nauc_recall_at_100_std
value: 26.4678415019248
- type: nauc_recall_at_10_diff1
value: 66.28557566848767
- type: nauc_recall_at_10_max
value: 73.40302709828738
- type: nauc_recall_at_10_std
value: 1.9224272854613582
- type: nauc_recall_at_1_diff1
value: 76.18354072047988
- type: nauc_recall_at_1_max
value: 65.03342186728786
- type: nauc_recall_at_1_std
value: -10.867650288695796
- type: nauc_recall_at_20_diff1
value: 67.03430451094992
- type: nauc_recall_at_20_max
value: 76.09474005171319
- type: nauc_recall_at_20_std
value: 9.815888637851074
- type: nauc_recall_at_3_diff1
value: 68.44411411344718
- type: nauc_recall_at_3_max
value: 69.30502737137265
- type: nauc_recall_at_3_std
value: -8.629526329714132
- type: nauc_recall_at_5_diff1
value: 67.51469265953514
- type: nauc_recall_at_5_max
value: 70.76969893818111
- type: nauc_recall_at_5_std
value: -6.325600167105444
- type: ndcg_at_1
value: 57.056
- type: ndcg_at_10
value: 68.632
- type: ndcg_at_100
value: 71.202
- type: ndcg_at_1000
value: 71.97099999999999
- type: ndcg_at_20
value: 69.785
- type: ndcg_at_3
value: 65.131
- type: ndcg_at_5
value: 66.834
- type: precision_at_1
value: 57.056
- type: precision_at_10
value: 8.044
- type: precision_at_100
value: 0.9259999999999999
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.251
- type: precision_at_3
value: 23.589
- type: precision_at_5
value: 14.984
- type: recall_at_1
value: 57.046
- type: recall_at_10
value: 80.423
- type: recall_at_100
value: 92.582
- type: recall_at_1000
value: 98.638
- type: recall_at_20
value: 84.993
- type: recall_at_3
value: 70.758
- type: recall_at_5
value: 74.9
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (spa-eng)
type: facebook/mlqa
config: spa-eng
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 68.765
- type: map_at_1
value: 56.538999999999994
- type: map_at_10
value: 64.816
- type: map_at_100
value: 65.325
- type: map_at_1000
value: 65.352
- type: map_at_20
value: 65.113
- type: map_at_3
value: 62.934999999999995
- type: map_at_5
value: 64.063
- type: mrr_at_1
value: 56.539120502569965
- type: mrr_at_10
value: 64.81561556661505
- type: mrr_at_100
value: 65.32464238613954
- type: mrr_at_1000
value: 65.35206516602133
- type: mrr_at_20
value: 65.11270445292227
- type: mrr_at_3
value: 62.935465448315384
- type: mrr_at_5
value: 64.06339234723022
- type: nauc_map_at_1000_diff1
value: 73.20701050428072
- type: nauc_map_at_1000_max
value: 67.32797480614404
- type: nauc_map_at_1000_std
value: -6.211540626528362
- type: nauc_map_at_100_diff1
value: 73.19497683923063
- type: nauc_map_at_100_max
value: 67.33392646467817
- type: nauc_map_at_100_std
value: -6.196671563900051
- type: nauc_map_at_10_diff1
value: 73.16010547612956
- type: nauc_map_at_10_max
value: 67.37793741307372
- type: nauc_map_at_10_std
value: -6.3443240322521675
- type: nauc_map_at_1_diff1
value: 76.63696578575964
- type: nauc_map_at_1_max
value: 65.08189618178105
- type: nauc_map_at_1_std
value: -8.594195451782733
- type: nauc_map_at_20_diff1
value: 73.15233479381568
- type: nauc_map_at_20_max
value: 67.3679607256072
- type: nauc_map_at_20_std
value: -6.175928265286352
- type: nauc_map_at_3_diff1
value: 73.14853380980746
- type: nauc_map_at_3_max
value: 67.10354198073468
- type: nauc_map_at_3_std
value: -7.409679815529866
- type: nauc_map_at_5_diff1
value: 73.13425961877715
- type: nauc_map_at_5_max
value: 67.22452899371224
- type: nauc_map_at_5_std
value: -6.895257774506354
- type: nauc_mrr_at_1000_diff1
value: 73.20701050428072
- type: nauc_mrr_at_1000_max
value: 67.32797480614404
- type: nauc_mrr_at_1000_std
value: -6.211540626528362
- type: nauc_mrr_at_100_diff1
value: 73.19497683923063
- type: nauc_mrr_at_100_max
value: 67.33392646467817
- type: nauc_mrr_at_100_std
value: -6.196671563900051
- type: nauc_mrr_at_10_diff1
value: 73.16010547612956
- type: nauc_mrr_at_10_max
value: 67.37793741307372
- type: nauc_mrr_at_10_std
value: -6.3443240322521675
- type: nauc_mrr_at_1_diff1
value: 76.63696578575964
- type: nauc_mrr_at_1_max
value: 65.08189618178105
- type: nauc_mrr_at_1_std
value: -8.594195451782733
- type: nauc_mrr_at_20_diff1
value: 73.15233479381568
- type: nauc_mrr_at_20_max
value: 67.3679607256072
- type: nauc_mrr_at_20_std
value: -6.175928265286352
- type: nauc_mrr_at_3_diff1
value: 73.14853380980746
- type: nauc_mrr_at_3_max
value: 67.10354198073468
- type: nauc_mrr_at_3_std
value: -7.409679815529866
- type: nauc_mrr_at_5_diff1
value: 73.13425961877715
- type: nauc_mrr_at_5_max
value: 67.22452899371224
- type: nauc_mrr_at_5_std
value: -6.895257774506354
- type: nauc_ndcg_at_1000_diff1
value: 72.44364625096874
- type: nauc_ndcg_at_1000_max
value: 67.93635761141552
- type: nauc_ndcg_at_1000_std
value: -4.616429464350954
- type: nauc_ndcg_at_100_diff1
value: 72.11352383758482
- type: nauc_ndcg_at_100_max
value: 68.1627312575955
- type: nauc_ndcg_at_100_std
value: -3.894213672131282
- type: nauc_ndcg_at_10_diff1
value: 71.8526850770812
- type: nauc_ndcg_at_10_max
value: 68.41366561888562
- type: nauc_ndcg_at_10_std
value: -4.472146861145989
- type: nauc_ndcg_at_1_diff1
value: 76.63696578575964
- type: nauc_ndcg_at_1_max
value: 65.08189618178105
- type: nauc_ndcg_at_1_std
value: -8.594195451782733
- type: nauc_ndcg_at_20_diff1
value: 71.76464418138866
- type: nauc_ndcg_at_20_max
value: 68.41174963313698
- type: nauc_ndcg_at_20_std
value: -3.7449762037540157
- type: nauc_ndcg_at_3_diff1
value: 71.93808990683131
- type: nauc_ndcg_at_3_max
value: 67.7010029507334
- type: nauc_ndcg_at_3_std
value: -6.971858419379321
- type: nauc_ndcg_at_5_diff1
value: 71.8505224811326
- type: nauc_ndcg_at_5_max
value: 67.97139549500251
- type: nauc_ndcg_at_5_std
value: -5.958491308070017
- type: nauc_precision_at_1000_diff1
value: 62.20956180320043
- type: nauc_precision_at_1000_max
value: 82.53412670611299
- type: nauc_precision_at_1000_std
value: 55.57278124999575
- type: nauc_precision_at_100_diff1
value: 62.03792857023201
- type: nauc_precision_at_100_max
value: 76.77130713424538
- type: nauc_precision_at_100_std
value: 26.674102719959564
- type: nauc_precision_at_10_diff1
value: 65.89798055049931
- type: nauc_precision_at_10_max
value: 73.41908620140674
- type: nauc_precision_at_10_std
value: 5.21818573283179
- type: nauc_precision_at_1_diff1
value: 76.63696578575964
- type: nauc_precision_at_1_max
value: 65.08189618178105
- type: nauc_precision_at_1_std
value: -8.594195451782733
- type: nauc_precision_at_20_diff1
value: 63.734308542647355
- type: nauc_precision_at_20_max
value: 74.69578825096144
- type: nauc_precision_at_20_std
value: 12.627842502659162
- type: nauc_precision_at_3_diff1
value: 67.91189666671904
- type: nauc_precision_at_3_max
value: 69.64986036783209
- type: nauc_precision_at_3_std
value: -5.505669087429055
- type: nauc_precision_at_5_diff1
value: 67.01880006360248
- type: nauc_precision_at_5_max
value: 70.78916423358686
- type: nauc_precision_at_5_std
value: -2.2273742736401045
- type: nauc_recall_at_1000_diff1
value: 62.20956180319936
- type: nauc_recall_at_1000_max
value: 82.53412670611287
- type: nauc_recall_at_1000_std
value: 55.57278124999549
- type: nauc_recall_at_100_diff1
value: 62.03792857023208
- type: nauc_recall_at_100_max
value: 76.77130713424577
- type: nauc_recall_at_100_std
value: 26.67410271995973
- type: nauc_recall_at_10_diff1
value: 65.8979805504994
- type: nauc_recall_at_10_max
value: 73.41908620140678
- type: nauc_recall_at_10_std
value: 5.2181857328318655
- type: nauc_recall_at_1_diff1
value: 76.63696578575964
- type: nauc_recall_at_1_max
value: 65.08189618178105
- type: nauc_recall_at_1_std
value: -8.594195451782733
- type: nauc_recall_at_20_diff1
value: 63.734308542647334
- type: nauc_recall_at_20_max
value: 74.69578825096123
- type: nauc_recall_at_20_std
value: 12.627842502658982
- type: nauc_recall_at_3_diff1
value: 67.91189666671897
- type: nauc_recall_at_3_max
value: 69.64986036783203
- type: nauc_recall_at_3_std
value: -5.505669087428989
- type: nauc_recall_at_5_diff1
value: 67.01880006360243
- type: nauc_recall_at_5_max
value: 70.78916423358686
- type: nauc_recall_at_5_std
value: -2.227374273640135
- type: ndcg_at_1
value: 56.538999999999994
- type: ndcg_at_10
value: 68.765
- type: ndcg_at_100
value: 71.314
- type: ndcg_at_1000
value: 72.038
- type: ndcg_at_20
value: 69.828
- type: ndcg_at_3
value: 64.937
- type: ndcg_at_5
value: 66.956
- type: precision_at_1
value: 56.538999999999994
- type: precision_at_10
value: 8.113
- type: precision_at_100
value: 0.932
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.265
- type: precision_at_3
value: 23.567
- type: precision_at_5
value: 15.115
- type: recall_at_1
value: 56.538999999999994
- type: recall_at_10
value: 81.135
- type: recall_at_100
value: 93.223
- type: recall_at_1000
value: 98.896
- type: recall_at_20
value: 85.304
- type: recall_at_3
value: 70.702
- type: recall_at_5
value: 75.576
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (eng-deu)
type: facebook/mlqa
config: eng-deu
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 69.298
- type: map_at_1
value: 58.553
- type: map_at_10
value: 65.769
- type: map_at_100
value: 66.298
- type: map_at_1000
value: 66.328
- type: map_at_20
value: 66.101
- type: map_at_3
value: 64.048
- type: map_at_5
value: 65.09
- type: mrr_at_1
value: 58.564148016840235
- type: mrr_at_10
value: 65.7685997066675
- type: mrr_at_100
value: 66.29874034432214
- type: mrr_at_1000
value: 66.32844979939088
- type: mrr_at_20
value: 66.10120513957821
- type: mrr_at_3
value: 64.04830489696437
- type: mrr_at_5
value: 65.08974074894746
- type: nauc_map_at_1000_diff1
value: 76.8409650183994
- type: nauc_map_at_1000_max
value: 71.86367015521367
- type: nauc_map_at_1000_std
value: -14.464881539957256
- type: nauc_map_at_100_diff1
value: 76.82536521842064
- type: nauc_map_at_100_max
value: 71.86811127965429
- type: nauc_map_at_100_std
value: -14.441105539722244
- type: nauc_map_at_10_diff1
value: 76.75522453447859
- type: nauc_map_at_10_max
value: 71.87677500176706
- type: nauc_map_at_10_std
value: -14.741331625103559
- type: nauc_map_at_1_diff1
value: 79.64060747740989
- type: nauc_map_at_1_max
value: 69.84278563569617
- type: nauc_map_at_1_std
value: -15.936904929655832
- type: nauc_map_at_20_diff1
value: 76.78894776059715
- type: nauc_map_at_20_max
value: 71.89637938044827
- type: nauc_map_at_20_std
value: -14.500564106990769
- type: nauc_map_at_3_diff1
value: 77.20562577450342
- type: nauc_map_at_3_max
value: 71.80578229361525
- type: nauc_map_at_3_std
value: -15.344134588512201
- type: nauc_map_at_5_diff1
value: 77.00480147367867
- type: nauc_map_at_5_max
value: 71.98335924076163
- type: nauc_map_at_5_std
value: -15.16537653041026
- type: nauc_mrr_at_1000_diff1
value: 76.84165367691193
- type: nauc_mrr_at_1000_max
value: 71.8642679499795
- type: nauc_mrr_at_1000_std
value: -14.461717954593158
- type: nauc_mrr_at_100_diff1
value: 76.8263363557998
- type: nauc_mrr_at_100_max
value: 71.86874522368626
- type: nauc_mrr_at_100_std
value: -14.437105168707426
- type: nauc_mrr_at_10_diff1
value: 76.75522453447859
- type: nauc_mrr_at_10_max
value: 71.87677500176706
- type: nauc_mrr_at_10_std
value: -14.741331625103559
- type: nauc_mrr_at_1_diff1
value: 79.65642669321981
- type: nauc_mrr_at_1_max
value: 69.89135358784799
- type: nauc_mrr_at_1_std
value: -15.919357002229589
- type: nauc_mrr_at_20_diff1
value: 76.78883171270601
- type: nauc_mrr_at_20_max
value: 71.89806887245291
- type: nauc_mrr_at_20_std
value: -14.497139746907905
- type: nauc_mrr_at_3_diff1
value: 77.20562577450342
- type: nauc_mrr_at_3_max
value: 71.80578229361525
- type: nauc_mrr_at_3_std
value: -15.344134588512201
- type: nauc_mrr_at_5_diff1
value: 77.00480147367867
- type: nauc_mrr_at_5_max
value: 71.98335924076163
- type: nauc_mrr_at_5_std
value: -15.16537653041026
- type: nauc_ndcg_at_1000_diff1
value: 76.07802417817047
- type: nauc_ndcg_at_1000_max
value: 72.31792804426776
- type: nauc_ndcg_at_1000_std
value: -13.049160715132244
- type: nauc_ndcg_at_100_diff1
value: 75.63343849116544
- type: nauc_ndcg_at_100_max
value: 72.48362076101817
- type: nauc_ndcg_at_100_std
value: -12.089600993516777
- type: nauc_ndcg_at_10_diff1
value: 75.23387929929208
- type: nauc_ndcg_at_10_max
value: 72.51436288271807
- type: nauc_ndcg_at_10_std
value: -13.624132103038104
- type: nauc_ndcg_at_1_diff1
value: 79.65642669321981
- type: nauc_ndcg_at_1_max
value: 69.89135358784799
- type: nauc_ndcg_at_1_std
value: -15.919357002229589
- type: nauc_ndcg_at_20_diff1
value: 75.32926047656296
- type: nauc_ndcg_at_20_max
value: 72.61254165918145
- type: nauc_ndcg_at_20_std
value: -12.683157599238701
- type: nauc_ndcg_at_3_diff1
value: 76.3089337665469
- type: nauc_ndcg_at_3_max
value: 72.40014674426054
- type: nauc_ndcg_at_3_std
value: -15.08624226353458
- type: nauc_ndcg_at_5_diff1
value: 75.88857331641834
- type: nauc_ndcg_at_5_max
value: 72.7719386827224
- type: nauc_ndcg_at_5_std
value: -14.70546521089236
- type: nauc_precision_at_1000_diff1
value: 59.66563879069911
- type: nauc_precision_at_1000_max
value: 74.57123562956772
- type: nauc_precision_at_1000_std
value: 58.61396866718965
- type: nauc_precision_at_100_diff1
value: 62.8695896550042
- type: nauc_precision_at_100_max
value: 77.81408796785
- type: nauc_precision_at_100_std
value: 23.819735672317826
- type: nauc_precision_at_10_diff1
value: 68.08051625224569
- type: nauc_precision_at_10_max
value: 75.14432336036869
- type: nauc_precision_at_10_std
value: -7.97602345252735
- type: nauc_precision_at_1_diff1
value: 79.65642669321981
- type: nauc_precision_at_1_max
value: 69.89135358784799
- type: nauc_precision_at_1_std
value: -15.919357002229589
- type: nauc_precision_at_20_diff1
value: 66.7168005185165
- type: nauc_precision_at_20_max
value: 76.58522761697147
- type: nauc_precision_at_20_std
value: -0.17923428317323292
- type: nauc_precision_at_3_diff1
value: 73.23394851561207
- type: nauc_precision_at_3_max
value: 74.32517846819215
- type: nauc_precision_at_3_std
value: -14.142301336188348
- type: nauc_precision_at_5_diff1
value: 71.5666882547012
- type: nauc_precision_at_5_max
value: 75.71098205440033
- type: nauc_precision_at_5_std
value: -12.808362513638052
- type: nauc_recall_at_1000_diff1
value: 71.73736112325805
- type: nauc_recall_at_1000_max
value: 86.70743436225898
- type: nauc_recall_at_1000_std
value: 54.45802578371167
- type: nauc_recall_at_100_diff1
value: 64.07053861428128
- type: nauc_recall_at_100_max
value: 78.8348308099261
- type: nauc_recall_at_100_std
value: 22.72263677785103
- type: nauc_recall_at_10_diff1
value: 68.20272901407903
- type: nauc_recall_at_10_max
value: 75.16315335381938
- type: nauc_recall_at_10_std
value: -8.060716748913386
- type: nauc_recall_at_1_diff1
value: 79.64060747740989
- type: nauc_recall_at_1_max
value: 69.84278563569617
- type: nauc_recall_at_1_std
value: -15.936904929655832
- type: nauc_recall_at_20_diff1
value: 66.88206981973654
- type: nauc_recall_at_20_max
value: 76.54824917595687
- type: nauc_recall_at_20_std
value: -0.40294589316962287
- type: nauc_recall_at_3_diff1
value: 73.33076087258938
- type: nauc_recall_at_3_max
value: 74.33763112508771
- type: nauc_recall_at_3_std
value: -14.213355414905399
- type: nauc_recall_at_5_diff1
value: 71.67487623469464
- type: nauc_recall_at_5_max
value: 75.72770292516316
- type: nauc_recall_at_5_std
value: -12.887572274644818
- type: ndcg_at_1
value: 58.56400000000001
- type: ndcg_at_10
value: 69.298
- type: ndcg_at_100
value: 71.95899999999999
- type: ndcg_at_1000
value: 72.735
- type: ndcg_at_20
value: 70.50699999999999
- type: ndcg_at_3
value: 65.81700000000001
- type: ndcg_at_5
value: 67.681
- type: precision_at_1
value: 58.56400000000001
- type: precision_at_10
value: 8.039
- type: precision_at_100
value: 0.931
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.259
- type: precision_at_3
value: 23.65
- type: precision_at_5
value: 15.09
- type: recall_at_1
value: 58.553
- type: recall_at_10
value: 80.368
- type: recall_at_100
value: 93.013
- type: recall_at_1000
value: 99.092
- type: recall_at_20
value: 85.143
- type: recall_at_3
value: 70.928
- type: recall_at_5
value: 75.42699999999999
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (eng-spa)
type: facebook/mlqa
config: eng-spa
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 66.374
- type: map_at_1
value: 55.494
- type: map_at_10
value: 62.763999999999996
- type: map_at_100
value: 63.33
- type: map_at_1000
value: 63.36000000000001
- type: map_at_20
value: 63.104000000000006
- type: map_at_3
value: 61.065000000000005
- type: map_at_5
value: 62.053000000000004
- type: mrr_at_1
value: 55.49419158255571
- type: mrr_at_10
value: 62.765195140457095
- type: mrr_at_100
value: 63.33083349354529
- type: mrr_at_1000
value: 63.3611897014839
- type: mrr_at_20
value: 63.10543590095977
- type: mrr_at_3
value: 61.06455913159412
- type: mrr_at_5
value: 62.052942296705474
- type: nauc_map_at_1000_diff1
value: 75.04200018088618
- type: nauc_map_at_1000_max
value: 70.49937782771909
- type: nauc_map_at_1000_std
value: -5.257206317083184
- type: nauc_map_at_100_diff1
value: 75.02786834256312
- type: nauc_map_at_100_max
value: 70.5016476500189
- type: nauc_map_at_100_std
value: -5.228770832077681
- type: nauc_map_at_10_diff1
value: 74.9626552701647
- type: nauc_map_at_10_max
value: 70.56253732243214
- type: nauc_map_at_10_std
value: -5.359037281768563
- type: nauc_map_at_1_diff1
value: 78.46858307815857
- type: nauc_map_at_1_max
value: 69.03908373759435
- type: nauc_map_at_1_std
value: -7.479412070736642
- type: nauc_map_at_20_diff1
value: 74.98121458084796
- type: nauc_map_at_20_max
value: 70.51885366822565
- type: nauc_map_at_20_std
value: -5.286051287133815
- type: nauc_map_at_3_diff1
value: 75.36078454383373
- type: nauc_map_at_3_max
value: 70.34997144546014
- type: nauc_map_at_3_std
value: -6.663517224039184
- type: nauc_map_at_5_diff1
value: 75.0274512828238
- type: nauc_map_at_5_max
value: 70.45292551591874
- type: nauc_map_at_5_std
value: -6.029224488640147
- type: nauc_mrr_at_1000_diff1
value: 75.04018768469983
- type: nauc_mrr_at_1000_max
value: 70.49855509132635
- type: nauc_mrr_at_1000_std
value: -5.258929961409948
- type: nauc_mrr_at_100_diff1
value: 75.02605732810112
- type: nauc_mrr_at_100_max
value: 70.50082584929103
- type: nauc_mrr_at_100_std
value: -5.2304917988542154
- type: nauc_mrr_at_10_diff1
value: 74.96079080525713
- type: nauc_mrr_at_10_max
value: 70.56167294920391
- type: nauc_mrr_at_10_std
value: -5.360650630655072
- type: nauc_mrr_at_1_diff1
value: 78.46858307815857
- type: nauc_mrr_at_1_max
value: 69.03908373759435
- type: nauc_mrr_at_1_std
value: -7.479412070736642
- type: nauc_mrr_at_20_diff1
value: 74.97939804960517
- type: nauc_mrr_at_20_max
value: 70.51804078965411
- type: nauc_mrr_at_20_std
value: -5.287681954889177
- type: nauc_mrr_at_3_diff1
value: 75.36078454383373
- type: nauc_mrr_at_3_max
value: 70.34997144546014
- type: nauc_mrr_at_3_std
value: -6.663517224039184
- type: nauc_mrr_at_5_diff1
value: 75.0274512828238
- type: nauc_mrr_at_5_max
value: 70.45292551591874
- type: nauc_mrr_at_5_std
value: -6.029224488640147
- type: nauc_ndcg_at_1000_diff1
value: 74.22106834748942
- type: nauc_ndcg_at_1000_max
value: 70.93625922934912
- type: nauc_ndcg_at_1000_std
value: -3.4878399005946017
- type: nauc_ndcg_at_100_diff1
value: 73.74068883646733
- type: nauc_ndcg_at_100_max
value: 71.02357018347472
- type: nauc_ndcg_at_100_std
value: -2.462293184201324
- type: nauc_ndcg_at_10_diff1
value: 73.40967965536565
- type: nauc_ndcg_at_10_max
value: 71.29379828672067
- type: nauc_ndcg_at_10_std
value: -3.295547756383108
- type: nauc_ndcg_at_1_diff1
value: 78.46858307815857
- type: nauc_ndcg_at_1_max
value: 69.03908373759435
- type: nauc_ndcg_at_1_std
value: -7.479412070736642
- type: nauc_ndcg_at_20_diff1
value: 73.45790057693699
- type: nauc_ndcg_at_20_max
value: 71.16598432419126
- type: nauc_ndcg_at_20_std
value: -2.962877157646097
- type: nauc_ndcg_at_3_diff1
value: 74.30696173964847
- type: nauc_ndcg_at_3_max
value: 70.79878978459556
- type: nauc_ndcg_at_3_std
value: -6.297286578628299
- type: nauc_ndcg_at_5_diff1
value: 73.65858211199816
- type: nauc_ndcg_at_5_max
value: 71.01122417463776
- type: nauc_ndcg_at_5_std
value: -5.075990882646765
- type: nauc_precision_at_1000_diff1
value: 68.71065091972568
- type: nauc_precision_at_1000_max
value: 81.38173585624777
- type: nauc_precision_at_1000_std
value: 58.035497889797895
- type: nauc_precision_at_100_diff1
value: 61.93634256957017
- type: nauc_precision_at_100_max
value: 74.84191770203093
- type: nauc_precision_at_100_std
value: 31.3325983123831
- type: nauc_precision_at_10_diff1
value: 66.68247010944937
- type: nauc_precision_at_10_max
value: 74.48773524654571
- type: nauc_precision_at_10_std
value: 6.560421880785153
- type: nauc_precision_at_1_diff1
value: 78.46858307815857
- type: nauc_precision_at_1_max
value: 69.03908373759435
- type: nauc_precision_at_1_std
value: -7.479412070736642
- type: nauc_precision_at_20_diff1
value: 65.51592872758067
- type: nauc_precision_at_20_max
value: 74.50684066823096
- type: nauc_precision_at_20_std
value: 10.830479877698208
- type: nauc_precision_at_3_diff1
value: 70.89587884861588
- type: nauc_precision_at_3_max
value: 72.25310558370424
- type: nauc_precision_at_3_std
value: -5.0796100900749765
- type: nauc_precision_at_5_diff1
value: 68.71885719845497
- type: nauc_precision_at_5_max
value: 73.02601751485672
- type: nauc_precision_at_5_std
value: -1.4382681421626857
- type: nauc_recall_at_1000_diff1
value: 71.95510299834734
- type: nauc_recall_at_1000_max
value: 84.03647166092985
- type: nauc_recall_at_1000_std
value: 56.87490604776847
- type: nauc_recall_at_100_diff1
value: 62.446624924715955
- type: nauc_recall_at_100_max
value: 75.25666892464507
- type: nauc_recall_at_100_std
value: 31.068789794554686
- type: nauc_recall_at_10_diff1
value: 66.70676336328988
- type: nauc_recall_at_10_max
value: 74.4963699656397
- type: nauc_recall_at_10_std
value: 6.57498399706916
- type: nauc_recall_at_1_diff1
value: 78.46858307815857
- type: nauc_recall_at_1_max
value: 69.03908373759435
- type: nauc_recall_at_1_std
value: -7.479412070736642
- type: nauc_recall_at_20_diff1
value: 65.54082767974772
- type: nauc_recall_at_20_max
value: 74.5111529838772
- type: nauc_recall_at_20_std
value: 10.84574829707354
- type: nauc_recall_at_3_diff1
value: 70.89587884861584
- type: nauc_recall_at_3_max
value: 72.25310558370421
- type: nauc_recall_at_3_std
value: -5.07961009007491
- type: nauc_recall_at_5_diff1
value: 68.71885719845501
- type: nauc_recall_at_5_max
value: 73.02601751485666
- type: nauc_recall_at_5_std
value: -1.4382681421626995
- type: ndcg_at_1
value: 55.494
- type: ndcg_at_10
value: 66.374
- type: ndcg_at_100
value: 69.254
- type: ndcg_at_1000
value: 70.136
- type: ndcg_at_20
value: 67.599
- type: ndcg_at_3
value: 62.863
- type: ndcg_at_5
value: 64.644
- type: precision_at_1
value: 55.494
- type: precision_at_10
value: 7.776
- type: precision_at_100
value: 0.9159999999999999
- type: precision_at_1000
value: 0.099
- type: precision_at_20
value: 4.1290000000000004
- type: precision_at_3
value: 22.688
- type: precision_at_5
value: 14.477
- type: recall_at_1
value: 55.494
- type: recall_at_10
value: 77.747
- type: recall_at_100
value: 91.535
- type: recall_at_1000
value: 98.619
- type: recall_at_20
value: 82.565
- type: recall_at_3
value: 68.063
- type: recall_at_5
value: 72.386
- task:
type: Retrieval
dataset:
name: MTEB MLQARetrieval (eng-eng)
type: facebook/mlqa
config: eng-eng
split: test
revision: 397ed406c1a7902140303e7faf60fff35b58d285
metrics:
- type: main_score
value: 64.723
- type: map_at_1
value: 54.308
- type: map_at_10
value: 61.26200000000001
- type: map_at_100
value: 61.82299999999999
- type: map_at_1000
value: 61.856
- type: map_at_20
value: 61.575
- type: map_at_3
value: 59.565
- type: map_at_5
value: 60.561
- type: mrr_at_1
value: 54.31704368848212
- type: mrr_at_10
value: 61.26520216098834
- type: mrr_at_100
value: 61.82588321127103
- type: mrr_at_1000
value: 61.859333030574334
- type: mrr_at_20
value: 61.57780339921337
- type: mrr_at_3
value: 59.569446842801646
- type: mrr_at_5
value: 60.56323029989004
- type: nauc_map_at_1000_diff1
value: 74.21413722468635
- type: nauc_map_at_1000_max
value: 70.41741227882316
- type: nauc_map_at_1000_std
value: -2.5438707209848506
- type: nauc_map_at_100_diff1
value: 74.19812315947975
- type: nauc_map_at_100_max
value: 70.41589146728445
- type: nauc_map_at_100_std
value: -2.5336117059429553
- type: nauc_map_at_10_diff1
value: 74.21810561152937
- type: nauc_map_at_10_max
value: 70.48816115200171
- type: nauc_map_at_10_std
value: -2.7443834681406734
- type: nauc_map_at_1_diff1
value: 77.69378738778958
- type: nauc_map_at_1_max
value: 68.64652310701173
- type: nauc_map_at_1_std
value: -4.667071946448379
- type: nauc_map_at_20_diff1
value: 74.16105697562438
- type: nauc_map_at_20_max
value: 70.42491994631179
- type: nauc_map_at_20_std
value: -2.6070416022440472
- type: nauc_map_at_3_diff1
value: 74.60449392878863
- type: nauc_map_at_3_max
value: 70.39888609914269
- type: nauc_map_at_3_std
value: -3.5401151125723986
- type: nauc_map_at_5_diff1
value: 74.2423420992663
- type: nauc_map_at_5_max
value: 70.36574501826757
- type: nauc_map_at_5_std
value: -3.2707393116898964
- type: nauc_mrr_at_1000_diff1
value: 74.21029843731323
- type: nauc_mrr_at_1000_max
value: 70.43020492688913
- type: nauc_mrr_at_1000_std
value: -2.526895582202081
- type: nauc_mrr_at_100_diff1
value: 74.19440960479243
- type: nauc_mrr_at_100_max
value: 70.4288998824232
- type: nauc_mrr_at_100_std
value: -2.5160929945118107
- type: nauc_mrr_at_10_diff1
value: 74.2141357266166
- type: nauc_mrr_at_10_max
value: 70.5005683347807
- type: nauc_mrr_at_10_std
value: -2.727154557882168
- type: nauc_mrr_at_1_diff1
value: 77.69891248239793
- type: nauc_mrr_at_1_max
value: 68.68255231164922
- type: nauc_mrr_at_1_std
value: -4.630226727154317
- type: nauc_mrr_at_20_diff1
value: 74.15705434409723
- type: nauc_mrr_at_20_max
value: 70.43741835972747
- type: nauc_mrr_at_20_std
value: -2.5896756472464495
- type: nauc_mrr_at_3_diff1
value: 74.5981844349412
- type: nauc_mrr_at_3_max
value: 70.41834937080564
- type: nauc_mrr_at_3_std
value: -3.5161656408031163
- type: nauc_mrr_at_5_diff1
value: 74.23847535424844
- type: nauc_mrr_at_5_max
value: 70.37763810013656
- type: nauc_mrr_at_5_std
value: -3.2560955164581733
- type: nauc_ndcg_at_1000_diff1
value: 73.20994496725493
- type: nauc_ndcg_at_1000_max
value: 70.8903016277125
- type: nauc_ndcg_at_1000_std
value: -0.625772298462309
- type: nauc_ndcg_at_100_diff1
value: 72.6847141682645
- type: nauc_ndcg_at_100_max
value: 70.86564422034162
- type: nauc_ndcg_at_100_std
value: -0.07195786766326141
- type: nauc_ndcg_at_10_diff1
value: 72.78806493754281
- type: nauc_ndcg_at_10_max
value: 71.21957067926769
- type: nauc_ndcg_at_10_std
value: -1.2760418313382227
- type: nauc_ndcg_at_1_diff1
value: 77.69891248239793
- type: nauc_ndcg_at_1_max
value: 68.68255231164922
- type: nauc_ndcg_at_1_std
value: -4.630226727154317
- type: nauc_ndcg_at_20_diff1
value: 72.52082440882546
- type: nauc_ndcg_at_20_max
value: 70.98185004796734
- type: nauc_ndcg_at_20_std
value: -0.6908280874815464
- type: nauc_ndcg_at_3_diff1
value: 73.59870660843939
- type: nauc_ndcg_at_3_max
value: 70.94391957288654
- type: nauc_ndcg_at_3_std
value: -3.147723179140428
- type: nauc_ndcg_at_5_diff1
value: 72.90122868193457
- type: nauc_ndcg_at_5_max
value: 70.89376368965165
- type: nauc_ndcg_at_5_std
value: -2.6451807385626744
- type: nauc_precision_at_1000_diff1
value: 58.14737201864067
- type: nauc_precision_at_1000_max
value: 78.79011251144826
- type: nauc_precision_at_1000_std
value: 59.98985420476577
- type: nauc_precision_at_100_diff1
value: 59.21069121644552
- type: nauc_precision_at_100_max
value: 73.00557835912306
- type: nauc_precision_at_100_std
value: 26.85027406282173
- type: nauc_precision_at_10_diff1
value: 66.8760831023675
- type: nauc_precision_at_10_max
value: 74.21167950452596
- type: nauc_precision_at_10_std
value: 5.453652499335947
- type: nauc_precision_at_1_diff1
value: 77.69891248239793
- type: nauc_precision_at_1_max
value: 68.68255231164922
- type: nauc_precision_at_1_std
value: -4.630226727154317
- type: nauc_precision_at_20_diff1
value: 64.3118559132602
- type: nauc_precision_at_20_max
value: 73.33078184673825
- type: nauc_precision_at_20_std
value: 9.993299523049402
- type: nauc_precision_at_3_diff1
value: 70.38667185155593
- type: nauc_precision_at_3_max
value: 72.66495006030951
- type: nauc_precision_at_3_std
value: -1.8532839591326276
- type: nauc_precision_at_5_diff1
value: 68.12161337583686
- type: nauc_precision_at_5_max
value: 72.65644960375046
- type: nauc_precision_at_5_std
value: -0.33317164167012875
- type: nauc_recall_at_1000_diff1
value: 61.63204394739985
- type: nauc_recall_at_1000_max
value: 81.77241537319897
- type: nauc_recall_at_1000_std
value: 58.44841544062308
- type: nauc_recall_at_100_diff1
value: 59.72072697224705
- type: nauc_recall_at_100_max
value: 73.28519507061553
- type: nauc_recall_at_100_std
value: 26.27318390763456
- type: nauc_recall_at_10_diff1
value: 66.9757135465418
- type: nauc_recall_at_10_max
value: 74.21919493374149
- type: nauc_recall_at_10_std
value: 5.323369605377166
- type: nauc_recall_at_1_diff1
value: 77.69378738778958
- type: nauc_recall_at_1_max
value: 68.64652310701173
- type: nauc_recall_at_1_std
value: -4.667071946448379
- type: nauc_recall_at_20_diff1
value: 64.42290081731899
- type: nauc_recall_at_20_max
value: 73.3358289439033
- type: nauc_recall_at_20_std
value: 9.846598361586073
- type: nauc_recall_at_3_diff1
value: 70.41211290964785
- type: nauc_recall_at_3_max
value: 72.64451776775402
- type: nauc_recall_at_3_std
value: -1.916280959835826
- type: nauc_recall_at_5_diff1
value: 68.20695272727916
- type: nauc_recall_at_5_max
value: 72.66404224006101
- type: nauc_recall_at_5_std
value: -0.431125323007886
- type: ndcg_at_1
value: 54.31700000000001
- type: ndcg_at_10
value: 64.723
- type: ndcg_at_100
value: 67.648
- type: ndcg_at_1000
value: 68.619
- type: ndcg_at_20
value: 65.85499999999999
- type: ndcg_at_3
value: 61.244
- type: ndcg_at_5
value: 63.038000000000004
- type: precision_at_1
value: 54.31700000000001
- type: precision_at_10
value: 7.564
- type: precision_at_100
value: 0.898
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.005
- type: precision_at_3
value: 22.034000000000002
- type: precision_at_5
value: 14.093
- type: recall_at_1
value: 54.308
- type: recall_at_10
value: 75.622
- type: recall_at_100
value: 89.744
- type: recall_at_1000
value: 97.539
- type: recall_at_20
value: 80.085
- type: recall_at_3
value: 66.09
- type: recall_at_5
value: 70.446
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P (de)
type: reciTAL/mlsum
config: de
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: main_score
value: 41.267647761702854
- type: v_measure
value: 41.267647761702854
- type: v_measure_std
value: 10.93390895077248
- type: main_score
value: 40.07927325071353
- type: v_measure
value: 40.07927325071353
- type: v_measure_std
value: 9.296680835266145
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P (fr)
type: reciTAL/mlsum
config: fr
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: main_score
value: 44.68714862333979
- type: v_measure
value: 44.68714862333979
- type: v_measure_std
value: 1.811036989797814
- type: main_score
value: 44.88484854069901
- type: v_measure
value: 44.88484854069901
- type: v_measure_std
value: 2.3704247819781843
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P (ru)
type: reciTAL/mlsum
config: ru
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: main_score
value: 41.92518785753813
- type: v_measure
value: 41.92518785753813
- type: v_measure_std
value: 5.9356661900220775
- type: main_score
value: 43.97657450929179
- type: v_measure
value: 43.97657450929179
- type: v_measure_std
value: 6.087547931333613
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P (es)
type: reciTAL/mlsum
config: es
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: main_score
value: 48.69875719812033
- type: v_measure
value: 48.69875719812033
- type: v_measure_std
value: 1.204253881950113
- type: main_score
value: 48.41108671948728
- type: v_measure
value: 48.41108671948728
- type: v_measure_std
value: 1.3848320630151243
- task:
type: Reranking
dataset:
name: MTEB MMarcoReranking (default)
type: C-MTEB/Mmarco-reranking
config: default
split: dev
revision: 8e0c766dbe9e16e1d221116a3f36795fbade07f6
metrics:
- type: map
value: 21.050447576170395
- type: mrr
value: 20.201984126984126
- type: main_score
value: 21.050447576170395
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval (default)
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: main_score
value: 79.687
- type: map_at_1
value: 66.872
- type: map_at_10
value: 75.949
- type: map_at_100
value: 76.25
- type: map_at_1000
value: 76.259
- type: map_at_20
value: 76.145
- type: map_at_3
value: 74.01299999999999
- type: map_at_5
value: 75.232
- type: mrr_at_1
value: 69.18338108882521
- type: mrr_at_10
value: 76.5424227952881
- type: mrr_at_100
value: 76.8019342792628
- type: mrr_at_1000
value: 76.81002278342808
- type: mrr_at_20
value: 76.7115234815896
- type: mrr_at_3
value: 74.83046800382044
- type: mrr_at_5
value: 75.88490926456515
- type: nauc_map_at_1000_diff1
value: 78.06933310424179
- type: nauc_map_at_1000_max
value: 49.392948209665896
- type: nauc_map_at_1000_std
value: -15.126109322591166
- type: nauc_map_at_100_diff1
value: 78.06612779298378
- type: nauc_map_at_100_max
value: 49.40761618630397
- type: nauc_map_at_100_std
value: -15.099282408159349
- type: nauc_map_at_10_diff1
value: 77.94565685470538
- type: nauc_map_at_10_max
value: 49.50559610363201
- type: nauc_map_at_10_std
value: -15.182130695916355
- type: nauc_map_at_1_diff1
value: 79.84814509858211
- type: nauc_map_at_1_max
value: 40.78978466656547
- type: nauc_map_at_1_std
value: -19.96189264026715
- type: nauc_map_at_20_diff1
value: 78.03597839981245
- type: nauc_map_at_20_max
value: 49.49477427223376
- type: nauc_map_at_20_std
value: -15.084990000838378
- type: nauc_map_at_3_diff1
value: 78.0637014655507
- type: nauc_map_at_3_max
value: 48.63214001973341
- type: nauc_map_at_3_std
value: -17.093950563306596
- type: nauc_map_at_5_diff1
value: 77.94068229240348
- type: nauc_map_at_5_max
value: 49.38930719689204
- type: nauc_map_at_5_std
value: -15.9919454201954
- type: nauc_mrr_at_1000_diff1
value: 78.34582398092816
- type: nauc_mrr_at_1000_max
value: 49.623566992784156
- type: nauc_mrr_at_1000_std
value: -14.381347765493265
- type: nauc_mrr_at_100_diff1
value: 78.3429966714221
- type: nauc_mrr_at_100_max
value: 49.63684922240546
- type: nauc_mrr_at_100_std
value: -14.354914066301236
- type: nauc_mrr_at_10_diff1
value: 78.2208070219624
- type: nauc_mrr_at_10_max
value: 49.77720536573364
- type: nauc_mrr_at_10_std
value: -14.316233764741812
- type: nauc_mrr_at_1_diff1
value: 80.22305496572142
- type: nauc_mrr_at_1_max
value: 44.30231210192536
- type: nauc_mrr_at_1_std
value: -18.942549914934492
- type: nauc_mrr_at_20_diff1
value: 78.31006724240147
- type: nauc_mrr_at_20_max
value: 49.72338465276142
- type: nauc_mrr_at_20_std
value: -14.30722621948953
- type: nauc_mrr_at_3_diff1
value: 78.39832634634523
- type: nauc_mrr_at_3_max
value: 49.24985961036677
- type: nauc_mrr_at_3_std
value: -15.966286866763191
- type: nauc_mrr_at_5_diff1
value: 78.2406507247798
- type: nauc_mrr_at_5_max
value: 49.71276359754787
- type: nauc_mrr_at_5_std
value: -14.979526226149698
- type: nauc_ndcg_at_1000_diff1
value: 77.74892471071016
- type: nauc_ndcg_at_1000_max
value: 51.11543344053061
- type: nauc_ndcg_at_1000_std
value: -12.208878737005096
- type: nauc_ndcg_at_100_diff1
value: 77.67462502211228
- type: nauc_ndcg_at_100_max
value: 51.593977338939034
- type: nauc_ndcg_at_100_std
value: -11.312126179513802
- type: nauc_ndcg_at_10_diff1
value: 77.0571291760012
- type: nauc_ndcg_at_10_max
value: 52.35435572808972
- type: nauc_ndcg_at_10_std
value: -11.33242546164059
- type: nauc_ndcg_at_1_diff1
value: 80.22305496572142
- type: nauc_ndcg_at_1_max
value: 44.30231210192536
- type: nauc_ndcg_at_1_std
value: -18.942549914934492
- type: nauc_ndcg_at_20_diff1
value: 77.4141216117471
- type: nauc_ndcg_at_20_max
value: 52.340600871365375
- type: nauc_ndcg_at_20_std
value: -10.989010161550912
- type: nauc_ndcg_at_3_diff1
value: 77.43971989259062
- type: nauc_ndcg_at_3_max
value: 50.59251358320663
- type: nauc_ndcg_at_3_std
value: -15.59337960636058
- type: nauc_ndcg_at_5_diff1
value: 77.12174287031847
- type: nauc_ndcg_at_5_max
value: 51.97108510288907
- type: nauc_ndcg_at_5_std
value: -13.474902612427167
- type: nauc_precision_at_1000_diff1
value: -19.36793534929367
- type: nauc_precision_at_1000_max
value: 11.803383262344036
- type: nauc_precision_at_1000_std
value: 24.304436015177046
- type: nauc_precision_at_100_diff1
value: -6.273790806909921
- type: nauc_precision_at_100_max
value: 23.372606271300747
- type: nauc_precision_at_100_std
value: 29.085768971612342
- type: nauc_precision_at_10_diff1
value: 21.67045907336595
- type: nauc_precision_at_10_max
value: 41.68948432407223
- type: nauc_precision_at_10_std
value: 17.837055074458092
- type: nauc_precision_at_1_diff1
value: 80.22305496572142
- type: nauc_precision_at_1_max
value: 44.30231210192536
- type: nauc_precision_at_1_std
value: -18.942549914934492
- type: nauc_precision_at_20_diff1
value: 12.577671896684803
- type: nauc_precision_at_20_max
value: 37.44944702246691
- type: nauc_precision_at_20_std
value: 23.635897665206087
- type: nauc_precision_at_3_diff1
value: 47.165335112814056
- type: nauc_precision_at_3_max
value: 47.0458691263379
- type: nauc_precision_at_3_std
value: -3.3181861146890217
- type: nauc_precision_at_5_diff1
value: 35.406205343514806
- type: nauc_precision_at_5_max
value: 45.56549449285401
- type: nauc_precision_at_5_std
value: 5.612378074562386
- type: nauc_recall_at_1000_diff1
value: 72.32762520815842
- type: nauc_recall_at_1000_max
value: 85.64979256307343
- type: nauc_recall_at_1000_std
value: 73.61925297037476
- type: nauc_recall_at_100_diff1
value: 72.31946328709962
- type: nauc_recall_at_100_max
value: 83.76576070068353
- type: nauc_recall_at_100_std
value: 57.39376538662535
- type: nauc_recall_at_10_diff1
value: 69.51307788072499
- type: nauc_recall_at_10_max
value: 69.60124733654142
- type: nauc_recall_at_10_std
value: 13.483540424716892
- type: nauc_recall_at_1_diff1
value: 79.84814509858211
- type: nauc_recall_at_1_max
value: 40.78978466656547
- type: nauc_recall_at_1_std
value: -19.96189264026715
- type: nauc_recall_at_20_diff1
value: 70.92168324710599
- type: nauc_recall_at_20_max
value: 76.09106252420084
- type: nauc_recall_at_20_std
value: 25.406842300761447
- type: nauc_recall_at_3_diff1
value: 74.1212680517145
- type: nauc_recall_at_3_max
value: 56.24921832879403
- type: nauc_recall_at_3_std
value: -11.55542913578436
- type: nauc_recall_at_5_diff1
value: 72.31262959872993
- type: nauc_recall_at_5_max
value: 62.761214896697915
- type: nauc_recall_at_5_std
value: -3.280167584070396
- type: ndcg_at_1
value: 69.18299999999999
- type: ndcg_at_10
value: 79.687
- type: ndcg_at_100
value: 81.062
- type: ndcg_at_1000
value: 81.312
- type: ndcg_at_20
value: 80.34599999999999
- type: ndcg_at_3
value: 75.98700000000001
- type: ndcg_at_5
value: 78.039
- type: precision_at_1
value: 69.18299999999999
- type: precision_at_10
value: 9.636
- type: precision_at_100
value: 1.0330000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 4.958
- type: precision_at_3
value: 28.515
- type: precision_at_5
value: 18.201
- type: recall_at_1
value: 66.872
- type: recall_at_10
value: 90.688
- type: recall_at_100
value: 96.99
- type: recall_at_1000
value: 98.958
- type: recall_at_20
value: 93.21199999999999
- type: recall_at_3
value: 80.84599999999999
- type: recall_at_5
value: 85.732
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO (default)
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 21.861
- type: map_at_10
value: 34.008
- type: map_at_100
value: 35.174
- type: map_at_1000
value: 35.224
- type: map_at_20
value: 34.705999999999996
- type: map_at_3
value: 30.209000000000003
- type: map_at_5
value: 32.351
- type: mrr_at_1
value: 22.493
- type: mrr_at_10
value: 34.583999999999996
- type: mrr_at_100
value: 35.691
- type: mrr_at_1000
value: 35.736000000000004
- type: mrr_at_20
value: 35.257
- type: mrr_at_3
value: 30.85
- type: mrr_at_5
value: 32.962
- type: ndcg_at_1
value: 22.493
- type: ndcg_at_10
value: 40.815
- type: ndcg_at_100
value: 46.483999999999995
- type: ndcg_at_1000
value: 47.73
- type: ndcg_at_20
value: 43.302
- type: ndcg_at_3
value: 33.056000000000004
- type: ndcg_at_5
value: 36.879
- type: precision_at_1
value: 22.493
- type: precision_at_10
value: 6.465999999999999
- type: precision_at_100
value: 0.932
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 3.752
- type: precision_at_3
value: 14.069
- type: precision_at_5
value: 10.384
- type: recall_at_1
value: 21.861
- type: recall_at_10
value: 61.781
- type: recall_at_100
value: 88.095
- type: recall_at_1000
value: 97.625
- type: recall_at_20
value: 71.44500000000001
- type: recall_at_3
value: 40.653
- type: recall_at_5
value: 49.841
- type: main_score
value: 40.815
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 97.4874601003192
- type: f1
value: 97.19067544931094
- type: f1_weighted
value: 97.49331776181019
- type: main_score
value: 97.4874601003192
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (de)
type: mteb/mtop_domain
config: de
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.89489997182305
- type: f1
value: 96.51138586512977
- type: f1_weighted
value: 96.89723065967186
- type: main_score
value: 96.89489997182305
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (es)
type: mteb/mtop_domain
config: es
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 97.17144763175452
- type: f1
value: 96.81785681878274
- type: f1_weighted
value: 97.1778974586874
- type: main_score
value: 97.17144763175452
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (fr)
type: mteb/mtop_domain
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.30128405887879
- type: f1
value: 95.94555923088487
- type: f1_weighted
value: 96.30399416794926
- type: main_score
value: 96.30128405887879
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 84.53488372093022
- type: f1
value: 61.77995074251401
- type: f1_weighted
value: 86.8005170485101
- type: main_score
value: 84.53488372093022
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (de)
type: mteb/mtop_intent
config: de
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 80.79459002535924
- type: f1
value: 56.08938302001448
- type: f1_weighted
value: 83.66582131948252
- type: main_score
value: 80.79459002535924
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (es)
type: mteb/mtop_intent
config: es
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 84.7765176784523
- type: f1
value: 61.39860057885528
- type: f1_weighted
value: 86.94881745670745
- type: main_score
value: 84.7765176784523
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (fr)
type: mteb/mtop_intent
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 82.2079549013467
- type: f1
value: 59.90260478749016
- type: f1_weighted
value: 84.36861708593257
- type: main_score
value: 82.2079549013467
- task:
type: Classification
dataset:
name: MTEB MasakhaNEWSClassification (eng)
type: mteb/masakhanews
config: eng
split: test
revision: 18193f187b92da67168c655c9973a165ed9593dd
metrics:
- type: accuracy
value: 74.98945147679325
- type: f1
value: 74.3157483560261
- type: f1_weighted
value: 75.01179008904884
- type: main_score
value: 74.98945147679325
- task:
type: Classification
dataset:
name: MTEB MasakhaNEWSClassification (fra)
type: mteb/masakhanews
config: fra
split: test
revision: 18193f187b92da67168c655c9973a165ed9593dd
metrics:
- type: accuracy
value: 74.02843601895735
- type: f1
value: 70.40326349620732
- type: f1_weighted
value: 74.6596277063484
- type: main_score
value: 74.02843601895735
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (amh)
type: masakhane/masakhanews
config: amh
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 69.45780291725053
- type: v_measure
value: 69.45780291725053
- type: v_measure_std
value: 36.54340055904091
- type: main_score
value: 60.95132147787602
- type: v_measure
value: 60.95132147787602
- type: v_measure_std
value: 37.330148394033365
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (eng)
type: masakhane/masakhanews
config: eng
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 64.88996119332239
- type: v_measure
value: 64.88996119332239
- type: v_measure_std
value: 30.017223408197268
- type: main_score
value: 60.974810831426595
- type: v_measure
value: 60.974810831426595
- type: v_measure_std
value: 24.934675467507827
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (fra)
type: masakhane/masakhanews
config: fra
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 42.362383958691666
- type: v_measure
value: 42.362383958691666
- type: v_measure_std
value: 37.61076788039063
- type: main_score
value: 44.479206673553335
- type: v_measure
value: 44.479206673553335
- type: v_measure_std
value: 32.58254804499339
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (hau)
type: masakhane/masakhanews
config: hau
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 43.29201252405562
- type: v_measure
value: 43.29201252405562
- type: v_measure_std
value: 34.31987945146255
- type: main_score
value: 26.4742082741682
- type: v_measure
value: 26.4742082741682
- type: v_measure_std
value: 22.344929192323097
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (ibo)
type: masakhane/masakhanews
config: ibo
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 33.59926542995238
- type: v_measure
value: 33.59926542995238
- type: v_measure_std
value: 35.70048601084112
- type: main_score
value: 38.906129911741985
- type: v_measure
value: 38.906129911741985
- type: v_measure_std
value: 34.785601792668444
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (lin)
type: masakhane/masakhanews
config: lin
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 67.58487601893106
- type: v_measure
value: 67.58487601893106
- type: v_measure_std
value: 35.16784970777931
- type: main_score
value: 62.60982020876592
- type: v_measure
value: 62.60982020876592
- type: v_measure_std
value: 40.7368955715045
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (lug)
type: masakhane/masakhanews
config: lug
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 50.01220872023533
- type: v_measure
value: 50.01220872023533
- type: v_measure_std
value: 41.87411574676182
- type: main_score
value: 42.70424106365967
- type: v_measure
value: 42.70424106365967
- type: v_measure_std
value: 46.80946241135087
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (orm)
type: masakhane/masakhanews
config: orm
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 29.007847502598317
- type: v_measure
value: 29.007847502598317
- type: v_measure_std
value: 38.374997395079994
- type: main_score
value: 28.609942199922322
- type: v_measure
value: 28.609942199922322
- type: v_measure_std
value: 38.46685040191088
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (pcm)
type: masakhane/masakhanews
config: pcm
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 79.13520228554611
- type: v_measure
value: 79.13520228554611
- type: v_measure_std
value: 18.501843848275183
- type: main_score
value: 76.83901348810822
- type: v_measure
value: 76.83901348810822
- type: v_measure_std
value: 17.57617141269189
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (run)
type: masakhane/masakhanews
config: run
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 60.317213909746656
- type: v_measure
value: 60.317213909746656
- type: v_measure_std
value: 36.500281823747386
- type: main_score
value: 46.89757547846193
- type: v_measure
value: 46.89757547846193
- type: v_measure_std
value: 44.58903590203438
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (sna)
type: masakhane/masakhanews
config: sna
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 59.395277358240946
- type: v_measure
value: 59.395277358240946
- type: v_measure_std
value: 37.500916816164654
- type: main_score
value: 55.37185207068829
- type: v_measure
value: 55.37185207068829
- type: v_measure_std
value: 36.944574863543004
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (som)
type: masakhane/masakhanews
config: som
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 38.18638688704302
- type: v_measure
value: 38.18638688704302
- type: v_measure_std
value: 35.453681137564466
- type: main_score
value: 37.44211021681754
- type: v_measure
value: 37.44211021681754
- type: v_measure_std
value: 33.41469994463241
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (swa)
type: masakhane/masakhanews
config: swa
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 29.49230755729658
- type: v_measure
value: 29.49230755729658
- type: v_measure_std
value: 28.284313285264645
- type: main_score
value: 26.020680621216062
- type: v_measure
value: 26.020680621216062
- type: v_measure_std
value: 25.480037522570413
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (tir)
type: masakhane/masakhanews
config: tir
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 60.632258622750115
- type: v_measure
value: 60.632258622750115
- type: v_measure_std
value: 34.429711214740564
- type: main_score
value: 63.74306846771303
- type: v_measure
value: 63.74306846771303
- type: v_measure_std
value: 32.19119631078685
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (xho)
type: masakhane/masakhanews
config: xho
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 41.76322918806381
- type: v_measure
value: 41.76322918806381
- type: v_measure_std
value: 36.43245296200775
- type: main_score
value: 24.580890519243777
- type: v_measure
value: 24.580890519243777
- type: v_measure_std
value: 37.941836363967106
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (yor)
type: masakhane/masakhanews
config: yor
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: main_score
value: 33.17083910808645
- type: v_measure
value: 33.17083910808645
- type: v_measure_std
value: 34.87547994284835
- type: main_score
value: 43.63458888828314
- type: v_measure
value: 43.63458888828314
- type: v_measure_std
value: 31.28169350649098
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pl)
type: mteb/amazon_massive_intent
config: pl
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 75.37323470073974
- type: f1
value: 71.1836877753734
- type: f1_weighted
value: 75.72073213955457
- type: main_score
value: 75.37323470073974
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (de)
type: mteb/amazon_massive_intent
config: de
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 74.83523873570948
- type: f1
value: 70.72375821116886
- type: f1_weighted
value: 75.20800490010755
- type: main_score
value: 74.83523873570948
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (es)
type: mteb/amazon_massive_intent
config: es
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 75.31607262945528
- type: f1
value: 72.06063554897662
- type: f1_weighted
value: 75.72438161355252
- type: main_score
value: 75.31607262945528
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (ru)
type: mteb/amazon_massive_intent
config: ru
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 76.7955615332885
- type: f1
value: 73.08099648499756
- type: f1_weighted
value: 77.18482068239668
- type: main_score
value: 76.7955615332885
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 77.60591795561534
- type: f1
value: 74.46676705370395
- type: f1_weighted
value: 77.69888062336614
- type: main_score
value: 77.60591795561534
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fr)
type: mteb/amazon_massive_intent
config: fr
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 76.32145258910558
- type: f1
value: 72.89824154178328
- type: f1_weighted
value: 76.6539327979472
- type: main_score
value: 76.32145258910558
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 4672e20407010da34463acc759c162ca9734bca6
metrics:
- type: accuracy
value: 73.21788836583724
- type: f1
value: 70.45594512246377
- type: f1_weighted
value: 73.67862536499393
- type: main_score
value: 73.21788836583724
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 80.82044384667114
- type: f1
value: 80.53217664465089
- type: f1_weighted
value: 80.94535087010512
- type: main_score
value: 80.82044384667114
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pl)
type: mteb/amazon_massive_scenario
config: pl
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 82.1049092131809
- type: f1
value: 81.55343463694733
- type: f1_weighted
value: 82.33509098770782
- type: main_score
value: 82.1049092131809
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (es)
type: mteb/amazon_massive_scenario
config: es
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 82.58238063214526
- type: f1
value: 82.27974449333072
- type: f1_weighted
value: 82.81337569618209
- type: main_score
value: 82.58238063214526
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (de)
type: mteb/amazon_massive_scenario
config: de
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 83.97108271687962
- type: f1
value: 83.56285606936076
- type: f1_weighted
value: 84.10198745390771
- type: main_score
value: 83.97108271687962
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 84.71082716879623
- type: f1
value: 84.09447062371402
- type: f1_weighted
value: 84.73765765551342
- type: main_score
value: 84.71082716879623
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fr)
type: mteb/amazon_massive_scenario
config: fr
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 83.093476798924
- type: f1
value: 82.72656900752943
- type: f1_weighted
value: 83.26606516503364
- type: main_score
value: 83.093476798924
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (ru)
type: mteb/amazon_massive_scenario
config: ru
split: test
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
metrics:
- type: accuracy
value: 84.05850706119705
- type: f1
value: 83.64234048881222
- type: f1_weighted
value: 84.17315768381876
- type: main_score
value: 84.05850706119705
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval (default)
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: main_score
value: 56.635999999999996
- type: map_at_1
value: 48.699999999999996
- type: map_at_10
value: 53.991
- type: map_at_100
value: 54.449999999999996
- type: map_at_1000
value: 54.515
- type: map_at_20
value: 54.212
- type: map_at_3
value: 52.833
- type: map_at_5
value: 53.503
- type: mrr_at_1
value: 48.699999999999996
- type: mrr_at_10
value: 53.991309523809505
- type: mrr_at_100
value: 54.45008993448266
- type: mrr_at_1000
value: 54.515253990549795
- type: mrr_at_20
value: 54.21201762247036
- type: mrr_at_3
value: 52.8333333333333
- type: mrr_at_5
value: 53.50333333333328
- type: nauc_map_at_1000_diff1
value: 79.96867989401643
- type: nauc_map_at_1000_max
value: 69.75230895599029
- type: nauc_map_at_1000_std
value: 2.6418738289740213
- type: nauc_map_at_100_diff1
value: 79.95343709599133
- type: nauc_map_at_100_max
value: 69.751282671507
- type: nauc_map_at_100_std
value: 2.621719966106279
- type: nauc_map_at_10_diff1
value: 80.02875864565634
- type: nauc_map_at_10_max
value: 69.80948662290187
- type: nauc_map_at_10_std
value: 2.329151604733765
- type: nauc_map_at_1_diff1
value: 83.616940281383
- type: nauc_map_at_1_max
value: 69.08142651929452
- type: nauc_map_at_1_std
value: 1.9687791394035643
- type: nauc_map_at_20_diff1
value: 79.95555601275339
- type: nauc_map_at_20_max
value: 69.76604695002925
- type: nauc_map_at_20_std
value: 2.556184141901367
- type: nauc_map_at_3_diff1
value: 80.74790131023668
- type: nauc_map_at_3_max
value: 70.57797991892402
- type: nauc_map_at_3_std
value: 2.7115149849964117
- type: nauc_map_at_5_diff1
value: 80.31796539878381
- type: nauc_map_at_5_max
value: 69.93573796420061
- type: nauc_map_at_5_std
value: 2.0731614029506606
- type: nauc_mrr_at_1000_diff1
value: 79.96867999907981
- type: nauc_mrr_at_1000_max
value: 69.57395578976896
- type: nauc_mrr_at_1000_std
value: 2.46351945887829
- type: nauc_mrr_at_100_diff1
value: 79.95343709599133
- type: nauc_mrr_at_100_max
value: 69.57322054130803
- type: nauc_mrr_at_100_std
value: 2.4436578359073433
- type: nauc_mrr_at_10_diff1
value: 80.02875864565634
- type: nauc_mrr_at_10_max
value: 69.63292630937411
- type: nauc_mrr_at_10_std
value: 2.1525912912060012
- type: nauc_mrr_at_1_diff1
value: 83.616940281383
- type: nauc_mrr_at_1_max
value: 68.74717310480305
- type: nauc_mrr_at_1_std
value: 1.6345257249120868
- type: nauc_mrr_at_20_diff1
value: 79.95555601275339
- type: nauc_mrr_at_20_max
value: 69.58883608470444
- type: nauc_mrr_at_20_std
value: 2.378973276576547
- type: nauc_mrr_at_3_diff1
value: 80.74790131023668
- type: nauc_mrr_at_3_max
value: 70.40430475488604
- type: nauc_mrr_at_3_std
value: 2.5378398209583817
- type: nauc_mrr_at_5_diff1
value: 80.31796539878381
- type: nauc_mrr_at_5_max
value: 69.7605991748183
- type: nauc_mrr_at_5_std
value: 1.898022613568352
- type: nauc_ndcg_at_1000_diff1
value: 78.35504059321225
- type: nauc_ndcg_at_1000_max
value: 69.06752522437093
- type: nauc_ndcg_at_1000_std
value: 3.9624036886099265
- type: nauc_ndcg_at_100_diff1
value: 77.79729140249833
- type: nauc_ndcg_at_100_max
value: 68.93113791506029
- type: nauc_ndcg_at_100_std
value: 3.642178826886181
- type: nauc_ndcg_at_10_diff1
value: 78.160158293918
- type: nauc_ndcg_at_10_max
value: 69.28122202281361
- type: nauc_ndcg_at_10_std
value: 2.438976810940962
- type: nauc_ndcg_at_1_diff1
value: 83.616940281383
- type: nauc_ndcg_at_1_max
value: 69.08142651929452
- type: nauc_ndcg_at_1_std
value: 1.9687791394035643
- type: nauc_ndcg_at_20_diff1
value: 77.88514432874997
- type: nauc_ndcg_at_20_max
value: 69.06148818508873
- type: nauc_ndcg_at_20_std
value: 3.1800249272363676
- type: nauc_ndcg_at_3_diff1
value: 79.73510384405803
- type: nauc_ndcg_at_3_max
value: 70.78000695123832
- type: nauc_ndcg_at_3_std
value: 2.9041415468363274
- type: nauc_ndcg_at_5_diff1
value: 78.91872808866195
- type: nauc_ndcg_at_5_max
value: 69.61478429620091
- type: nauc_ndcg_at_5_std
value: 1.734699636301054
- type: nauc_precision_at_1000_diff1
value: 66.37858395390673
- type: nauc_precision_at_1000_max
value: 60.651659037598534
- type: nauc_precision_at_1000_std
value: 27.388353715469798
- type: nauc_precision_at_100_diff1
value: 66.34325807776025
- type: nauc_precision_at_100_max
value: 63.63855305621111
- type: nauc_precision_at_100_std
value: 10.641748149575351
- type: nauc_precision_at_10_diff1
value: 71.3784685491089
- type: nauc_precision_at_10_max
value: 67.05313695174542
- type: nauc_precision_at_10_std
value: 3.000406867930561
- type: nauc_precision_at_1_diff1
value: 83.616940281383
- type: nauc_precision_at_1_max
value: 69.08142651929452
- type: nauc_precision_at_1_std
value: 1.9687791394035643
- type: nauc_precision_at_20_diff1
value: 69.73407910977694
- type: nauc_precision_at_20_max
value: 65.77426240320742
- type: nauc_precision_at_20_std
value: 6.204416838482586
- type: nauc_precision_at_3_diff1
value: 76.63737537643107
- type: nauc_precision_at_3_max
value: 71.29710200719668
- type: nauc_precision_at_3_std
value: 3.47180961484546
- type: nauc_precision_at_5_diff1
value: 74.36945983536717
- type: nauc_precision_at_5_max
value: 68.33292218003061
- type: nauc_precision_at_5_std
value: 0.47128762620258075
- type: nauc_recall_at_1000_diff1
value: 66.37858395390681
- type: nauc_recall_at_1000_max
value: 60.65165903759889
- type: nauc_recall_at_1000_std
value: 27.388353715469822
- type: nauc_recall_at_100_diff1
value: 66.34325807776025
- type: nauc_recall_at_100_max
value: 63.63855305621116
- type: nauc_recall_at_100_std
value: 10.641748149575351
- type: nauc_recall_at_10_diff1
value: 71.37846854910892
- type: nauc_recall_at_10_max
value: 67.05313695174546
- type: nauc_recall_at_10_std
value: 3.000406867930663
- type: nauc_recall_at_1_diff1
value: 83.616940281383
- type: nauc_recall_at_1_max
value: 69.08142651929452
- type: nauc_recall_at_1_std
value: 1.9687791394035643
- type: nauc_recall_at_20_diff1
value: 69.73407910977691
- type: nauc_recall_at_20_max
value: 65.77426240320746
- type: nauc_recall_at_20_std
value: 6.204416838482536
- type: nauc_recall_at_3_diff1
value: 76.63737537643112
- type: nauc_recall_at_3_max
value: 71.29710200719668
- type: nauc_recall_at_3_std
value: 3.471809614845442
- type: nauc_recall_at_5_diff1
value: 74.36945983536715
- type: nauc_recall_at_5_max
value: 68.33292218003065
- type: nauc_recall_at_5_std
value: 0.4712876262026442
- type: ndcg_at_1
value: 48.699999999999996
- type: ndcg_at_10
value: 56.635999999999996
- type: ndcg_at_100
value: 59.193
- type: ndcg_at_1000
value: 60.97
- type: ndcg_at_20
value: 57.426
- type: ndcg_at_3
value: 54.186
- type: ndcg_at_5
value: 55.407
- type: precision_at_1
value: 48.699999999999996
- type: precision_at_10
value: 6.5
- type: precision_at_100
value: 0.777
- type: precision_at_1000
value: 0.092
- type: precision_at_20
value: 3.405
- type: precision_at_3
value: 19.367
- type: precision_at_5
value: 12.22
- type: recall_at_1
value: 48.699999999999996
- type: recall_at_10
value: 65.0
- type: recall_at_100
value: 77.7
- type: recall_at_1000
value: 91.8
- type: recall_at_20
value: 68.10000000000001
- type: recall_at_3
value: 58.099999999999994
- type: recall_at_5
value: 61.1
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P (default)
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: main_score
value: 34.80188561439236
- type: v_measure
value: 34.80188561439236
- type: v_measure_std
value: 1.5703148841573102
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S (default)
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: main_score
value: 32.42285513996236
- type: v_measure
value: 32.42285513996236
- type: v_measure_std
value: 1.3769867487457566
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (de)
type: jinaai/mintakaqa
config: de
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: main_score
value: 27.025
- type: map_at_1
value: 14.532
- type: map_at_10
value: 22.612
- type: map_at_100
value: 23.802
- type: map_at_1000
value: 23.9
- type: map_at_20
value: 23.275000000000002
- type: map_at_3
value: 20.226
- type: map_at_5
value: 21.490000000000002
- type: mrr_at_1
value: 14.532434709351305
- type: mrr_at_10
value: 22.612077265615575
- type: mrr_at_100
value: 23.801523356874675
- type: mrr_at_1000
value: 23.900118499340238
- type: mrr_at_20
value: 23.275466430108995
- type: mrr_at_3
value: 20.22606009547877
- type: mrr_at_5
value: 21.489750070204945
- type: nauc_map_at_1000_diff1
value: 14.148987799763596
- type: nauc_map_at_1000_max
value: 44.70338461387784
- type: nauc_map_at_1000_std
value: 15.868006767707637
- type: nauc_map_at_100_diff1
value: 14.11371769080442
- type: nauc_map_at_100_max
value: 44.67995540936296
- type: nauc_map_at_100_std
value: 15.890796502029076
- type: nauc_map_at_10_diff1
value: 14.29066834165688
- type: nauc_map_at_10_max
value: 45.10997111765282
- type: nauc_map_at_10_std
value: 15.508568918629864
- type: nauc_map_at_1_diff1
value: 23.473291302576396
- type: nauc_map_at_1_max
value: 44.68942599764586
- type: nauc_map_at_1_std
value: 12.424377262427253
- type: nauc_map_at_20_diff1
value: 14.112652046087831
- type: nauc_map_at_20_max
value: 44.82014861413682
- type: nauc_map_at_20_std
value: 15.739350613646385
- type: nauc_map_at_3_diff1
value: 16.119659221396347
- type: nauc_map_at_3_max
value: 46.04766378953525
- type: nauc_map_at_3_std
value: 13.969878046315925
- type: nauc_map_at_5_diff1
value: 15.095453434076184
- type: nauc_map_at_5_max
value: 45.802128149314406
- type: nauc_map_at_5_std
value: 14.957442173319949
- type: nauc_mrr_at_1000_diff1
value: 14.148987799763596
- type: nauc_mrr_at_1000_max
value: 44.70338461387784
- type: nauc_mrr_at_1000_std
value: 15.868006767707637
- type: nauc_mrr_at_100_diff1
value: 14.11371769080442
- type: nauc_mrr_at_100_max
value: 44.67995540936296
- type: nauc_mrr_at_100_std
value: 15.890796502029076
- type: nauc_mrr_at_10_diff1
value: 14.29066834165688
- type: nauc_mrr_at_10_max
value: 45.10997111765282
- type: nauc_mrr_at_10_std
value: 15.508568918629864
- type: nauc_mrr_at_1_diff1
value: 23.473291302576396
- type: nauc_mrr_at_1_max
value: 44.68942599764586
- type: nauc_mrr_at_1_std
value: 12.424377262427253
- type: nauc_mrr_at_20_diff1
value: 14.112652046087831
- type: nauc_mrr_at_20_max
value: 44.82014861413682
- type: nauc_mrr_at_20_std
value: 15.739350613646385
- type: nauc_mrr_at_3_diff1
value: 16.119659221396347
- type: nauc_mrr_at_3_max
value: 46.04766378953525
- type: nauc_mrr_at_3_std
value: 13.969878046315925
- type: nauc_mrr_at_5_diff1
value: 15.095453434076184
- type: nauc_mrr_at_5_max
value: 45.802128149314406
- type: nauc_mrr_at_5_std
value: 14.957442173319949
- type: nauc_ndcg_at_1000_diff1
value: 11.626606894574028
- type: nauc_ndcg_at_1000_max
value: 43.328592841065536
- type: nauc_ndcg_at_1000_std
value: 18.049446272245547
- type: nauc_ndcg_at_100_diff1
value: 10.485720606660239
- type: nauc_ndcg_at_100_max
value: 42.405317674170966
- type: nauc_ndcg_at_100_std
value: 19.107151641936987
- type: nauc_ndcg_at_10_diff1
value: 11.029351078162982
- type: nauc_ndcg_at_10_max
value: 44.36855031964681
- type: nauc_ndcg_at_10_std
value: 17.302796171409305
- type: nauc_ndcg_at_1_diff1
value: 23.473291302576396
- type: nauc_ndcg_at_1_max
value: 44.68942599764586
- type: nauc_ndcg_at_1_std
value: 12.424377262427253
- type: nauc_ndcg_at_20_diff1
value: 10.356662718168412
- type: nauc_ndcg_at_20_max
value: 43.31602680430083
- type: nauc_ndcg_at_20_std
value: 18.162891267850316
- type: nauc_ndcg_at_3_diff1
value: 14.42844952297869
- type: nauc_ndcg_at_3_max
value: 46.26603339466543
- type: nauc_ndcg_at_3_std
value: 14.449362723887857
- type: nauc_ndcg_at_5_diff1
value: 12.783416563486396
- type: nauc_ndcg_at_5_max
value: 45.852176479124424
- type: nauc_ndcg_at_5_std
value: 16.11775016428085
- type: nauc_precision_at_1000_diff1
value: -8.045361059399795
- type: nauc_precision_at_1000_max
value: 21.970273281738777
- type: nauc_precision_at_1000_std
value: 49.564650488193266
- type: nauc_precision_at_100_diff1
value: -2.118628861593353
- type: nauc_precision_at_100_max
value: 31.32498977104778
- type: nauc_precision_at_100_std
value: 32.96087731883451
- type: nauc_precision_at_10_diff1
value: 3.0335517475367615
- type: nauc_precision_at_10_max
value: 42.21620215030219
- type: nauc_precision_at_10_std
value: 21.90159732315962
- type: nauc_precision_at_1_diff1
value: 23.473291302576396
- type: nauc_precision_at_1_max
value: 44.68942599764586
- type: nauc_precision_at_1_std
value: 12.424377262427253
- type: nauc_precision_at_20_diff1
value: 0.4087201843719047
- type: nauc_precision_at_20_max
value: 38.485034773895734
- type: nauc_precision_at_20_std
value: 25.077397979916682
- type: nauc_precision_at_3_diff1
value: 10.408327736589833
- type: nauc_precision_at_3_max
value: 46.757216289175076
- type: nauc_precision_at_3_std
value: 15.62594354926867
- type: nauc_precision_at_5_diff1
value: 7.326752744229544
- type: nauc_precision_at_5_max
value: 45.89190518573553
- type: nauc_precision_at_5_std
value: 19.01717163438957
- type: nauc_recall_at_1000_diff1
value: -8.045361059400387
- type: nauc_recall_at_1000_max
value: 21.97027328173812
- type: nauc_recall_at_1000_std
value: 49.56465048819266
- type: nauc_recall_at_100_diff1
value: -2.118628861593277
- type: nauc_recall_at_100_max
value: 31.324989771047818
- type: nauc_recall_at_100_std
value: 32.96087731883457
- type: nauc_recall_at_10_diff1
value: 3.0335517475367166
- type: nauc_recall_at_10_max
value: 42.21620215030217
- type: nauc_recall_at_10_std
value: 21.901597323159606
- type: nauc_recall_at_1_diff1
value: 23.473291302576396
- type: nauc_recall_at_1_max
value: 44.68942599764586
- type: nauc_recall_at_1_std
value: 12.424377262427253
- type: nauc_recall_at_20_diff1
value: 0.40872018437190905
- type: nauc_recall_at_20_max
value: 38.485034773895734
- type: nauc_recall_at_20_std
value: 25.077397979916693
- type: nauc_recall_at_3_diff1
value: 10.408327736589843
- type: nauc_recall_at_3_max
value: 46.75721628917507
- type: nauc_recall_at_3_std
value: 15.625943549268664
- type: nauc_recall_at_5_diff1
value: 7.326752744229548
- type: nauc_recall_at_5_max
value: 45.89190518573557
- type: nauc_recall_at_5_std
value: 19.01717163438958
- type: ndcg_at_1
value: 14.532
- type: ndcg_at_10
value: 27.025
- type: ndcg_at_100
value: 33.305
- type: ndcg_at_1000
value: 36.38
- type: ndcg_at_20
value: 29.443
- type: ndcg_at_3
value: 22.035
- type: ndcg_at_5
value: 24.319
- type: precision_at_1
value: 14.532
- type: precision_at_10
value: 4.115
- type: precision_at_100
value: 0.717
- type: precision_at_1000
value: 0.097
- type: precision_at_20
value: 2.536
- type: precision_at_3
value: 9.085
- type: precision_at_5
value: 6.563
- type: recall_at_1
value: 14.532
- type: recall_at_10
value: 41.154
- type: recall_at_100
value: 71.651
- type: recall_at_1000
value: 96.841
- type: recall_at_20
value: 50.71600000000001
- type: recall_at_3
value: 27.254
- type: recall_at_5
value: 32.814
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (es)
type: jinaai/mintakaqa
config: es
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: main_score
value: 26.912000000000003
- type: map_at_1
value: 14.686
- type: map_at_10
value: 22.569
- type: map_at_100
value: 23.679
- type: map_at_1000
value: 23.777
- type: map_at_20
value: 23.169
- type: map_at_3
value: 20.201
- type: map_at_5
value: 21.566
- type: mrr_at_1
value: 14.686468646864686
- type: mrr_at_10
value: 22.569346220336296
- type: mrr_at_100
value: 23.678819125817146
- type: mrr_at_1000
value: 23.77713511338264
- type: mrr_at_20
value: 23.16850858443442
- type: mrr_at_3
value: 20.200770077007665
- type: mrr_at_5
value: 21.56628162816276
- type: nauc_map_at_1000_diff1
value: 14.129007578838381
- type: nauc_map_at_1000_max
value: 44.4255501141499
- type: nauc_map_at_1000_std
value: 19.95906154868176
- type: nauc_map_at_100_diff1
value: 14.09071870575231
- type: nauc_map_at_100_max
value: 44.403179928955566
- type: nauc_map_at_100_std
value: 20.00413657519976
- type: nauc_map_at_10_diff1
value: 14.149535953153688
- type: nauc_map_at_10_max
value: 44.66529917634685
- type: nauc_map_at_10_std
value: 19.580235989479394
- type: nauc_map_at_1_diff1
value: 23.489813522176636
- type: nauc_map_at_1_max
value: 46.54578639925787
- type: nauc_map_at_1_std
value: 16.39083721709994
- type: nauc_map_at_20_diff1
value: 14.021560420656181
- type: nauc_map_at_20_max
value: 44.4825455452467
- type: nauc_map_at_20_std
value: 19.886927750826878
- type: nauc_map_at_3_diff1
value: 16.182977890477723
- type: nauc_map_at_3_max
value: 46.1840554029258
- type: nauc_map_at_3_std
value: 18.735671900228958
- type: nauc_map_at_5_diff1
value: 14.779126395472833
- type: nauc_map_at_5_max
value: 45.23237213817556
- type: nauc_map_at_5_std
value: 19.348508580412872
- type: nauc_mrr_at_1000_diff1
value: 14.129007578838381
- type: nauc_mrr_at_1000_max
value: 44.4255501141499
- type: nauc_mrr_at_1000_std
value: 19.95906154868176
- type: nauc_mrr_at_100_diff1
value: 14.09071870575231
- type: nauc_mrr_at_100_max
value: 44.403179928955566
- type: nauc_mrr_at_100_std
value: 20.00413657519976
- type: nauc_mrr_at_10_diff1
value: 14.149535953153688
- type: nauc_mrr_at_10_max
value: 44.66529917634685
- type: nauc_mrr_at_10_std
value: 19.580235989479394
- type: nauc_mrr_at_1_diff1
value: 23.489813522176636
- type: nauc_mrr_at_1_max
value: 46.54578639925787
- type: nauc_mrr_at_1_std
value: 16.39083721709994
- type: nauc_mrr_at_20_diff1
value: 14.021560420656181
- type: nauc_mrr_at_20_max
value: 44.4825455452467
- type: nauc_mrr_at_20_std
value: 19.886927750826878
- type: nauc_mrr_at_3_diff1
value: 16.182977890477723
- type: nauc_mrr_at_3_max
value: 46.1840554029258
- type: nauc_mrr_at_3_std
value: 18.735671900228958
- type: nauc_mrr_at_5_diff1
value: 14.779126395472833
- type: nauc_mrr_at_5_max
value: 45.23237213817556
- type: nauc_mrr_at_5_std
value: 19.348508580412872
- type: nauc_ndcg_at_1000_diff1
value: 11.762470380481101
- type: nauc_ndcg_at_1000_max
value: 42.8233203033089
- type: nauc_ndcg_at_1000_std
value: 21.78503705117719
- type: nauc_ndcg_at_100_diff1
value: 10.45886076220022
- type: nauc_ndcg_at_100_max
value: 41.85472899256818
- type: nauc_ndcg_at_100_std
value: 23.20955486335138
- type: nauc_ndcg_at_10_diff1
value: 10.605912468659469
- type: nauc_ndcg_at_10_max
value: 43.150942448104715
- type: nauc_ndcg_at_10_std
value: 21.120035764826085
- type: nauc_ndcg_at_1_diff1
value: 23.489813522176636
- type: nauc_ndcg_at_1_max
value: 46.54578639925787
- type: nauc_ndcg_at_1_std
value: 16.39083721709994
- type: nauc_ndcg_at_20_diff1
value: 10.11291783888644
- type: nauc_ndcg_at_20_max
value: 42.51260678842788
- type: nauc_ndcg_at_20_std
value: 22.1744949382252
- type: nauc_ndcg_at_3_diff1
value: 14.25625326760802
- type: nauc_ndcg_at_3_max
value: 45.96162916377383
- type: nauc_ndcg_at_3_std
value: 19.557832728215523
- type: nauc_ndcg_at_5_diff1
value: 11.956317653823053
- type: nauc_ndcg_at_5_max
value: 44.35971268886807
- type: nauc_ndcg_at_5_std
value: 20.581696730374233
- type: nauc_precision_at_1000_diff1
value: 5.132291843566577
- type: nauc_precision_at_1000_max
value: 25.293354576835263
- type: nauc_precision_at_1000_std
value: 40.36005126087624
- type: nauc_precision_at_100_diff1
value: -1.5252854375008238
- type: nauc_precision_at_100_max
value: 31.007586474495984
- type: nauc_precision_at_100_std
value: 37.297552993548386
- type: nauc_precision_at_10_diff1
value: 1.9663657370770737
- type: nauc_precision_at_10_max
value: 39.194092293625125
- type: nauc_precision_at_10_std
value: 24.956542621999542
- type: nauc_precision_at_1_diff1
value: 23.489813522176636
- type: nauc_precision_at_1_max
value: 46.54578639925787
- type: nauc_precision_at_1_std
value: 16.39083721709994
- type: nauc_precision_at_20_diff1
value: 0.011112090390932373
- type: nauc_precision_at_20_max
value: 36.9357074392519
- type: nauc_precision_at_20_std
value: 28.611387115093876
- type: nauc_precision_at_3_diff1
value: 9.596831091013703
- type: nauc_precision_at_3_max
value: 45.3905541893809
- type: nauc_precision_at_3_std
value: 21.599314388526945
- type: nauc_precision_at_5_diff1
value: 5.175887949900142
- type: nauc_precision_at_5_max
value: 42.129467510414464
- type: nauc_precision_at_5_std
value: 23.607251548776677
- type: nauc_recall_at_1000_diff1
value: 5.132291843566257
- type: nauc_recall_at_1000_max
value: 25.29335457683396
- type: nauc_recall_at_1000_std
value: 40.36005126087638
- type: nauc_recall_at_100_diff1
value: -1.5252854375008988
- type: nauc_recall_at_100_max
value: 31.00758647449594
- type: nauc_recall_at_100_std
value: 37.29755299354834
- type: nauc_recall_at_10_diff1
value: 1.9663657370770793
- type: nauc_recall_at_10_max
value: 39.19409229362512
- type: nauc_recall_at_10_std
value: 24.956542621999546
- type: nauc_recall_at_1_diff1
value: 23.489813522176636
- type: nauc_recall_at_1_max
value: 46.54578639925787
- type: nauc_recall_at_1_std
value: 16.39083721709994
- type: nauc_recall_at_20_diff1
value: 0.011112090390923075
- type: nauc_recall_at_20_max
value: 36.93570743925189
- type: nauc_recall_at_20_std
value: 28.611387115093883
- type: nauc_recall_at_3_diff1
value: 9.596831091013714
- type: nauc_recall_at_3_max
value: 45.39055418938087
- type: nauc_recall_at_3_std
value: 21.599314388526956
- type: nauc_recall_at_5_diff1
value: 5.17588794990012
- type: nauc_recall_at_5_max
value: 42.12946751041448
- type: nauc_recall_at_5_std
value: 23.607251548776695
- type: ndcg_at_1
value: 14.686
- type: ndcg_at_10
value: 26.912000000000003
- type: ndcg_at_100
value: 32.919
- type: ndcg_at_1000
value: 36.119
- type: ndcg_at_20
value: 29.079
- type: ndcg_at_3
value: 21.995
- type: ndcg_at_5
value: 24.474999999999998
- type: precision_at_1
value: 14.686
- type: precision_at_10
value: 4.08
- type: precision_at_100
value: 0.703
- type: precision_at_1000
value: 0.097
- type: precision_at_20
value: 2.467
- type: precision_at_3
value: 9.062000000000001
- type: precision_at_5
value: 6.65
- type: recall_at_1
value: 14.686
- type: recall_at_10
value: 40.8
- type: recall_at_100
value: 70.338
- type: recall_at_1000
value: 96.82300000000001
- type: recall_at_20
value: 49.34
- type: recall_at_3
value: 27.186
- type: recall_at_5
value: 33.251
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (fr)
type: jinaai/mintakaqa
config: fr
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: main_score
value: 26.909
- type: map_at_1
value: 14.701
- type: map_at_10
value: 22.613
- type: map_at_100
value: 23.729
- type: map_at_1000
value: 23.837
- type: map_at_20
value: 23.262
- type: map_at_3
value: 20.236
- type: map_at_5
value: 21.673000000000002
- type: mrr_at_1
value: 14.7010647010647
- type: mrr_at_10
value: 22.613165113165113
- type: mrr_at_100
value: 23.72877605989423
- type: mrr_at_1000
value: 23.837150802746805
- type: mrr_at_20
value: 23.261627081110596
- type: mrr_at_3
value: 20.2361452361452
- type: mrr_at_5
value: 21.673491673491625
- type: nauc_map_at_1000_diff1
value: 17.08927788889635
- type: nauc_map_at_1000_max
value: 47.240929150603336
- type: nauc_map_at_1000_std
value: 20.559244258100275
- type: nauc_map_at_100_diff1
value: 17.029461792796777
- type: nauc_map_at_100_max
value: 47.207381115550696
- type: nauc_map_at_100_std
value: 20.581498156895265
- type: nauc_map_at_10_diff1
value: 17.351456007804536
- type: nauc_map_at_10_max
value: 47.815880040221344
- type: nauc_map_at_10_std
value: 20.292999107555794
- type: nauc_map_at_1_diff1
value: 27.297525357600776
- type: nauc_map_at_1_max
value: 47.18835074959486
- type: nauc_map_at_1_std
value: 18.304203168281834
- type: nauc_map_at_20_diff1
value: 17.157460199542136
- type: nauc_map_at_20_max
value: 47.4776610667456
- type: nauc_map_at_20_std
value: 20.499186342964478
- type: nauc_map_at_3_diff1
value: 19.393119961356277
- type: nauc_map_at_3_max
value: 49.02841822452882
- type: nauc_map_at_3_std
value: 19.293122796321292
- type: nauc_map_at_5_diff1
value: 17.76275044752008
- type: nauc_map_at_5_max
value: 48.01292548040298
- type: nauc_map_at_5_std
value: 19.928449977400504
- type: nauc_mrr_at_1000_diff1
value: 17.08927788889635
- type: nauc_mrr_at_1000_max
value: 47.240929150603336
- type: nauc_mrr_at_1000_std
value: 20.559244258100275
- type: nauc_mrr_at_100_diff1
value: 17.029461792796777
- type: nauc_mrr_at_100_max
value: 47.207381115550696
- type: nauc_mrr_at_100_std
value: 20.581498156895265
- type: nauc_mrr_at_10_diff1
value: 17.351456007804536
- type: nauc_mrr_at_10_max
value: 47.815880040221344
- type: nauc_mrr_at_10_std
value: 20.292999107555794
- type: nauc_mrr_at_1_diff1
value: 27.297525357600776
- type: nauc_mrr_at_1_max
value: 47.18835074959486
- type: nauc_mrr_at_1_std
value: 18.304203168281834
- type: nauc_mrr_at_20_diff1
value: 17.157460199542136
- type: nauc_mrr_at_20_max
value: 47.4776610667456
- type: nauc_mrr_at_20_std
value: 20.499186342964478
- type: nauc_mrr_at_3_diff1
value: 19.393119961356277
- type: nauc_mrr_at_3_max
value: 49.02841822452882
- type: nauc_mrr_at_3_std
value: 19.293122796321292
- type: nauc_mrr_at_5_diff1
value: 17.76275044752008
- type: nauc_mrr_at_5_max
value: 48.01292548040298
- type: nauc_mrr_at_5_std
value: 19.928449977400504
- type: nauc_ndcg_at_1000_diff1
value: 13.989496006047975
- type: nauc_ndcg_at_1000_max
value: 45.626323944336114
- type: nauc_ndcg_at_1000_std
value: 22.125600410796515
- type: nauc_ndcg_at_100_diff1
value: 12.302204843705244
- type: nauc_ndcg_at_100_max
value: 44.46856314559079
- type: nauc_ndcg_at_100_std
value: 23.084984546328677
- type: nauc_ndcg_at_10_diff1
value: 14.001226213368275
- type: nauc_ndcg_at_10_max
value: 47.37780636546918
- type: nauc_ndcg_at_10_std
value: 21.702709032840637
- type: nauc_ndcg_at_1_diff1
value: 27.297525357600776
- type: nauc_ndcg_at_1_max
value: 47.18835074959486
- type: nauc_ndcg_at_1_std
value: 18.304203168281834
- type: nauc_ndcg_at_20_diff1
value: 13.317759910171056
- type: nauc_ndcg_at_20_max
value: 46.25171251043813
- type: nauc_ndcg_at_20_std
value: 22.309331575402595
- type: nauc_ndcg_at_3_diff1
value: 17.555381234893872
- type: nauc_ndcg_at_3_max
value: 49.48635590260059
- type: nauc_ndcg_at_3_std
value: 19.734570962933674
- type: nauc_ndcg_at_5_diff1
value: 14.844841165765061
- type: nauc_ndcg_at_5_max
value: 47.76437065028708
- type: nauc_ndcg_at_5_std
value: 20.816034479453954
- type: nauc_precision_at_1000_diff1
value: -15.591898698252546
- type: nauc_precision_at_1000_max
value: 20.545984285353892
- type: nauc_precision_at_1000_std
value: 38.9013414992826
- type: nauc_precision_at_100_diff1
value: -5.290395978742176
- type: nauc_precision_at_100_max
value: 31.340480360546845
- type: nauc_precision_at_100_std
value: 33.6897935720505
- type: nauc_precision_at_10_diff1
value: 5.965001997926562
- type: nauc_precision_at_10_max
value: 46.12515296162247
- type: nauc_precision_at_10_std
value: 25.409433135253558
- type: nauc_precision_at_1_diff1
value: 27.297525357600776
- type: nauc_precision_at_1_max
value: 47.18835074959486
- type: nauc_precision_at_1_std
value: 18.304203168281834
- type: nauc_precision_at_20_diff1
value: 3.4438127279827744
- type: nauc_precision_at_20_max
value: 42.36095587714494
- type: nauc_precision_at_20_std
value: 27.367900512797906
- type: nauc_precision_at_3_diff1
value: 13.165017224718916
- type: nauc_precision_at_3_max
value: 50.58931825484506
- type: nauc_precision_at_3_std
value: 20.852009214609442
- type: nauc_precision_at_5_diff1
value: 7.840087177549876
- type: nauc_precision_at_5_max
value: 46.99388755575109
- type: nauc_precision_at_5_std
value: 23.048702393099834
- type: nauc_recall_at_1000_diff1
value: -15.591898698252932
- type: nauc_recall_at_1000_max
value: 20.5459842853537
- type: nauc_recall_at_1000_std
value: 38.901341499282395
- type: nauc_recall_at_100_diff1
value: -5.290395978742165
- type: nauc_recall_at_100_max
value: 31.340480360546863
- type: nauc_recall_at_100_std
value: 33.68979357205046
- type: nauc_recall_at_10_diff1
value: 5.96500199792656
- type: nauc_recall_at_10_max
value: 46.1251529616225
- type: nauc_recall_at_10_std
value: 25.409433135253543
- type: nauc_recall_at_1_diff1
value: 27.297525357600776
- type: nauc_recall_at_1_max
value: 47.18835074959486
- type: nauc_recall_at_1_std
value: 18.304203168281834
- type: nauc_recall_at_20_diff1
value: 3.4438127279827833
- type: nauc_recall_at_20_max
value: 42.36095587714498
- type: nauc_recall_at_20_std
value: 27.36790051279787
- type: nauc_recall_at_3_diff1
value: 13.165017224718916
- type: nauc_recall_at_3_max
value: 50.589318254845054
- type: nauc_recall_at_3_std
value: 20.852009214609435
- type: nauc_recall_at_5_diff1
value: 7.840087177549891
- type: nauc_recall_at_5_max
value: 46.99388755575112
- type: nauc_recall_at_5_std
value: 23.048702393099845
- type: ndcg_at_1
value: 14.701
- type: ndcg_at_10
value: 26.909
- type: ndcg_at_100
value: 32.727000000000004
- type: ndcg_at_1000
value: 36.086
- type: ndcg_at_20
value: 29.236
- type: ndcg_at_3
value: 22.004
- type: ndcg_at_5
value: 24.615000000000002
- type: precision_at_1
value: 14.701
- type: precision_at_10
value: 4.062
- type: precision_at_100
value: 0.688
- type: precision_at_1000
value: 0.096
- type: precision_at_20
value: 2.488
- type: precision_at_3
value: 9.036
- type: precision_at_5
value: 6.699
- type: recall_at_1
value: 14.701
- type: recall_at_10
value: 40.622
- type: recall_at_100
value: 68.796
- type: recall_at_1000
value: 96.314
- type: recall_at_20
value: 49.754
- type: recall_at_3
value: 27.108999999999998
- type: recall_at_5
value: 33.497
- task:
type: Classification
dataset:
name: MTEB MultilingualSentiment (default)
type: C-MTEB/MultilingualSentiment-classification
config: default
split: test
revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
metrics:
- type: accuracy
value: 73.20999999999998
- type: f1
value: 73.18755986777474
- type: f1_weighted
value: 73.18755986777475
- type: main_score
value: 73.20999999999998
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus (default)
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 4.822
- type: map_at_10
value: 13.144
- type: map_at_100
value: 17.254
- type: map_at_1000
value: 18.931
- type: map_at_20
value: 14.834
- type: map_at_3
value: 8.975
- type: map_at_5
value: 10.922
- type: mrr_at_1
value: 47.059
- type: mrr_at_10
value: 55.806999999999995
- type: mrr_at_100
value: 56.286
- type: mrr_at_1000
value: 56.327000000000005
- type: mrr_at_20
value: 56.00000000000001
- type: mrr_at_3
value: 54.17999999999999
- type: mrr_at_5
value: 55.155
- type: ndcg_at_1
value: 44.427
- type: ndcg_at_10
value: 36.623
- type: ndcg_at_100
value: 33.664
- type: ndcg_at_1000
value: 42.538
- type: ndcg_at_20
value: 34.066
- type: ndcg_at_3
value: 41.118
- type: ndcg_at_5
value: 39.455
- type: precision_at_1
value: 46.44
- type: precision_at_10
value: 28.607
- type: precision_at_100
value: 9.189
- type: precision_at_1000
value: 2.261
- type: precision_at_20
value: 21.238
- type: precision_at_3
value: 39.628
- type: precision_at_5
value: 35.604
- type: recall_at_1
value: 4.822
- type: recall_at_10
value: 17.488999999999997
- type: recall_at_100
value: 35.052
- type: recall_at_1000
value: 66.67999999999999
- type: recall_at_20
value: 21.343999999999998
- type: recall_at_3
value: 10.259
- type: recall_at_5
value: 13.406
- type: main_score
value: 36.623
- task:
type: Retrieval
dataset:
name: MTEB NQ (default)
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 41.411
- type: map_at_10
value: 57.179
- type: map_at_100
value: 57.945
- type: map_at_1000
value: 57.967999999999996
- type: map_at_20
value: 57.687
- type: map_at_3
value: 53.46300000000001
- type: map_at_5
value: 55.696999999999996
- type: mrr_at_1
value: 46.233999999999995
- type: mrr_at_10
value: 59.831999999999994
- type: mrr_at_100
value: 60.33500000000001
- type: mrr_at_1000
value: 60.348
- type: mrr_at_20
value: 60.167
- type: mrr_at_3
value: 56.972
- type: mrr_at_5
value: 58.74
- type: ndcg_at_1
value: 46.205
- type: ndcg_at_10
value: 64.23100000000001
- type: ndcg_at_100
value: 67.242
- type: ndcg_at_1000
value: 67.72500000000001
- type: ndcg_at_20
value: 65.77300000000001
- type: ndcg_at_3
value: 57.516
- type: ndcg_at_5
value: 61.11600000000001
- type: precision_at_1
value: 46.205
- type: precision_at_10
value: 9.873
- type: precision_at_100
value: 1.158
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 5.319
- type: precision_at_3
value: 25.424999999999997
- type: precision_at_5
value: 17.375
- type: recall_at_1
value: 41.411
- type: recall_at_10
value: 82.761
- type: recall_at_100
value: 95.52199999999999
- type: recall_at_1000
value: 99.02499999999999
- type: recall_at_20
value: 88.34
- type: recall_at_3
value: 65.73
- type: recall_at_5
value: 73.894
- type: main_score
value: 64.23100000000001
- task:
type: PairClassification
dataset:
name: MTEB Ocnli (default)
type: C-MTEB/OCNLI
config: default
split: validation
revision: 66e76a618a34d6d565d5538088562851e6daa7ec
metrics:
- type: cosine_accuracy
value: 62.3714131023281
- type: cosine_accuracy_threshold
value: 79.70921993255615
- type: cosine_ap
value: 66.41380155495659
- type: cosine_f1
value: 68.89547185780786
- type: cosine_f1_threshold
value: 72.91591167449951
- type: cosine_precision
value: 57.485875706214685
- type: cosine_recall
value: 85.95564941921859
- type: dot_accuracy
value: 60.47644829453167
- type: dot_accuracy_threshold
value: 36627.362060546875
- type: dot_ap
value: 63.696303449293204
- type: dot_f1
value: 68.3986041101202
- type: dot_f1_threshold
value: 30452.72216796875
- type: dot_precision
value: 54.04411764705882
- type: dot_recall
value: 93.13621964097149
- type: euclidean_accuracy
value: 63.02111532214402
- type: euclidean_accuracy_threshold
value: 1392.76762008667
- type: euclidean_ap
value: 66.65907089443218
- type: euclidean_f1
value: 69.05036524413688
- type: euclidean_f1_threshold
value: 1711.5310668945312
- type: euclidean_precision
value: 54.29262394195889
- type: euclidean_recall
value: 94.82576557550159
- type: main_score
value: 63.02111532214402
- type: manhattan_accuracy
value: 62.75040606388739
- type: manhattan_accuracy_threshold
value: 32475.347900390625
- type: manhattan_ap
value: 66.50943585125434
- type: manhattan_f1
value: 69.08382066276802
- type: manhattan_f1_threshold
value: 41238.470458984375
- type: manhattan_precision
value: 54.75896168108776
- type: manhattan_recall
value: 93.55860612460401
- type: max_accuracy
value: 63.02111532214402
- type: max_ap
value: 66.65907089443218
- type: max_f1
value: 69.08382066276802
- type: max_precision
value: 57.485875706214685
- type: max_recall
value: 94.82576557550159
- type: similarity_accuracy
value: 62.3714131023281
- type: similarity_accuracy_threshold
value: 79.70921993255615
- type: similarity_ap
value: 66.41380155495659
- type: similarity_f1
value: 68.89547185780786
- type: similarity_f1_threshold
value: 72.91591167449951
- type: similarity_precision
value: 57.485875706214685
- type: similarity_recall
value: 85.95564941921859
- task:
type: Classification
dataset:
name: MTEB OnlineShopping (default)
type: C-MTEB/OnlineShopping-classification
config: default
split: test
revision: e610f2ebd179a8fda30ae534c3878750a96db120
metrics:
- type: accuracy
value: 91.88000000000001
- type: ap
value: 89.52463684448476
- type: ap_weighted
value: 89.52463684448476
- type: f1
value: 91.86313022306673
- type: f1_weighted
value: 91.87806318146912
- type: main_score
value: 91.88000000000001
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (en)
type: GEM/opusparcus
config: en
split: test.full
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cosine_accuracy
value: 92.65578635014838
- type: cosine_accuracy_threshold
value: 74.02530312538147
- type: cosine_ap
value: 98.3834226153613
- type: cosine_f1
value: 94.92567913890312
- type: cosine_f1_threshold
value: 74.02530312538147
- type: cosine_precision
value: 95.562435500516
- type: cosine_recall
value: 94.29735234215886
- type: dot_accuracy
value: 91.54302670623146
- type: dot_accuracy_threshold
value: 34452.29187011719
- type: dot_ap
value: 98.1237257754439
- type: dot_f1
value: 94.22400803616273
- type: dot_f1_threshold
value: 33670.41931152344
- type: dot_precision
value: 92.9633300297324
- type: dot_recall
value: 95.5193482688391
- type: euclidean_accuracy
value: 92.28486646884274
- type: euclidean_accuracy_threshold
value: 1602.8022766113281
- type: euclidean_ap
value: 98.3099021504706
- type: euclidean_f1
value: 94.75277497477296
- type: euclidean_f1_threshold
value: 1604.7462463378906
- type: euclidean_precision
value: 93.89999999999999
- type: euclidean_recall
value: 95.62118126272912
- type: main_score
value: 98.3834226153613
- type: manhattan_accuracy
value: 92.2106824925816
- type: manhattan_accuracy_threshold
value: 38872.90954589844
- type: manhattan_ap
value: 98.28694101230218
- type: manhattan_f1
value: 94.67815509376584
- type: manhattan_f1_threshold
value: 38872.90954589844
- type: manhattan_precision
value: 94.24823410696267
- type: manhattan_recall
value: 95.11201629327903
- type: max_accuracy
value: 92.65578635014838
- type: max_ap
value: 98.3834226153613
- type: max_f1
value: 94.92567913890312
- type: max_precision
value: 95.562435500516
- type: max_recall
value: 95.62118126272912
- type: similarity_accuracy
value: 92.65578635014838
- type: similarity_accuracy_threshold
value: 74.02530312538147
- type: similarity_ap
value: 98.3834226153613
- type: similarity_f1
value: 94.92567913890312
- type: similarity_f1_threshold
value: 74.02530312538147
- type: similarity_precision
value: 95.562435500516
- type: similarity_recall
value: 94.29735234215886
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (de)
type: GEM/opusparcus
config: de
split: test.full
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cosine_accuracy
value: 87.72178850248403
- type: cosine_accuracy_threshold
value: 73.33863377571106
- type: cosine_ap
value: 96.98901408834976
- type: cosine_f1
value: 91.89944134078212
- type: cosine_f1_threshold
value: 71.45810127258301
- type: cosine_precision
value: 89.64577656675749
- type: cosine_recall
value: 94.26934097421203
- type: dot_accuracy
value: 86.30234208658624
- type: dot_accuracy_threshold
value: 32027.130126953125
- type: dot_ap
value: 96.12260574893256
- type: dot_f1
value: 91.31602506714414
- type: dot_f1_threshold
value: 30804.376220703125
- type: dot_precision
value: 85.93091828138164
- type: dot_recall
value: 97.42120343839542
- type: euclidean_accuracy
value: 87.9347054648687
- type: euclidean_accuracy_threshold
value: 1609.6670150756836
- type: euclidean_ap
value: 97.00238860358252
- type: euclidean_f1
value: 92.1089063221043
- type: euclidean_f1_threshold
value: 1641.8487548828125
- type: euclidean_precision
value: 89.10714285714286
- type: euclidean_recall
value: 95.31996179560649
- type: main_score
value: 97.00238860358252
- type: manhattan_accuracy
value: 87.72178850248403
- type: manhattan_accuracy_threshold
value: 40137.060546875
- type: manhattan_ap
value: 96.98653728159941
- type: manhattan_f1
value: 92.03865623561896
- type: manhattan_f1_threshold
value: 40137.060546875
- type: manhattan_precision
value: 88.80994671403198
- type: manhattan_recall
value: 95.51098376313276
- type: max_accuracy
value: 87.9347054648687
- type: max_ap
value: 97.00238860358252
- type: max_f1
value: 92.1089063221043
- type: max_precision
value: 89.64577656675749
- type: max_recall
value: 97.42120343839542
- type: similarity_accuracy
value: 87.72178850248403
- type: similarity_accuracy_threshold
value: 73.33863377571106
- type: similarity_ap
value: 96.98901408834976
- type: similarity_f1
value: 91.89944134078212
- type: similarity_f1_threshold
value: 71.45810127258301
- type: similarity_precision
value: 89.64577656675749
- type: similarity_recall
value: 94.26934097421203
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (fr)
type: GEM/opusparcus
config: fr
split: test.full
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cosine_accuracy
value: 80.92643051771117
- type: cosine_accuracy_threshold
value: 76.68856382369995
- type: cosine_ap
value: 93.74622381534307
- type: cosine_f1
value: 87.12328767123287
- type: cosine_f1_threshold
value: 71.64022922515869
- type: cosine_precision
value: 80.64243448858834
- type: cosine_recall
value: 94.73684210526315
- type: dot_accuracy
value: 80.858310626703
- type: dot_accuracy_threshold
value: 34028.3935546875
- type: dot_ap
value: 91.18448457633308
- type: dot_f1
value: 86.82606657290202
- type: dot_f1_threshold
value: 34028.3935546875
- type: dot_precision
value: 82.2380106571936
- type: dot_recall
value: 91.9563058589871
- type: euclidean_accuracy
value: 80.858310626703
- type: euclidean_accuracy_threshold
value: 1595.7651138305664
- type: euclidean_ap
value: 93.8182717829648
- type: euclidean_f1
value: 87.04044117647058
- type: euclidean_f1_threshold
value: 1609.2475891113281
- type: euclidean_precision
value: 81.00940975192472
- type: euclidean_recall
value: 94.04170804369414
- type: main_score
value: 93.8182717829648
- type: manhattan_accuracy
value: 80.99455040871935
- type: manhattan_accuracy_threshold
value: 38092.132568359375
- type: manhattan_ap
value: 93.77563401151711
- type: manhattan_f1
value: 86.91983122362869
- type: manhattan_f1_threshold
value: 38092.132568359375
- type: manhattan_precision
value: 82.32682060390763
- type: manhattan_recall
value: 92.05561072492551
- type: max_accuracy
value: 80.99455040871935
- type: max_ap
value: 93.8182717829648
- type: max_f1
value: 87.12328767123287
- type: max_precision
value: 82.32682060390763
- type: max_recall
value: 94.73684210526315
- type: similarity_accuracy
value: 80.92643051771117
- type: similarity_accuracy_threshold
value: 76.68856382369995
- type: similarity_ap
value: 93.74622381534307
- type: similarity_f1
value: 87.12328767123287
- type: similarity_f1_threshold
value: 71.64022922515869
- type: similarity_precision
value: 80.64243448858834
- type: similarity_recall
value: 94.73684210526315
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (ru)
type: GEM/opusparcus
config: ru
split: test.full
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cosine_accuracy
value: 76.83823529411765
- type: cosine_accuracy_threshold
value: 72.70769476890564
- type: cosine_ap
value: 89.56692049908222
- type: cosine_f1
value: 83.99832003359934
- type: cosine_f1_threshold
value: 70.9052324295044
- type: cosine_precision
value: 76.16146230007617
- type: cosine_recall
value: 93.63295880149812
- type: dot_accuracy
value: 76.28676470588235
- type: dot_accuracy_threshold
value: 33740.68908691406
- type: dot_ap
value: 87.77185177141567
- type: dot_f1
value: 83.62251375370292
- type: dot_f1_threshold
value: 32726.611328125
- type: dot_precision
value: 76.29343629343629
- type: dot_recall
value: 92.50936329588015
- type: euclidean_accuracy
value: 77.32843137254902
- type: euclidean_accuracy_threshold
value: 1566.510009765625
- type: euclidean_ap
value: 89.60605626791111
- type: euclidean_f1
value: 84.06546080964686
- type: euclidean_f1_threshold
value: 1576.4202117919922
- type: euclidean_precision
value: 77.83094098883574
- type: euclidean_recall
value: 91.38576779026218
- type: main_score
value: 89.60605626791111
- type: manhattan_accuracy
value: 76.89950980392157
- type: manhattan_accuracy_threshold
value: 38202.215576171875
- type: manhattan_ap
value: 89.55766894104868
- type: manhattan_f1
value: 83.80462724935732
- type: manhattan_f1_threshold
value: 38934.375
- type: manhattan_precision
value: 77.25118483412322
- type: manhattan_recall
value: 91.57303370786516
- type: max_accuracy
value: 77.32843137254902
- type: max_ap
value: 89.60605626791111
- type: max_f1
value: 84.06546080964686
- type: max_precision
value: 77.83094098883574
- type: max_recall
value: 93.63295880149812
- type: similarity_accuracy
value: 76.83823529411765
- type: similarity_accuracy_threshold
value: 72.70769476890564
- type: similarity_ap
value: 89.56692049908222
- type: similarity_f1
value: 83.99832003359934
- type: similarity_f1_threshold
value: 70.9052324295044
- type: similarity_precision
value: 76.16146230007617
- type: similarity_recall
value: 93.63295880149812
- task:
type: Classification
dataset:
name: MTEB PAC (default)
type: laugustyniak/abusive-clauses-pl
config: default
split: test
revision: fc69d1c153a8ccdcf1eef52f4e2a27f88782f543
metrics:
- type: accuracy
value: 68.39559803069794
- type: ap
value: 77.68074206719457
- type: ap_weighted
value: 77.68074206719457
- type: f1
value: 66.23485605467732
- type: f1_weighted
value: 69.03201442129347
- type: main_score
value: 68.39559803069794
- task:
type: STS
dataset:
name: MTEB PAWSX (default)
type: C-MTEB/PAWSX
config: default
split: test
revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
metrics:
- type: cosine_pearson
value: 13.161523266433587
- type: cosine_spearman
value: 15.557333873773386
- type: euclidean_pearson
value: 17.147508431907525
- type: euclidean_spearman
value: 15.664112857732146
- type: main_score
value: 15.557333873773386
- type: manhattan_pearson
value: 17.130875906264386
- type: manhattan_spearman
value: 15.624397342229637
- type: pearson
value: 13.161523266433587
- type: spearman
value: 15.557333873773386
- task:
type: PairClassification
dataset:
name: MTEB PSC (default)
type: PL-MTEB/psc-pairclassification
config: default
split: test
revision: d05a294af9e1d3ff2bfb6b714e08a24a6cabc669
metrics:
- type: cosine_accuracy
value: 97.86641929499072
- type: cosine_accuracy_threshold
value: 79.0391206741333
- type: cosine_ap
value: 99.19403807771533
- type: cosine_f1
value: 96.45608628659475
- type: cosine_f1_threshold
value: 79.0391206741333
- type: cosine_precision
value: 97.50778816199377
- type: cosine_recall
value: 95.42682926829268
- type: dot_accuracy
value: 98.14471243042672
- type: dot_accuracy_threshold
value: 29808.1787109375
- type: dot_ap
value: 99.331999859971
- type: dot_f1
value: 97.01492537313433
- type: dot_f1_threshold
value: 29808.1787109375
- type: dot_precision
value: 95.02923976608187
- type: dot_recall
value: 99.08536585365853
- type: euclidean_accuracy
value: 97.49536178107606
- type: euclidean_accuracy_threshold
value: 1276.227855682373
- type: euclidean_ap
value: 98.91056467717377
- type: euclidean_f1
value: 95.83975346687212
- type: euclidean_f1_threshold
value: 1276.227855682373
- type: euclidean_precision
value: 96.88473520249221
- type: euclidean_recall
value: 94.8170731707317
- type: main_score
value: 99.331999859971
- type: manhattan_accuracy
value: 97.49536178107606
- type: manhattan_accuracy_threshold
value: 31097.674560546875
- type: manhattan_ap
value: 98.95694691792707
- type: manhattan_f1
value: 95.83975346687212
- type: manhattan_f1_threshold
value: 31097.674560546875
- type: manhattan_precision
value: 96.88473520249221
- type: manhattan_recall
value: 94.8170731707317
- type: max_accuracy
value: 98.14471243042672
- type: max_ap
value: 99.331999859971
- type: max_f1
value: 97.01492537313433
- type: max_precision
value: 97.50778816199377
- type: max_recall
value: 99.08536585365853
- type: similarity_accuracy
value: 97.86641929499072
- type: similarity_accuracy_threshold
value: 79.0391206741333
- type: similarity_ap
value: 99.19403807771533
- type: similarity_f1
value: 96.45608628659475
- type: similarity_f1_threshold
value: 79.0391206741333
- type: similarity_precision
value: 97.50778816199377
- type: similarity_recall
value: 95.42682926829268
- task:
type: PairClassification
dataset:
name: MTEB PawsXPairClassification (en)
type: google-research-datasets/paws-x
config: en
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cosine_accuracy
value: 61.8
- type: cosine_accuracy_threshold
value: 99.5664119720459
- type: cosine_ap
value: 60.679317786040585
- type: cosine_f1
value: 63.17354143441101
- type: cosine_f1_threshold
value: 97.22164869308472
- type: cosine_precision
value: 47.6457399103139
- type: cosine_recall
value: 93.71554575523705
- type: dot_accuracy
value: 55.7
- type: dot_accuracy_threshold
value: 48353.62548828125
- type: dot_ap
value: 48.53805970536875
- type: dot_f1
value: 62.42214532871972
- type: dot_f1_threshold
value: 38215.53955078125
- type: dot_precision
value: 45.48663640948058
- type: dot_recall
value: 99.44873208379272
- type: euclidean_accuracy
value: 61.75000000000001
- type: euclidean_accuracy_threshold
value: 189.0761137008667
- type: euclidean_ap
value: 60.55517418691518
- type: euclidean_f1
value: 63.07977736549165
- type: euclidean_f1_threshold
value: 504.3168067932129
- type: euclidean_precision
value: 47.53914988814318
- type: euclidean_recall
value: 93.71554575523705
- type: main_score
value: 60.679317786040585
- type: manhattan_accuracy
value: 61.9
- type: manhattan_accuracy_threshold
value: 4695.778274536133
- type: manhattan_ap
value: 60.48686620413608
- type: manhattan_f1
value: 62.92880855772778
- type: manhattan_f1_threshold
value: 12542.36831665039
- type: manhattan_precision
value: 47.28381374722838
- type: manhattan_recall
value: 94.04630650496141
- type: max_accuracy
value: 61.9
- type: max_ap
value: 60.679317786040585
- type: max_f1
value: 63.17354143441101
- type: max_precision
value: 47.6457399103139
- type: max_recall
value: 99.44873208379272
- type: similarity_accuracy
value: 61.8
- type: similarity_accuracy_threshold
value: 99.5664119720459
- type: similarity_ap
value: 60.679317786040585
- type: similarity_f1
value: 63.17354143441101
- type: similarity_f1_threshold
value: 97.22164869308472
- type: similarity_precision
value: 47.6457399103139
- type: similarity_recall
value: 93.71554575523705
- task:
type: PairClassification
dataset:
name: MTEB PawsXPairClassification (de)
type: google-research-datasets/paws-x
config: de
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cosine_accuracy
value: 60.25
- type: cosine_accuracy_threshold
value: 99.54338073730469
- type: cosine_ap
value: 56.7863613689054
- type: cosine_f1
value: 62.23499820337766
- type: cosine_f1_threshold
value: 89.95014429092407
- type: cosine_precision
value: 45.86864406779661
- type: cosine_recall
value: 96.75977653631284
- type: dot_accuracy
value: 56.8
- type: dot_accuracy_threshold
value: 47349.78332519531
- type: dot_ap
value: 49.7857806061729
- type: dot_f1
value: 62.31225986727209
- type: dot_f1_threshold
value: 30143.206787109375
- type: dot_precision
value: 45.32520325203252
- type: dot_recall
value: 99.66480446927373
- type: euclidean_accuracy
value: 60.3
- type: euclidean_accuracy_threshold
value: 219.78106498718262
- type: euclidean_ap
value: 56.731544327179606
- type: euclidean_f1
value: 62.19895287958115
- type: euclidean_f1_threshold
value: 1792.1623229980469
- type: euclidean_precision
value: 45.22842639593909
- type: euclidean_recall
value: 99.55307262569832
- type: main_score
value: 56.7863613689054
- type: manhattan_accuracy
value: 60.150000000000006
- type: manhattan_accuracy_threshold
value: 5104.503631591797
- type: manhattan_ap
value: 56.70304479768734
- type: manhattan_f1
value: 62.22067039106145
- type: manhattan_f1_threshold
value: 42839.471435546875
- type: manhattan_precision
value: 45.2513966480447
- type: manhattan_recall
value: 99.55307262569832
- type: max_accuracy
value: 60.3
- type: max_ap
value: 56.7863613689054
- type: max_f1
value: 62.31225986727209
- type: max_precision
value: 45.86864406779661
- type: max_recall
value: 99.66480446927373
- type: similarity_accuracy
value: 60.25
- type: similarity_accuracy_threshold
value: 99.54338073730469
- type: similarity_ap
value: 56.7863613689054
- type: similarity_f1
value: 62.23499820337766
- type: similarity_f1_threshold
value: 89.95014429092407
- type: similarity_precision
value: 45.86864406779661
- type: similarity_recall
value: 96.75977653631284
- task:
type: PairClassification
dataset:
name: MTEB PawsXPairClassification (es)
type: google-research-datasets/paws-x
config: es
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cosine_accuracy
value: 59.699999999999996
- type: cosine_accuracy_threshold
value: 99.55930709838867
- type: cosine_ap
value: 57.31662248806265
- type: cosine_f1
value: 62.444061962134256
- type: cosine_f1_threshold
value: 74.75898265838623
- type: cosine_precision
value: 45.3953953953954
- type: cosine_recall
value: 100.0
- type: dot_accuracy
value: 55.900000000000006
- type: dot_accuracy_threshold
value: 47512.90283203125
- type: dot_ap
value: 49.39339147787568
- type: dot_f1
value: 62.487082328625554
- type: dot_f1_threshold
value: 34989.03503417969
- type: dot_precision
value: 45.44088176352705
- type: dot_recall
value: 100.0
- type: euclidean_accuracy
value: 59.599999999999994
- type: euclidean_accuracy_threshold
value: 200.82547664642334
- type: euclidean_ap
value: 57.19737488445163
- type: euclidean_f1
value: 62.444061962134256
- type: euclidean_f1_threshold
value: 1538.8837814331055
- type: euclidean_precision
value: 45.3953953953954
- type: euclidean_recall
value: 100.0
- type: main_score
value: 57.31662248806265
- type: manhattan_accuracy
value: 59.550000000000004
- type: manhattan_accuracy_threshold
value: 5016.501617431641
- type: manhattan_ap
value: 57.089959907945065
- type: manhattan_f1
value: 62.444061962134256
- type: manhattan_f1_threshold
value: 37523.53515625
- type: manhattan_precision
value: 45.3953953953954
- type: manhattan_recall
value: 100.0
- type: max_accuracy
value: 59.699999999999996
- type: max_ap
value: 57.31662248806265
- type: max_f1
value: 62.487082328625554
- type: max_precision
value: 45.44088176352705
- type: max_recall
value: 100.0
- type: similarity_accuracy
value: 59.699999999999996
- type: similarity_accuracy_threshold
value: 99.55930709838867
- type: similarity_ap
value: 57.31662248806265
- type: similarity_f1
value: 62.444061962134256
- type: similarity_f1_threshold
value: 74.75898265838623
- type: similarity_precision
value: 45.3953953953954
- type: similarity_recall
value: 100.0
- task:
type: PairClassification
dataset:
name: MTEB PawsXPairClassification (fr)
type: google-research-datasets/paws-x
config: fr
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cosine_accuracy
value: 61.150000000000006
- type: cosine_accuracy_threshold
value: 99.36153888702393
- type: cosine_ap
value: 59.43845317938599
- type: cosine_f1
value: 62.51298026998961
- type: cosine_f1_threshold
value: 76.77866220474243
- type: cosine_precision
value: 45.468277945619334
- type: cosine_recall
value: 100.0
- type: dot_accuracy
value: 55.75
- type: dot_accuracy_threshold
value: 48931.55212402344
- type: dot_ap
value: 50.15949290538757
- type: dot_f1
value: 62.53462603878117
- type: dot_f1_threshold
value: 34415.7958984375
- type: dot_precision
value: 45.4911838790932
- type: dot_recall
value: 100.0
- type: euclidean_accuracy
value: 61.050000000000004
- type: euclidean_accuracy_threshold
value: 240.8097267150879
- type: euclidean_ap
value: 59.367971294226216
- type: euclidean_f1
value: 62.51298026998961
- type: euclidean_f1_threshold
value: 1444.132423400879
- type: euclidean_precision
value: 45.468277945619334
- type: euclidean_recall
value: 100.0
- type: main_score
value: 59.43845317938599
- type: manhattan_accuracy
value: 60.95
- type: manhattan_accuracy_threshold
value: 5701.206207275391
- type: manhattan_ap
value: 59.30094096378774
- type: manhattan_f1
value: 62.53462603878117
- type: manhattan_f1_threshold
value: 33445.672607421875
- type: manhattan_precision
value: 45.4911838790932
- type: manhattan_recall
value: 100.0
- type: max_accuracy
value: 61.150000000000006
- type: max_ap
value: 59.43845317938599
- type: max_f1
value: 62.53462603878117
- type: max_precision
value: 45.4911838790932
- type: max_recall
value: 100.0
- type: similarity_accuracy
value: 61.150000000000006
- type: similarity_accuracy_threshold
value: 99.36153888702393
- type: similarity_ap
value: 59.43845317938599
- type: similarity_f1
value: 62.51298026998961
- type: similarity_f1_threshold
value: 76.77866220474243
- type: similarity_precision
value: 45.468277945619334
- type: similarity_recall
value: 100.0
- task:
type: PairClassification
dataset:
name: MTEB PawsXPairClassification (zh)
type: google-research-datasets/paws-x
config: zh
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cosine_accuracy
value: 58.85
- type: cosine_accuracy_threshold
value: 99.73838329315186
- type: cosine_ap
value: 54.66913160570546
- type: cosine_f1
value: 62.32136632973162
- type: cosine_f1_threshold
value: 76.4499306678772
- type: cosine_precision
value: 45.265822784810126
- type: cosine_recall
value: 100.0
- type: dot_accuracy
value: 56.25
- type: dot_accuracy_threshold
value: 47351.9287109375
- type: dot_ap
value: 48.5266232989438
- type: dot_f1
value: 62.277951933124356
- type: dot_f1_threshold
value: 31325.28076171875
- type: dot_precision
value: 45.220030349013655
- type: dot_recall
value: 100.0
- type: euclidean_accuracy
value: 58.9
- type: euclidean_accuracy_threshold
value: 144.24468278884888
- type: euclidean_ap
value: 54.66981490353506
- type: euclidean_f1
value: 62.32136632973162
- type: euclidean_f1_threshold
value: 1484.908676147461
- type: euclidean_precision
value: 45.265822784810126
- type: euclidean_recall
value: 100.0
- type: main_score
value: 54.66981490353506
- type: manhattan_accuracy
value: 58.9
- type: manhattan_accuracy_threshold
value: 3586.785125732422
- type: manhattan_ap
value: 54.668355260247736
- type: manhattan_f1
value: 62.32136632973162
- type: manhattan_f1_threshold
value: 36031.22863769531
- type: manhattan_precision
value: 45.265822784810126
- type: manhattan_recall
value: 100.0
- type: max_accuracy
value: 58.9
- type: max_ap
value: 54.66981490353506
- type: max_f1
value: 62.32136632973162
- type: max_precision
value: 45.265822784810126
- type: max_recall
value: 100.0
- type: similarity_accuracy
value: 58.85
- type: similarity_accuracy_threshold
value: 99.73838329315186
- type: similarity_ap
value: 54.66913160570546
- type: similarity_f1
value: 62.32136632973162
- type: similarity_f1_threshold
value: 76.4499306678772
- type: similarity_precision
value: 45.265822784810126
- type: similarity_recall
value: 100.0
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-IN (default)
type: PL-MTEB/polemo2_in
config: default
split: test
revision: d90724373c70959f17d2331ad51fb60c71176b03
metrics:
- type: accuracy
value: 83.75346260387812
- type: f1
value: 81.98304891214909
- type: f1_weighted
value: 84.29623200830078
- type: main_score
value: 83.75346260387812
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-OUT (default)
type: PL-MTEB/polemo2_out
config: default
split: test
revision: 6a21ab8716e255ab1867265f8b396105e8aa63d4
metrics:
- type: accuracy
value: 66.53846153846153
- type: f1
value: 52.71826064368638
- type: f1_weighted
value: 69.10010124630334
- type: main_score
value: 66.53846153846153
- task:
type: PairClassification
dataset:
name: MTEB PPC
type: PL-MTEB/ppc-pairclassification
config: default
split: test
revision: None
metrics:
- type: cosine_accuracy
value: 81.8
- type: cosine_accuracy_threshold
value: 90.47793745994568
- type: cosine_ap
value: 91.42490266080884
- type: cosine_f1
value: 85.4632587859425
- type: cosine_f1_threshold
value: 90.47793745994568
- type: cosine_precision
value: 82.56172839506173
- type: cosine_recall
value: 88.57615894039735
- type: dot_accuracy
value: 74.6
- type: dot_accuracy_threshold
value: 42102.23693847656
- type: dot_ap
value: 86.20060009096979
- type: dot_f1
value: 80.02842928216063
- type: dot_f1_threshold
value: 38970.16906738281
- type: dot_precision
value: 70.1120797011208
- type: dot_recall
value: 93.21192052980133
- type: euclidean_accuracy
value: 81.5
- type: euclidean_accuracy_threshold
value: 880.433464050293
- type: euclidean_ap
value: 91.33143477982087
- type: euclidean_f1
value: 85.44600938967135
- type: euclidean_f1_threshold
value: 964.0384674072266
- type: euclidean_precision
value: 81.00890207715133
- type: euclidean_recall
value: 90.39735099337747
- type: main_score
value: 91.42490266080884
- type: manhattan_accuracy
value: 81.3
- type: manhattan_accuracy_threshold
value: 22100.830078125
- type: manhattan_ap
value: 91.25996158651282
- type: manhattan_f1
value: 85.38102643856921
- type: manhattan_f1_threshold
value: 24043.515014648438
- type: manhattan_precision
value: 80.49853372434018
- type: manhattan_recall
value: 90.89403973509934
- type: max_accuracy
value: 81.8
- type: max_ap
value: 91.42490266080884
- type: max_f1
value: 85.4632587859425
- type: max_precision
value: 82.56172839506173
- type: max_recall
value: 93.21192052980133
- type: similarity_accuracy
value: 81.8
- type: similarity_accuracy_threshold
value: 90.47793745994568
- type: similarity_ap
value: 91.42490266080884
- type: similarity_f1
value: 85.4632587859425
- type: similarity_f1_threshold
value: 90.47793745994568
- type: similarity_precision
value: 82.56172839506173
- type: similarity_recall
value: 88.57615894039735
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval (default)
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.419
- type: map_at_10
value: 85.542
- type: map_at_100
value: 86.161
- type: map_at_1000
value: 86.175
- type: map_at_20
value: 85.949
- type: map_at_3
value: 82.623
- type: map_at_5
value: 84.5
- type: mrr_at_1
value: 82.27
- type: mrr_at_10
value: 88.21900000000001
- type: mrr_at_100
value: 88.313
- type: mrr_at_1000
value: 88.31400000000001
- type: mrr_at_20
value: 88.286
- type: mrr_at_3
value: 87.325
- type: mrr_at_5
value: 87.97500000000001
- type: ndcg_at_1
value: 82.3
- type: ndcg_at_10
value: 89.088
- type: ndcg_at_100
value: 90.217
- type: ndcg_at_1000
value: 90.29700000000001
- type: ndcg_at_20
value: 89.697
- type: ndcg_at_3
value: 86.435
- type: ndcg_at_5
value: 87.966
- type: precision_at_1
value: 82.3
- type: precision_at_10
value: 13.527000000000001
- type: precision_at_100
value: 1.537
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.165000000000001
- type: precision_at_3
value: 37.92
- type: precision_at_5
value: 24.914
- type: recall_at_1
value: 71.419
- type: recall_at_10
value: 95.831
- type: recall_at_100
value: 99.64
- type: recall_at_1000
value: 99.988
- type: recall_at_20
value: 97.76599999999999
- type: recall_at_3
value: 88.081
- type: recall_at_5
value: 92.50500000000001
- type: main_score
value: 89.088
- task:
type: STS
dataset:
name: MTEB RUParaPhraserSTS (default)
type: merionum/ru_paraphraser
config: default
split: test
revision: 43265056790b8f7c59e0139acb4be0a8dad2c8f4
metrics:
- type: cosine_pearson
value: 67.91177744712421
- type: cosine_spearman
value: 76.77113726753656
- type: euclidean_pearson
value: 73.81454206068638
- type: euclidean_spearman
value: 76.92529493599028
- type: main_score
value: 76.77113726753656
- type: manhattan_pearson
value: 73.81690454439168
- type: manhattan_spearman
value: 76.87333776705002
- type: pearson
value: 67.91177744712421
- type: spearman
value: 76.77113726753656
- task:
type: Clustering
dataset:
name: MTEB RedditClustering (default)
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: main_score
value: 55.39924225216962
- type: v_measure
value: 55.39924225216962
- type: v_measure_std
value: 4.723802279292467
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P (default)
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: main_score
value: 62.87465161304012
- type: v_measure
value: 62.87465161304012
- type: v_measure_std
value: 12.082670914488473
- task:
type: Retrieval
dataset:
name: MTEB RiaNewsRetrieval (default)
type: ai-forever/ria-news-retrieval
config: default
split: test
revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7
metrics:
- type: main_score
value: 79.209
- type: map_at_1
value: 67.33
- type: map_at_10
value: 75.633
- type: map_at_100
value: 75.897
- type: map_at_1000
value: 75.907
- type: map_at_20
value: 75.804
- type: map_at_3
value: 74.2
- type: map_at_5
value: 75.13300000000001
- type: mrr_at_1
value: 67.31
- type: mrr_at_10
value: 75.62709126984095
- type: mrr_at_100
value: 75.89105697041113
- type: mrr_at_1000
value: 75.90115653883124
- type: mrr_at_20
value: 75.79802332308172
- type: mrr_at_3
value: 74.19499999999961
- type: mrr_at_5
value: 75.12849999999939
- type: nauc_map_at_1000_diff1
value: 74.30304869630591
- type: nauc_map_at_1000_max
value: 36.477146725784046
- type: nauc_map_at_1000_std
value: -20.862772498461723
- type: nauc_map_at_100_diff1
value: 74.29833058090355
- type: nauc_map_at_100_max
value: 36.483678619667884
- type: nauc_map_at_100_std
value: -20.856274849980135
- type: nauc_map_at_10_diff1
value: 74.20729220697967
- type: nauc_map_at_10_max
value: 36.56543146170092
- type: nauc_map_at_10_std
value: -20.991081015484728
- type: nauc_map_at_1_diff1
value: 77.38899022125185
- type: nauc_map_at_1_max
value: 32.45918619669731
- type: nauc_map_at_1_std
value: -22.149586336167324
- type: nauc_map_at_20_diff1
value: 74.2447573558587
- type: nauc_map_at_20_max
value: 36.50383130240387
- type: nauc_map_at_20_std
value: -20.87013743041831
- type: nauc_map_at_3_diff1
value: 74.3054577294586
- type: nauc_map_at_3_max
value: 36.484530586652724
- type: nauc_map_at_3_std
value: -21.90543024607988
- type: nauc_map_at_5_diff1
value: 74.21062368961503
- type: nauc_map_at_5_max
value: 36.55670532498779
- type: nauc_map_at_5_std
value: -21.488786900676942
- type: nauc_mrr_at_1000_diff1
value: 74.31619177956684
- type: nauc_mrr_at_1000_max
value: 36.53498918453189
- type: nauc_mrr_at_1000_std
value: -20.75986704931237
- type: nauc_mrr_at_100_diff1
value: 74.31146790382356
- type: nauc_mrr_at_100_max
value: 36.54149252857106
- type: nauc_mrr_at_100_std
value: -20.75341959250079
- type: nauc_mrr_at_10_diff1
value: 74.22027806145095
- type: nauc_mrr_at_10_max
value: 36.622542969971725
- type: nauc_mrr_at_10_std
value: -20.889417384064117
- type: nauc_mrr_at_1_diff1
value: 77.4306709551449
- type: nauc_mrr_at_1_max
value: 32.57259463438259
- type: nauc_mrr_at_1_std
value: -21.964402859613937
- type: nauc_mrr_at_20_diff1
value: 74.25784396230718
- type: nauc_mrr_at_20_max
value: 36.561412224507336
- type: nauc_mrr_at_20_std
value: -20.767665000065723
- type: nauc_mrr_at_3_diff1
value: 74.31423253547214
- type: nauc_mrr_at_3_max
value: 36.537745749488906
- type: nauc_mrr_at_3_std
value: -21.81259529019546
- type: nauc_mrr_at_5_diff1
value: 74.22404613312771
- type: nauc_mrr_at_5_max
value: 36.60743768455219
- type: nauc_mrr_at_5_std
value: -21.39479216331971
- type: nauc_ndcg_at_1000_diff1
value: 73.48182819705742
- type: nauc_ndcg_at_1000_max
value: 37.86991608461793
- type: nauc_ndcg_at_1000_std
value: -19.021499322688904
- type: nauc_ndcg_at_100_diff1
value: 73.34941250585759
- type: nauc_ndcg_at_100_max
value: 38.11150275625829
- type: nauc_ndcg_at_100_std
value: -18.70624087206104
- type: nauc_ndcg_at_10_diff1
value: 72.82520265115987
- type: nauc_ndcg_at_10_max
value: 38.43323357650525
- type: nauc_ndcg_at_10_std
value: -19.410953792830878
- type: nauc_ndcg_at_1_diff1
value: 77.38899022125185
- type: nauc_ndcg_at_1_max
value: 32.45918619669731
- type: nauc_ndcg_at_1_std
value: -22.149586336167324
- type: nauc_ndcg_at_20_diff1
value: 72.93309285256507
- type: nauc_ndcg_at_20_max
value: 38.217372819067755
- type: nauc_ndcg_at_20_std
value: -18.864113576359333
- type: nauc_ndcg_at_3_diff1
value: 73.18253776744112
- type: nauc_ndcg_at_3_max
value: 38.008109328364
- type: nauc_ndcg_at_3_std
value: -21.68785687594153
- type: nauc_ndcg_at_5_diff1
value: 72.90474739784793
- type: nauc_ndcg_at_5_max
value: 38.29483039202184
- type: nauc_ndcg_at_5_std
value: -20.833049811453474
- type: nauc_precision_at_1000_diff1
value: 59.306217613750334
- type: nauc_precision_at_1000_max
value: 72.20747948302262
- type: nauc_precision_at_1000_std
value: 45.58837180096227
- type: nauc_precision_at_100_diff1
value: 62.87286844562389
- type: nauc_precision_at_100_max
value: 61.33108214045868
- type: nauc_precision_at_100_std
value: 20.67481963545654
- type: nauc_precision_at_10_diff1
value: 64.11222984256685
- type: nauc_precision_at_10_max
value: 50.323697746037496
- type: nauc_precision_at_10_std
value: -7.9994544634332625
- type: nauc_precision_at_1_diff1
value: 77.38899022125185
- type: nauc_precision_at_1_max
value: 32.45918619669731
- type: nauc_precision_at_1_std
value: -22.149586336167324
- type: nauc_precision_at_20_diff1
value: 62.30228127286973
- type: nauc_precision_at_20_max
value: 52.02090746208407
- type: nauc_precision_at_20_std
value: 0.7629898806370331
- type: nauc_precision_at_3_diff1
value: 68.82856645994157
- type: nauc_precision_at_3_max
value: 43.94171571306625
- type: nauc_precision_at_3_std
value: -20.78595255410148
- type: nauc_precision_at_5_diff1
value: 66.62157622497887
- type: nauc_precision_at_5_max
value: 46.69398173603811
- type: nauc_precision_at_5_std
value: -17.412423571163057
- type: nauc_recall_at_1000_diff1
value: 59.30621761375148
- type: nauc_recall_at_1000_max
value: 72.20747948302191
- type: nauc_recall_at_1000_std
value: 45.588371800962655
- type: nauc_recall_at_100_diff1
value: 62.872868445623894
- type: nauc_recall_at_100_max
value: 61.33108214045813
- type: nauc_recall_at_100_std
value: 20.67481963545666
- type: nauc_recall_at_10_diff1
value: 64.11222984256698
- type: nauc_recall_at_10_max
value: 50.32369774603755
- type: nauc_recall_at_10_std
value: -7.999454463433321
- type: nauc_recall_at_1_diff1
value: 77.38899022125185
- type: nauc_recall_at_1_max
value: 32.45918619669731
- type: nauc_recall_at_1_std
value: -22.149586336167324
- type: nauc_recall_at_20_diff1
value: 62.3022812728695
- type: nauc_recall_at_20_max
value: 52.02090746208397
- type: nauc_recall_at_20_std
value: 0.7629898806369458
- type: nauc_recall_at_3_diff1
value: 68.82856645994157
- type: nauc_recall_at_3_max
value: 43.94171571306612
- type: nauc_recall_at_3_std
value: -20.78595255410157
- type: nauc_recall_at_5_diff1
value: 66.62157622497897
- type: nauc_recall_at_5_max
value: 46.693981736038246
- type: nauc_recall_at_5_std
value: -17.412423571162954
- type: ndcg_at_1
value: 67.33
- type: ndcg_at_10
value: 79.209
- type: ndcg_at_100
value: 80.463
- type: ndcg_at_1000
value: 80.74799999999999
- type: ndcg_at_20
value: 79.81899999999999
- type: ndcg_at_3
value: 76.335
- type: ndcg_at_5
value: 78.011
- type: precision_at_1
value: 67.33
- type: precision_at_10
value: 9.020999999999999
- type: precision_at_100
value: 0.96
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.63
- type: precision_at_3
value: 27.493000000000002
- type: precision_at_5
value: 17.308
- type: recall_at_1
value: 67.33
- type: recall_at_10
value: 90.21000000000001
- type: recall_at_100
value: 96.00999999999999
- type: recall_at_1000
value: 98.29
- type: recall_at_20
value: 92.60000000000001
- type: recall_at_3
value: 82.48
- type: recall_at_5
value: 86.53999999999999
- task:
type: Reranking
dataset:
name: MTEB RuBQReranking (default)
type: ai-forever/rubq-reranking
config: default
split: test
revision: 2e96b8f098fa4b0950fc58eacadeb31c0d0c7fa2
metrics:
- type: main_score
value: 65.57453932493252
- type: map
value: 65.57453932493252
- type: mrr
value: 70.51408205663526
- type: nAUC_map_diff1
value: 26.69583260609023
- type: nAUC_map_max
value: 12.928262749610663
- type: nAUC_map_std
value: 11.702468857903128
- type: nAUC_mrr_diff1
value: 28.5206955462174
- type: nAUC_mrr_max
value: 14.207162454694227
- type: nAUC_mrr_std
value: 10.725721001555296
- task:
type: Retrieval
dataset:
name: MTEB RuBQRetrieval (default)
type: ai-forever/rubq-retrieval
config: default
split: test
revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b
metrics:
- type: main_score
value: 72.306
- type: map_at_1
value: 44.187
- type: map_at_10
value: 64.836
- type: map_at_100
value: 65.771
- type: map_at_1000
value: 65.8
- type: map_at_20
value: 65.497
- type: map_at_3
value: 59.692
- type: map_at_5
value: 63.105
- type: mrr_at_1
value: 62.23404255319149
- type: mrr_at_10
value: 73.40810161732159
- type: mrr_at_100
value: 73.67949305473395
- type: mrr_at_1000
value: 73.68707852294746
- type: mrr_at_20
value: 73.60429051697479
- type: mrr_at_3
value: 71.47360126083535
- type: mrr_at_5
value: 72.8447596532704
- type: nauc_map_at_1000_diff1
value: 39.838449035736886
- type: nauc_map_at_1000_max
value: 32.29962306877408
- type: nauc_map_at_1000_std
value: -6.324859592714388
- type: nauc_map_at_100_diff1
value: 39.824361938745426
- type: nauc_map_at_100_max
value: 32.32055222704763
- type: nauc_map_at_100_std
value: -6.301641111869559
- type: nauc_map_at_10_diff1
value: 39.50155328718487
- type: nauc_map_at_10_max
value: 31.745730244960672
- type: nauc_map_at_10_std
value: -6.867215137329693
- type: nauc_map_at_1_diff1
value: 47.66181128677822
- type: nauc_map_at_1_max
value: 21.75204233166764
- type: nauc_map_at_1_std
value: -8.06951079061697
- type: nauc_map_at_20_diff1
value: 39.78364637902108
- type: nauc_map_at_20_max
value: 32.39065528029405
- type: nauc_map_at_20_std
value: -6.368994332729006
- type: nauc_map_at_3_diff1
value: 39.51829474433183
- type: nauc_map_at_3_max
value: 28.633292697821673
- type: nauc_map_at_3_std
value: -7.2561170814963925
- type: nauc_map_at_5_diff1
value: 39.288433237676266
- type: nauc_map_at_5_max
value: 31.007702201615515
- type: nauc_map_at_5_std
value: -7.235131195162474
- type: nauc_mrr_at_1000_diff1
value: 49.599102391215226
- type: nauc_mrr_at_1000_max
value: 38.25521825911133
- type: nauc_mrr_at_1000_std
value: -10.448180939809435
- type: nauc_mrr_at_100_diff1
value: 49.5957067716212
- type: nauc_mrr_at_100_max
value: 38.26760703964535
- type: nauc_mrr_at_100_std
value: -10.438443051971081
- type: nauc_mrr_at_10_diff1
value: 49.35269710190271
- type: nauc_mrr_at_10_max
value: 38.43782589127069
- type: nauc_mrr_at_10_std
value: -10.404402063509815
- type: nauc_mrr_at_1_diff1
value: 53.32206103688421
- type: nauc_mrr_at_1_max
value: 33.52402390241035
- type: nauc_mrr_at_1_std
value: -12.73473393949936
- type: nauc_mrr_at_20_diff1
value: 49.550630850826636
- type: nauc_mrr_at_20_max
value: 38.35964703941151
- type: nauc_mrr_at_20_std
value: -10.444577766284766
- type: nauc_mrr_at_3_diff1
value: 49.12029127633829
- type: nauc_mrr_at_3_max
value: 38.01631275124067
- type: nauc_mrr_at_3_std
value: -10.523724301481309
- type: nauc_mrr_at_5_diff1
value: 49.04606949432458
- type: nauc_mrr_at_5_max
value: 38.33647550077891
- type: nauc_mrr_at_5_std
value: -10.47076409263114
- type: nauc_ndcg_at_1000_diff1
value: 41.342785916264226
- type: nauc_ndcg_at_1000_max
value: 35.75731064862711
- type: nauc_ndcg_at_1000_std
value: -5.45573422899229
- type: nauc_ndcg_at_100_diff1
value: 40.972974559636086
- type: nauc_ndcg_at_100_max
value: 36.32938573321036
- type: nauc_ndcg_at_100_std
value: -4.749631537590004
- type: nauc_ndcg_at_10_diff1
value: 39.67813474464166
- type: nauc_ndcg_at_10_max
value: 35.480200504848966
- type: nauc_ndcg_at_10_std
value: -6.318561293935512
- type: nauc_ndcg_at_1_diff1
value: 53.45970160222764
- type: nauc_ndcg_at_1_max
value: 33.14759013278075
- type: nauc_ndcg_at_1_std
value: -12.579833891774847
- type: nauc_ndcg_at_20_diff1
value: 40.67492861219249
- type: nauc_ndcg_at_20_max
value: 36.84960799838019
- type: nauc_ndcg_at_20_std
value: -5.202530835850179
- type: nauc_ndcg_at_3_diff1
value: 39.574906207408844
- type: nauc_ndcg_at_3_max
value: 31.76512164509258
- type: nauc_ndcg_at_3_std
value: -7.656143208565999
- type: nauc_ndcg_at_5_diff1
value: 39.096348529742095
- type: nauc_ndcg_at_5_max
value: 34.075926475544165
- type: nauc_ndcg_at_5_std
value: -7.238045445366631
- type: nauc_precision_at_1000_diff1
value: -14.283799754212609
- type: nauc_precision_at_1000_max
value: 6.449741756717101
- type: nauc_precision_at_1000_std
value: 4.862828679759048
- type: nauc_precision_at_100_diff1
value: -13.23173132700258
- type: nauc_precision_at_100_max
value: 11.058898534529195
- type: nauc_precision_at_100_std
value: 7.343683941814956
- type: nauc_precision_at_10_diff1
value: -7.202951643546464
- type: nauc_precision_at_10_max
value: 17.499446869433278
- type: nauc_precision_at_10_std
value: 2.8367985220406307
- type: nauc_precision_at_1_diff1
value: 53.45970160222764
- type: nauc_precision_at_1_max
value: 33.14759013278075
- type: nauc_precision_at_1_std
value: -12.579833891774847
- type: nauc_precision_at_20_diff1
value: -9.477122699154124
- type: nauc_precision_at_20_max
value: 16.80556031564312
- type: nauc_precision_at_20_std
value: 6.420218284416923
- type: nauc_precision_at_3_diff1
value: 5.5276143574150245
- type: nauc_precision_at_3_max
value: 23.65952688481666
- type: nauc_precision_at_3_std
value: -1.8730348729295785
- type: nauc_precision_at_5_diff1
value: -2.4537029093721308
- type: nauc_precision_at_5_max
value: 21.41469327545133
- type: nauc_precision_at_5_std
value: 0.1543890645722277
- type: nauc_recall_at_1000_diff1
value: -1.7474947956413491
- type: nauc_recall_at_1000_max
value: 46.22670991970479
- type: nauc_recall_at_1000_std
value: 62.582840705588794
- type: nauc_recall_at_100_diff1
value: 16.116089801097345
- type: nauc_recall_at_100_max
value: 52.54794580975103
- type: nauc_recall_at_100_std
value: 33.720245696003246
- type: nauc_recall_at_10_diff1
value: 23.134924318655482
- type: nauc_recall_at_10_max
value: 38.73754275649077
- type: nauc_recall_at_10_std
value: 0.6137471711639239
- type: nauc_recall_at_1_diff1
value: 47.66181128677822
- type: nauc_recall_at_1_max
value: 21.75204233166764
- type: nauc_recall_at_1_std
value: -8.06951079061697
- type: nauc_recall_at_20_diff1
value: 24.130616271355017
- type: nauc_recall_at_20_max
value: 48.306178640146136
- type: nauc_recall_at_20_std
value: 9.290819557000022
- type: nauc_recall_at_3_diff1
value: 29.767415016250226
- type: nauc_recall_at_3_max
value: 28.54289782140701
- type: nauc_recall_at_3_std
value: -5.1395675072005576
- type: nauc_recall_at_5_diff1
value: 25.410613126870174
- type: nauc_recall_at_5_max
value: 33.24658754857624
- type: nauc_recall_at_5_std
value: -4.211226036746632
- type: ndcg_at_1
value: 62.175000000000004
- type: ndcg_at_10
value: 72.306
- type: ndcg_at_100
value: 75.074
- type: ndcg_at_1000
value: 75.581
- type: ndcg_at_20
value: 73.875
- type: ndcg_at_3
value: 65.641
- type: ndcg_at_5
value: 69.48299999999999
- type: precision_at_1
value: 62.175000000000004
- type: precision_at_10
value: 13.907
- type: precision_at_100
value: 1.591
- type: precision_at_1000
value: 0.166
- type: precision_at_20
value: 7.446999999999999
- type: precision_at_3
value: 35.619
- type: precision_at_5
value: 24.917
- type: recall_at_1
value: 44.187
- type: recall_at_10
value: 85.10600000000001
- type: recall_at_100
value: 95.488
- type: recall_at_1000
value: 98.831
- type: recall_at_20
value: 90.22200000000001
- type: recall_at_3
value: 68.789
- type: recall_at_5
value: 77.85499999999999
- task:
type: Classification
dataset:
name: MTEB RuReviewsClassification (default)
type: ai-forever/ru-reviews-classification
config: default
split: test
revision: f6d2c31f4dc6b88f468552750bfec05b4b41b05a
metrics:
- type: accuracy
value: 67.5830078125
- type: f1
value: 67.56931936632446
- type: f1_weighted
value: 67.57137733752779
- type: main_score
value: 67.5830078125
- task:
type: STS
dataset:
name: MTEB RuSTSBenchmarkSTS (default)
type: ai-forever/ru-stsbenchmark-sts
config: default
split: test
revision: 7cf24f325c6da6195df55bef3d86b5e0616f3018
metrics:
- type: cosine_pearson
value: 85.90493484626788
- type: cosine_spearman
value: 86.21965691667411
- type: euclidean_pearson
value: 86.07499842984909
- type: euclidean_spearman
value: 86.55506818735688
- type: main_score
value: 86.21965691667411
- type: manhattan_pearson
value: 85.95976420231729
- type: manhattan_spearman
value: 86.48604243661234
- type: pearson
value: 85.90493484626788
- type: spearman
value: 86.21965691667411
- task:
type: Classification
dataset:
name: MTEB RuSciBenchGRNTIClassification (default)
type: ai-forever/ru-scibench-grnti-classification
config: default
split: test
revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1
metrics:
- type: accuracy
value: 59.1943359375
- type: f1
value: 58.894480861440414
- type: f1_weighted
value: 58.903615560240866
- type: main_score
value: 59.1943359375
- task:
type: Clustering
dataset:
name: MTEB RuSciBenchGRNTIClusteringP2P (default)
type: ai-forever/ru-scibench-grnti-classification
config: default
split: test
revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1
metrics:
- type: main_score
value: 57.99209448663228
- type: v_measure
value: 57.99209448663228
- type: v_measure_std
value: 1.0381163861993816
- task:
type: Classification
dataset:
name: MTEB RuSciBenchOECDClassification (default)
type: ai-forever/ru-scibench-oecd-classification
config: default
split: test
revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471
metrics:
- type: accuracy
value: 45.556640625
- type: f1
value: 45.159163104085906
- type: f1_weighted
value: 45.16098316398626
- type: main_score
value: 45.556640625
- task:
type: Clustering
dataset:
name: MTEB RuSciBenchOECDClusteringP2P (default)
type: ai-forever/ru-scibench-oecd-classification
config: default
split: test
revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471
metrics:
- type: main_score
value: 50.787548070488974
- type: v_measure
value: 50.787548070488974
- type: v_measure_std
value: 0.8569958168946827
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS (default)
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 4.843
- type: map_at_10
value: 11.752
- type: map_at_100
value: 13.919
- type: map_at_1000
value: 14.198
- type: map_at_20
value: 12.898000000000001
- type: map_at_3
value: 8.603
- type: map_at_5
value: 10.069
- type: mrr_at_1
value: 23.799999999999997
- type: mrr_at_10
value: 34.449999999999996
- type: mrr_at_100
value: 35.64
- type: mrr_at_1000
value: 35.691
- type: mrr_at_20
value: 35.213
- type: mrr_at_3
value: 31.383
- type: mrr_at_5
value: 33.062999999999995
- type: ndcg_at_1
value: 23.799999999999997
- type: ndcg_at_10
value: 19.811
- type: ndcg_at_100
value: 28.108
- type: ndcg_at_1000
value: 33.1
- type: ndcg_at_20
value: 22.980999999999998
- type: ndcg_at_3
value: 19.153000000000002
- type: ndcg_at_5
value: 16.408
- type: precision_at_1
value: 23.799999999999997
- type: precision_at_10
value: 10.16
- type: precision_at_100
value: 2.1999999999999997
- type: precision_at_1000
value: 0.34099999999999997
- type: precision_at_20
value: 6.915
- type: precision_at_3
value: 17.8
- type: precision_at_5
value: 14.14
- type: recall_at_1
value: 4.843
- type: recall_at_10
value: 20.595
- type: recall_at_100
value: 44.66
- type: recall_at_1000
value: 69.152
- type: recall_at_20
value: 28.04
- type: recall_at_3
value: 10.833
- type: recall_at_5
value: 14.346999999999998
- type: main_score
value: 19.811
- task:
type: PairClassification
dataset:
name: MTEB SICK-E-PL (default)
type: PL-MTEB/sicke-pl-pairclassification
config: default
split: test
revision: 71bba34b0ece6c56dfcf46d9758a27f7a90f17e9
metrics:
- type: cosine_accuracy
value: 80.90093762739502
- type: cosine_accuracy_threshold
value: 94.40930485725403
- type: cosine_ap
value: 71.15400909912427
- type: cosine_f1
value: 66.8213457076566
- type: cosine_f1_threshold
value: 91.53673648834229
- type: cosine_precision
value: 62.4922504649721
- type: cosine_recall
value: 71.7948717948718
- type: dot_accuracy
value: 78.41418671015083
- type: dot_accuracy_threshold
value: 42924.45068359375
- type: dot_ap
value: 63.34003025365763
- type: dot_f1
value: 62.518258837277244
- type: dot_f1_threshold
value: 40900.738525390625
- type: dot_precision
value: 52.99653293709758
- type: dot_recall
value: 76.21082621082621
- type: euclidean_accuracy
value: 80.67672238075826
- type: euclidean_accuracy_threshold
value: 696.0524559020996
- type: euclidean_ap
value: 70.88762835990224
- type: euclidean_f1
value: 66.711051930759
- type: euclidean_f1_threshold
value: 878.5581588745117
- type: euclidean_precision
value: 62.625
- type: euclidean_recall
value: 71.36752136752136
- type: main_score
value: 71.15400909912427
- type: manhattan_accuracy
value: 80.65633917651854
- type: manhattan_accuracy_threshold
value: 17277.72674560547
- type: manhattan_ap
value: 70.67105336611716
- type: manhattan_f1
value: 66.51346027577151
- type: manhattan_f1_threshold
value: 21687.957763671875
- type: manhattan_precision
value: 61.69305724725944
- type: manhattan_recall
value: 72.15099715099716
- type: max_accuracy
value: 80.90093762739502
- type: max_ap
value: 71.15400909912427
- type: max_f1
value: 66.8213457076566
- type: max_precision
value: 62.625
- type: max_recall
value: 76.21082621082621
- type: similarity_accuracy
value: 80.90093762739502
- type: similarity_accuracy_threshold
value: 94.40930485725403
- type: similarity_ap
value: 71.15400909912427
- type: similarity_f1
value: 66.8213457076566
- type: similarity_f1_threshold
value: 91.53673648834229
- type: similarity_precision
value: 62.4922504649721
- type: similarity_recall
value: 71.7948717948718
- task:
type: STS
dataset:
name: MTEB SICK-R (default)
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cosine_pearson
value: 92.3339946866199
- type: cosine_spearman
value: 89.61697355115497
- type: euclidean_pearson
value: 90.3264916449669
- type: euclidean_spearman
value: 89.36270451308866
- type: main_score
value: 89.61697355115497
- type: manhattan_pearson
value: 90.18909339052534
- type: manhattan_spearman
value: 89.28337093097377
- type: pearson
value: 92.3339946866199
- type: spearman
value: 89.61697355115497
- task:
type: STS
dataset:
name: MTEB SICK-R-PL (default)
type: PL-MTEB/sickr-pl-sts
config: default
split: test
revision: fd5c2441b7eeff8676768036142af4cfa42c1339
metrics:
- type: cosine_pearson
value: 85.27883048457821
- type: cosine_spearman
value: 80.53204892678619
- type: euclidean_pearson
value: 82.78520705216168
- type: euclidean_spearman
value: 80.27848359873212
- type: main_score
value: 80.53204892678619
- type: manhattan_pearson
value: 82.63270640583454
- type: manhattan_spearman
value: 80.21507977473146
- type: pearson
value: 85.27883048457821
- type: spearman
value: 80.53204892678619
- task:
type: STS
dataset:
name: MTEB SICKFr (default)
type: Lajavaness/SICK-fr
config: default
split: test
revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a
metrics:
- type: cosine_pearson
value: 88.77029361817212
- type: cosine_spearman
value: 83.9453600346894
- type: euclidean_pearson
value: 85.85331086208573
- type: euclidean_spearman
value: 83.70852031985308
- type: main_score
value: 83.9453600346894
- type: manhattan_pearson
value: 85.66222265885914
- type: manhattan_spearman
value: 83.60833111525962
- type: pearson
value: 88.77029361817212
- type: spearman
value: 83.9453600346894
- task:
type: STS
dataset:
name: MTEB STS12 (default)
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cosine_pearson
value: 88.76435859522375
- type: cosine_spearman
value: 82.43768167804375
- type: euclidean_pearson
value: 87.43566183874832
- type: euclidean_spearman
value: 82.82166873757507
- type: main_score
value: 82.43768167804375
- type: manhattan_pearson
value: 87.39450871380951
- type: manhattan_spearman
value: 82.89253043430163
- type: pearson
value: 88.76435859522375
- type: spearman
value: 82.43768167804375
- task:
type: STS
dataset:
name: MTEB STS13 (default)
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cosine_pearson
value: 88.86627241652141
- type: cosine_spearman
value: 89.49011599120688
- type: euclidean_pearson
value: 89.3314120073772
- type: euclidean_spearman
value: 89.8226502776963
- type: main_score
value: 89.49011599120688
- type: manhattan_pearson
value: 89.2252179076963
- type: manhattan_spearman
value: 89.74573844021225
- type: pearson
value: 88.86627241652141
- type: spearman
value: 89.49011599120688
- task:
type: STS
dataset:
name: MTEB STS14 (default)
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cosine_pearson
value: 87.22891405215968
- type: cosine_spearman
value: 84.9467188157614
- type: euclidean_pearson
value: 87.20330004726237
- type: euclidean_spearman
value: 85.34806059461808
- type: main_score
value: 84.9467188157614
- type: manhattan_pearson
value: 87.15224666107623
- type: manhattan_spearman
value: 85.34596898699708
- type: pearson
value: 87.22891405215968
- type: spearman
value: 84.9467188157614
- task:
type: STS
dataset:
name: MTEB STS15 (default)
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cosine_pearson
value: 88.14066430111033
- type: cosine_spearman
value: 89.31337445552545
- type: euclidean_pearson
value: 89.08039335366983
- type: euclidean_spearman
value: 89.6658762856415
- type: main_score
value: 89.31337445552545
- type: manhattan_pearson
value: 89.08057438154486
- type: manhattan_spearman
value: 89.68673984203022
- type: pearson
value: 88.14066430111033
- type: spearman
value: 89.31337445552545
- task:
type: STS
dataset:
name: MTEB STS16 (default)
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cosine_pearson
value: 85.14908856657084
- type: cosine_spearman
value: 86.84648320786727
- type: euclidean_pearson
value: 86.11454713131947
- type: euclidean_spearman
value: 86.77738862047961
- type: main_score
value: 86.84648320786727
- type: manhattan_pearson
value: 86.07804821916372
- type: manhattan_spearman
value: 86.78676064310474
- type: pearson
value: 85.14908856657084
- type: spearman
value: 86.84648320786727
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 89.61633502468356
- type: cosine_spearman
value: 89.99772663224805
- type: euclidean_pearson
value: 90.14056501501044
- type: euclidean_spearman
value: 90.04496896837503
- type: main_score
value: 89.99772663224805
- type: manhattan_pearson
value: 90.08964860311801
- type: manhattan_spearman
value: 90.00091712362196
- type: pearson
value: 89.61633502468356
- type: spearman
value: 89.99772663224805
- task:
type: STS
dataset:
name: MTEB STS17 (es-en)
type: mteb/sts17-crosslingual-sts
config: es-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 86.44548026840202
- type: cosine_spearman
value: 87.26263108768539
- type: euclidean_pearson
value: 86.42844593583838
- type: euclidean_spearman
value: 86.89388428664364
- type: main_score
value: 87.26263108768539
- type: manhattan_pearson
value: 86.47186940800881
- type: manhattan_spearman
value: 87.02163091089946
- type: pearson
value: 86.44548026840202
- type: spearman
value: 87.26263108768539
- task:
type: STS
dataset:
name: MTEB STS17 (en-de)
type: mteb/sts17-crosslingual-sts
config: en-de
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 87.89345132532758
- type: cosine_spearman
value: 87.96246221327699
- type: euclidean_pearson
value: 88.49013032701419
- type: euclidean_spearman
value: 87.81981265317344
- type: main_score
value: 87.96246221327699
- type: manhattan_pearson
value: 88.31360914178538
- type: manhattan_spearman
value: 87.62734530005075
- type: pearson
value: 87.89345132532758
- type: spearman
value: 87.96246221327699
- task:
type: STS
dataset:
name: MTEB STS17 (es-es)
type: mteb/sts17-crosslingual-sts
config: es-es
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 88.4084678497171
- type: cosine_spearman
value: 88.77640638748285
- type: euclidean_pearson
value: 89.60124312475843
- type: euclidean_spearman
value: 88.4321442688528
- type: main_score
value: 88.77640638748285
- type: manhattan_pearson
value: 89.62375118021299
- type: manhattan_spearman
value: 88.46998118661577
- type: pearson
value: 88.4084678497171
- type: spearman
value: 88.77640638748285
- task:
type: STS
dataset:
name: MTEB STS17 (fr-en)
type: mteb/sts17-crosslingual-sts
config: fr-en
split: test
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
metrics:
- type: cosine_pearson
value: 87.30688801326498
- type: cosine_spearman
value: 87.55684697258378
- type: euclidean_pearson
value: 87.89672951056794
- type: euclidean_spearman
value: 87.28050429201674
- type: main_score
value: 87.55684697258378
- type: manhattan_pearson
value: 87.74292745320572
- type: manhattan_spearman
value: 87.16383993876582
- type: pearson
value: 87.30688801326498
- type: spearman
value: 87.55684697258378
- task:
type: STS
dataset:
name: MTEB STS22 (zh-en)
type: mteb/sts22-crosslingual-sts
config: zh-en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 73.46180375170147
- type: cosine_spearman
value: 73.39559590127081
- type: euclidean_pearson
value: 73.72613901293681
- type: euclidean_spearman
value: 71.85465165176795
- type: main_score
value: 73.39559590127081
- type: manhattan_pearson
value: 73.07859140869076
- type: manhattan_spearman
value: 71.22047343718893
- type: pearson
value: 73.46180375170147
- type: spearman
value: 73.39559590127081
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 62.47531620842637
- type: cosine_spearman
value: 66.22504667157702
- type: euclidean_pearson
value: 66.76201254783692
- type: euclidean_spearman
value: 66.86115760269463
- type: main_score
value: 66.22504667157702
- type: manhattan_pearson
value: 66.73847836793489
- type: manhattan_spearman
value: 66.7677116377695
- type: pearson
value: 62.47531620842637
- type: spearman
value: 66.22504667157702
- task:
type: STS
dataset:
name: MTEB STS22 (es)
type: mteb/sts22-crosslingual-sts
config: es
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 69.89707002436481
- type: cosine_spearman
value: 72.2054865735116
- type: euclidean_pearson
value: 71.81856615570756
- type: euclidean_spearman
value: 72.72593304629407
- type: main_score
value: 72.2054865735116
- type: manhattan_pearson
value: 72.00362684700072
- type: manhattan_spearman
value: 72.62783534769964
- type: pearson
value: 69.89707002436481
- type: spearman
value: 72.2054865735116
- task:
type: STS
dataset:
name: MTEB STS22 (fr)
type: mteb/sts22-crosslingual-sts
config: fr
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 81.59623734395916
- type: cosine_spearman
value: 83.28946105111358
- type: euclidean_pearson
value: 79.377330171466
- type: euclidean_spearman
value: 81.81029781662205
- type: main_score
value: 83.28946105111358
- type: manhattan_pearson
value: 78.96970881689698
- type: manhattan_spearman
value: 81.91773236079703
- type: pearson
value: 81.59623734395916
- type: spearman
value: 83.28946105111358
- task:
type: STS
dataset:
name: MTEB STS22 (de-fr)
type: mteb/sts22-crosslingual-sts
config: de-fr
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 55.03825643126142
- type: cosine_spearman
value: 58.25792501780429
- type: euclidean_pearson
value: 50.38007603973409
- type: euclidean_spearman
value: 59.39961789383097
- type: main_score
value: 58.25792501780429
- type: manhattan_pearson
value: 50.518568927999155
- type: manhattan_spearman
value: 59.84185466003894
- type: pearson
value: 55.03825643126142
- type: spearman
value: 58.25792501780429
- task:
type: STS
dataset:
name: MTEB STS22 (pl-en)
type: mteb/sts22-crosslingual-sts
config: pl-en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 77.77233721490776
- type: cosine_spearman
value: 76.17596588017625
- type: euclidean_pearson
value: 74.47600468156611
- type: euclidean_spearman
value: 72.61278728057012
- type: main_score
value: 76.17596588017625
- type: manhattan_pearson
value: 74.48118910099699
- type: manhattan_spearman
value: 73.33167419101696
- type: pearson
value: 77.77233721490776
- type: spearman
value: 76.17596588017625
- task:
type: STS
dataset:
name: MTEB STS22 (pl)
type: mteb/sts22-crosslingual-sts
config: pl
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 42.87453608131507
- type: cosine_spearman
value: 45.137849894401185
- type: euclidean_pearson
value: 31.66964197694796
- type: euclidean_spearman
value: 44.1014900837869
- type: main_score
value: 45.137849894401185
- type: manhattan_pearson
value: 31.007199259384745
- type: manhattan_spearman
value: 43.48181523288926
- type: pearson
value: 42.87453608131507
- type: spearman
value: 45.137849894401185
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 66.87400150638176
- type: cosine_spearman
value: 67.27861354834066
- type: euclidean_pearson
value: 66.81789582140216
- type: euclidean_spearman
value: 66.44220479858708
- type: main_score
value: 67.27861354834066
- type: manhattan_pearson
value: 66.92509859033235
- type: manhattan_spearman
value: 66.46841124185076
- type: pearson
value: 66.87400150638176
- type: spearman
value: 67.27861354834066
- task:
type: STS
dataset:
name: MTEB STS22 (ru)
type: mteb/sts22-crosslingual-sts
config: ru
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 61.819804551576084
- type: cosine_spearman
value: 65.0864146772135
- type: euclidean_pearson
value: 62.518151090361876
- type: euclidean_spearman
value: 65.13608138548017
- type: main_score
value: 65.0864146772135
- type: manhattan_pearson
value: 62.51413246915267
- type: manhattan_spearman
value: 65.19077543064323
- type: pearson
value: 61.819804551576084
- type: spearman
value: 65.0864146772135
- task:
type: STS
dataset:
name: MTEB STS22 (de)
type: mteb/sts22-crosslingual-sts
config: de
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 54.85728696035389
- type: cosine_spearman
value: 61.60906359227576
- type: euclidean_pearson
value: 52.57582587901851
- type: euclidean_spearman
value: 61.41823097598308
- type: main_score
value: 61.60906359227576
- type: manhattan_pearson
value: 52.500978361080506
- type: manhattan_spearman
value: 61.30365596659758
- type: pearson
value: 54.85728696035389
- type: spearman
value: 61.60906359227576
- task:
type: STS
dataset:
name: MTEB STS22 (fr-pl)
type: mteb/sts22-crosslingual-sts
config: fr-pl
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 67.68016005631422
- type: cosine_spearman
value: 84.51542547285167
- type: euclidean_pearson
value: 66.19871164667245
- type: euclidean_spearman
value: 73.24670207647144
- type: main_score
value: 84.51542547285167
- type: manhattan_pearson
value: 67.0443525268974
- type: manhattan_spearman
value: 73.24670207647144
- type: pearson
value: 67.68016005631422
- type: spearman
value: 84.51542547285167
- task:
type: STS
dataset:
name: MTEB STS22 (de-pl)
type: mteb/sts22-crosslingual-sts
config: de-pl
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 47.49467414030747
- type: cosine_spearman
value: 56.81512095681289
- type: euclidean_pearson
value: 48.42860221765214
- type: euclidean_spearman
value: 58.63197306329092
- type: main_score
value: 56.81512095681289
- type: manhattan_pearson
value: 48.39594959260441
- type: manhattan_spearman
value: 58.63197306329092
- type: pearson
value: 47.49467414030747
- type: spearman
value: 56.81512095681289
- task:
type: STS
dataset:
name: MTEB STS22 (es-en)
type: mteb/sts22-crosslingual-sts
config: es-en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 76.8364678896155
- type: cosine_spearman
value: 78.45516413087114
- type: euclidean_pearson
value: 78.62779318576634
- type: euclidean_spearman
value: 78.88760695649488
- type: main_score
value: 78.45516413087114
- type: manhattan_pearson
value: 78.62131335760031
- type: manhattan_spearman
value: 78.81861844200388
- type: pearson
value: 76.8364678896155
- type: spearman
value: 78.45516413087114
- task:
type: STS
dataset:
name: MTEB STS22 (de-en)
type: mteb/sts22-crosslingual-sts
config: de-en
split: test
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
metrics:
- type: cosine_pearson
value: 65.16640313911604
- type: cosine_spearman
value: 60.887608967403914
- type: euclidean_pearson
value: 67.49902244990913
- type: euclidean_spearman
value: 59.2458787136538
- type: main_score
value: 60.887608967403914
- type: manhattan_pearson
value: 67.34313506388378
- type: manhattan_spearman
value: 59.05283429200166
- type: pearson
value: 65.16640313911604
- type: spearman
value: 60.887608967403914
- task:
type: STS
dataset:
name: MTEB STSB (default)
type: C-MTEB/STSB
config: default
split: test
revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
metrics:
- type: cosine_pearson
value: 81.5092853013241
- type: cosine_spearman
value: 83.54005474244292
- type: euclidean_pearson
value: 83.7246578378554
- type: euclidean_spearman
value: 84.46767551087716
- type: main_score
value: 83.54005474244292
- type: manhattan_pearson
value: 83.65922665594636
- type: manhattan_spearman
value: 84.42431449101848
- type: pearson
value: 81.5092853013241
- type: spearman
value: 83.54005474244292
- task:
type: STS
dataset:
name: MTEB STSBenchmark (default)
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cosine_pearson
value: 87.70246866744966
- type: cosine_spearman
value: 89.44070045346106
- type: euclidean_pearson
value: 89.56956519641007
- type: euclidean_spearman
value: 89.95830112784283
- type: main_score
value: 89.44070045346106
- type: manhattan_pearson
value: 89.48264471425145
- type: manhattan_spearman
value: 89.87900732483114
- type: pearson
value: 87.70246866744966
- type: spearman
value: 89.44070045346106
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (de)
type: mteb/stsb_multi_mt
config: de
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 86.83701990805217
- type: cosine_spearman
value: 87.80280785492258
- type: euclidean_pearson
value: 87.77325330043514
- type: euclidean_spearman
value: 88.3564607283144
- type: main_score
value: 87.80280785492258
- type: manhattan_pearson
value: 87.6745449945946
- type: manhattan_spearman
value: 88.30660465978795
- type: pearson
value: 86.83701990805217
- type: spearman
value: 87.80280785492258
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (zh)
type: mteb/stsb_multi_mt
config: zh
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 84.27751020600267
- type: cosine_spearman
value: 85.63500407412486
- type: euclidean_pearson
value: 85.21829891649696
- type: euclidean_spearman
value: 85.9384575715382
- type: main_score
value: 85.63500407412486
- type: manhattan_pearson
value: 85.10797194089801
- type: manhattan_spearman
value: 85.8770162042784
- type: pearson
value: 84.27751020600267
- type: spearman
value: 85.63500407412486
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (fr)
type: mteb/stsb_multi_mt
config: fr
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 86.56833656723254
- type: cosine_spearman
value: 87.4393978501382
- type: euclidean_pearson
value: 87.45171512751267
- type: euclidean_spearman
value: 88.13106516566947
- type: main_score
value: 87.4393978501382
- type: manhattan_pearson
value: 87.33010961793333
- type: manhattan_spearman
value: 88.06707425102182
- type: pearson
value: 86.56833656723254
- type: spearman
value: 87.4393978501382
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (pl)
type: mteb/stsb_multi_mt
config: pl
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 85.45065540325523
- type: cosine_spearman
value: 85.47881076789359
- type: euclidean_pearson
value: 85.1999493863155
- type: euclidean_spearman
value: 85.7874947669187
- type: main_score
value: 85.47881076789359
- type: manhattan_pearson
value: 85.06075305990376
- type: manhattan_spearman
value: 85.71563015639558
- type: pearson
value: 85.45065540325523
- type: spearman
value: 85.47881076789359
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (es)
type: mteb/stsb_multi_mt
config: es
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 87.11952824079832
- type: cosine_spearman
value: 87.9643473573153
- type: euclidean_pearson
value: 88.11750364639971
- type: euclidean_spearman
value: 88.63695109016498
- type: main_score
value: 87.9643473573153
- type: manhattan_pearson
value: 88.00294453126699
- type: manhattan_spearman
value: 88.53750241758391
- type: pearson
value: 87.11952824079832
- type: spearman
value: 87.9643473573153
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (ru)
type: mteb/stsb_multi_mt
config: ru
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 85.99804354414991
- type: cosine_spearman
value: 86.30252111551002
- type: euclidean_pearson
value: 86.1880652037762
- type: euclidean_spearman
value: 86.69556223944502
- type: main_score
value: 86.30252111551002
- type: manhattan_pearson
value: 86.0736400320898
- type: manhattan_spearman
value: 86.61747927593393
- type: pearson
value: 85.99804354414991
- type: spearman
value: 86.30252111551002
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (en)
type: mteb/stsb_multi_mt
config: en
split: test
revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
metrics:
- type: cosine_pearson
value: 87.70246861738103
- type: cosine_spearman
value: 89.44070045346106
- type: euclidean_pearson
value: 89.56956518833663
- type: euclidean_spearman
value: 89.95830112784283
- type: main_score
value: 89.44070045346106
- type: manhattan_pearson
value: 89.48264470792915
- type: manhattan_spearman
value: 89.87900732483114
- type: pearson
value: 87.70246861738103
- type: spearman
value: 89.44070045346106
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR (default)
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 84.88064122814694
- type: mrr
value: 95.84832651009123
- type: main_score
value: 84.88064122814694
- task:
type: Retrieval
dataset:
name: MTEB SciFact (default)
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 57.289
- type: map_at_10
value: 67.88499999999999
- type: map_at_100
value: 68.477
- type: map_at_1000
value: 68.50500000000001
- type: map_at_20
value: 68.33500000000001
- type: map_at_3
value: 65.08
- type: map_at_5
value: 67.001
- type: mrr_at_1
value: 59.667
- type: mrr_at_10
value: 68.626
- type: mrr_at_100
value: 69.082
- type: mrr_at_1000
value: 69.108
- type: mrr_at_20
value: 68.958
- type: mrr_at_3
value: 66.667
- type: mrr_at_5
value: 67.983
- type: ndcg_at_1
value: 59.667
- type: ndcg_at_10
value: 72.309
- type: ndcg_at_100
value: 74.58399999999999
- type: ndcg_at_1000
value: 75.25500000000001
- type: ndcg_at_20
value: 73.656
- type: ndcg_at_3
value: 67.791
- type: ndcg_at_5
value: 70.45
- type: precision_at_1
value: 59.667
- type: precision_at_10
value: 9.567
- type: precision_at_100
value: 1.073
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.083
- type: precision_at_3
value: 26.333000000000002
- type: precision_at_5
value: 17.666999999999998
- type: recall_at_1
value: 57.289
- type: recall_at_10
value: 84.756
- type: recall_at_100
value: 94.5
- type: recall_at_1000
value: 99.667
- type: recall_at_20
value: 89.7
- type: recall_at_3
value: 73.22800000000001
- type: recall_at_5
value: 79.444
- type: main_score
value: 72.309
- task:
type: Clustering
dataset:
name: MTEB SpanishNewsClusteringP2P (default)
type: jinaai/spanish_news_clustering
config: default
split: test
revision: bf8ca8ddc5b7da4f7004720ddf99bbe0483480e6
metrics:
- type: main_score
value: 45.04477709795154
- type: v_measure
value: 45.04477709795154
- type: v_measure_std
value: 0.0
- task:
type: Retrieval
dataset:
name: MTEB SpanishPassageRetrievalS2S (default)
type: jinaai/spanish_passage_retrieval
config: default
split: test
revision: 9cddf2ce5209ade52c2115ccfa00eb22c6d3a837
metrics:
- type: main_score
value: 69.83
- type: map_at_1
value: 15.736
- type: map_at_10
value: 52.027
- type: map_at_100
value: 65.08800000000001
- type: map_at_1000
value: 65.08800000000001
- type: map_at_20
value: 60.79900000000001
- type: map_at_3
value: 32.869
- type: map_at_5
value: 41.436
- type: mrr_at_1
value: 75.44910179640718
- type: mrr_at_10
value: 84.43446440452426
- type: mrr_at_100
value: 84.48052612723271
- type: mrr_at_1000
value: 84.48052612723271
- type: mrr_at_20
value: 84.48052612723271
- type: mrr_at_3
value: 83.13373253493013
- type: mrr_at_5
value: 84.3013972055888
- type: nauc_map_at_1000_diff1
value: 50.611540149694356
- type: nauc_map_at_1000_max
value: 2.1102430434260238
- type: nauc_map_at_1000_std
value: -18.88993521335793
- type: nauc_map_at_100_diff1
value: 50.611540149694356
- type: nauc_map_at_100_max
value: 2.1102430434260238
- type: nauc_map_at_100_std
value: -18.88993521335793
- type: nauc_map_at_10_diff1
value: 59.13518981755268
- type: nauc_map_at_10_max
value: -9.810386627392807
- type: nauc_map_at_10_std
value: -38.31810152345078
- type: nauc_map_at_1_diff1
value: 74.96782567287174
- type: nauc_map_at_1_max
value: -29.648279252607875
- type: nauc_map_at_1_std
value: -54.017459339141595
- type: nauc_map_at_20_diff1
value: 55.26694458629849
- type: nauc_map_at_20_max
value: -1.9490244535020729
- type: nauc_map_at_20_std
value: -25.22211659104076
- type: nauc_map_at_3_diff1
value: 71.67607885031732
- type: nauc_map_at_3_max
value: -25.078101661694507
- type: nauc_map_at_3_std
value: -50.55408861920259
- type: nauc_map_at_5_diff1
value: 61.50111515417668
- type: nauc_map_at_5_max
value: -16.4114670513168
- type: nauc_map_at_5_std
value: -44.391416134859135
- type: nauc_mrr_at_1000_diff1
value: 74.18848063283234
- type: nauc_mrr_at_1000_max
value: 21.929205946778005
- type: nauc_mrr_at_1000_std
value: -36.27399268489433
- type: nauc_mrr_at_100_diff1
value: 74.18848063283234
- type: nauc_mrr_at_100_max
value: 21.929205946778005
- type: nauc_mrr_at_100_std
value: -36.27399268489433
- type: nauc_mrr_at_10_diff1
value: 74.27231582268745
- type: nauc_mrr_at_10_max
value: 21.481133301135337
- type: nauc_mrr_at_10_std
value: -36.72070854872902
- type: nauc_mrr_at_1_diff1
value: 76.54855950439561
- type: nauc_mrr_at_1_max
value: 26.99938321212366
- type: nauc_mrr_at_1_std
value: -33.098742603429635
- type: nauc_mrr_at_20_diff1
value: 74.18848063283234
- type: nauc_mrr_at_20_max
value: 21.929205946778005
- type: nauc_mrr_at_20_std
value: -36.27399268489433
- type: nauc_mrr_at_3_diff1
value: 72.05379526740143
- type: nauc_mrr_at_3_max
value: 18.875831185752528
- type: nauc_mrr_at_3_std
value: -37.27302006456391
- type: nauc_mrr_at_5_diff1
value: 74.25342356682029
- type: nauc_mrr_at_5_max
value: 20.756340085088738
- type: nauc_mrr_at_5_std
value: -37.99507208540703
- type: nauc_ndcg_at_1000_diff1
value: 53.259363764380275
- type: nauc_ndcg_at_1000_max
value: 12.936954959423218
- type: nauc_ndcg_at_1000_std
value: -16.953898675672153
- type: nauc_ndcg_at_100_diff1
value: 53.259363764380275
- type: nauc_ndcg_at_100_max
value: 12.936954959423218
- type: nauc_ndcg_at_100_std
value: -16.953898675672153
- type: nauc_ndcg_at_10_diff1
value: 53.70942345413554
- type: nauc_ndcg_at_10_max
value: -3.8465093347016186
- type: nauc_ndcg_at_10_std
value: -31.208127919994755
- type: nauc_ndcg_at_1_diff1
value: 75.30551289259554
- type: nauc_ndcg_at_1_max
value: 25.53292054129834
- type: nauc_ndcg_at_1_std
value: -33.285498788395145
- type: nauc_ndcg_at_20_diff1
value: 57.62409278278133
- type: nauc_ndcg_at_20_max
value: 2.8040586426056233
- type: nauc_ndcg_at_20_std
value: -26.270875776221704
- type: nauc_ndcg_at_3_diff1
value: 48.42294834754225
- type: nauc_ndcg_at_3_max
value: 16.912467881065822
- type: nauc_ndcg_at_3_std
value: -13.324841189277873
- type: nauc_ndcg_at_5_diff1
value: 47.512819802794596
- type: nauc_ndcg_at_5_max
value: 14.645518203506594
- type: nauc_ndcg_at_5_std
value: -17.641450435599275
- type: nauc_precision_at_1000_diff1
value: -34.43320975829637
- type: nauc_precision_at_1000_max
value: 29.08585622578186
- type: nauc_precision_at_1000_std
value: 46.55117940162061
- type: nauc_precision_at_100_diff1
value: -34.433209758296364
- type: nauc_precision_at_100_max
value: 29.085856225781885
- type: nauc_precision_at_100_std
value: 46.55117940162065
- type: nauc_precision_at_10_diff1
value: -21.895306304096902
- type: nauc_precision_at_10_max
value: 33.190476527593745
- type: nauc_precision_at_10_std
value: 37.64916268614298
- type: nauc_precision_at_1_diff1
value: 75.30551289259554
- type: nauc_precision_at_1_max
value: 25.53292054129834
- type: nauc_precision_at_1_std
value: -33.285498788395145
- type: nauc_precision_at_20_diff1
value: -27.63076748060466
- type: nauc_precision_at_20_max
value: 30.689810416086154
- type: nauc_precision_at_20_std
value: 46.164191636131626
- type: nauc_precision_at_3_diff1
value: 20.547345067837288
- type: nauc_precision_at_3_max
value: 26.177050942827528
- type: nauc_precision_at_3_std
value: 5.960466052973099
- type: nauc_precision_at_5_diff1
value: -8.928755534002669
- type: nauc_precision_at_5_max
value: 40.83262650073459
- type: nauc_precision_at_5_std
value: 26.158537031161494
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: .nan
- type: nauc_recall_at_100_max
value: .nan
- type: nauc_recall_at_100_std
value: .nan
- type: nauc_recall_at_10_diff1
value: 53.08654386169444
- type: nauc_recall_at_10_max
value: -23.276269379519356
- type: nauc_recall_at_10_std
value: -50.80707792706157
- type: nauc_recall_at_1_diff1
value: 74.96782567287174
- type: nauc_recall_at_1_max
value: -29.648279252607875
- type: nauc_recall_at_1_std
value: -54.017459339141595
- type: nauc_recall_at_20_diff1
value: 51.60121897059633
- type: nauc_recall_at_20_max
value: -14.241779530735387
- type: nauc_recall_at_20_std
value: -37.877451525215456
- type: nauc_recall_at_3_diff1
value: 66.99474984329694
- type: nauc_recall_at_3_max
value: -30.802787353187966
- type: nauc_recall_at_3_std
value: -53.58737792129713
- type: nauc_recall_at_5_diff1
value: 54.64214444958567
- type: nauc_recall_at_5_max
value: -23.341309362104703
- type: nauc_recall_at_5_std
value: -51.381363923145265
- type: ndcg_at_1
value: 76.048
- type: ndcg_at_10
value: 69.83
- type: ndcg_at_100
value: 82.11500000000001
- type: ndcg_at_1000
value: 82.11500000000001
- type: ndcg_at_20
value: 75.995
- type: ndcg_at_3
value: 69.587
- type: ndcg_at_5
value: 69.062
- type: precision_at_1
value: 76.048
- type: precision_at_10
value: 43.653
- type: precision_at_100
value: 7.718999999999999
- type: precision_at_1000
value: 0.772
- type: precision_at_20
value: 31.108000000000004
- type: precision_at_3
value: 63.87199999999999
- type: precision_at_5
value: 56.407
- type: recall_at_1
value: 15.736
- type: recall_at_10
value: 66.873
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 85.01100000000001
- type: recall_at_3
value: 36.441
- type: recall_at_5
value: 49.109
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions (default)
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cosine_accuracy
value: 99.87326732673267
- type: cosine_accuracy_threshold
value: 86.0752820968628
- type: cosine_ap
value: 96.98758090713252
- type: cosine_f1
value: 93.52881698685542
- type: cosine_f1_threshold
value: 86.0752820968628
- type: cosine_precision
value: 94.58077709611452
- type: cosine_recall
value: 92.5
- type: dot_accuracy
value: 99.82574257425742
- type: dot_accuracy_threshold
value: 40484.73815917969
- type: dot_ap
value: 95.68959907254845
- type: dot_f1
value: 91.31293188548865
- type: dot_f1_threshold
value: 40336.810302734375
- type: dot_precision
value: 90.15594541910332
- type: dot_recall
value: 92.5
- type: euclidean_accuracy
value: 99.87128712871286
- type: euclidean_accuracy_threshold
value: 1162.5749588012695
- type: euclidean_ap
value: 96.92640435656577
- type: euclidean_f1
value: 93.4475806451613
- type: euclidean_f1_threshold
value: 1162.5749588012695
- type: euclidean_precision
value: 94.20731707317073
- type: euclidean_recall
value: 92.7
- type: main_score
value: 96.98758090713252
- type: manhattan_accuracy
value: 99.86930693069307
- type: manhattan_accuracy_threshold
value: 28348.71826171875
- type: manhattan_ap
value: 96.93832673967925
- type: manhattan_f1
value: 93.33333333333333
- type: manhattan_f1_threshold
value: 28348.71826171875
- type: manhattan_precision
value: 94.28571428571428
- type: manhattan_recall
value: 92.4
- type: max_accuracy
value: 99.87326732673267
- type: max_ap
value: 96.98758090713252
- type: max_f1
value: 93.52881698685542
- type: max_precision
value: 94.58077709611452
- type: max_recall
value: 92.7
- type: similarity_accuracy
value: 99.87326732673267
- type: similarity_accuracy_threshold
value: 86.0752820968628
- type: similarity_ap
value: 96.98758090713252
- type: similarity_f1
value: 93.52881698685542
- type: similarity_f1_threshold
value: 86.0752820968628
- type: similarity_precision
value: 94.58077709611452
- type: similarity_recall
value: 92.5
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering (default)
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: main_score
value: 65.6560129719848
- type: v_measure
value: 65.6560129719848
- type: v_measure_std
value: 4.781229811487539
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P (default)
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: main_score
value: 35.07546243853692
- type: v_measure
value: 35.07546243853692
- type: v_measure_std
value: 1.1978740356240998
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions (default)
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 51.771005199508835
- type: mrr
value: 52.65443298531534
- type: main_score
value: 51.771005199508835
- task:
type: Summarization
dataset:
name: MTEB SummEval (default)
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cosine_pearson
value: 29.48686238342228
- type: cosine_spearman
value: 29.706543509170054
- type: dot_pearson
value: 27.95853155597859
- type: dot_spearman
value: 27.604287986935162
- type: main_score
value: 29.706543509170054
- type: pearson
value: 29.48686238342228
- type: spearman
value: 29.706543509170054
- task:
type: Summarization
dataset:
name: MTEB SummEvalFr (default)
type: lyon-nlp/summarization-summeval-fr-p2p
config: default
split: test
revision: b385812de6a9577b6f4d0f88c6a6e35395a94054
metrics:
- type: cosine_pearson
value: 31.551301434917868
- type: cosine_spearman
value: 30.709049789175186
- type: dot_pearson
value: 27.77050901756549
- type: dot_spearman
value: 26.715505953561795
- type: main_score
value: 30.709049789175186
- type: pearson
value: 31.551301434917868
- type: spearman
value: 30.709049789175186
- task:
type: Reranking
dataset:
name: MTEB SyntecReranking (default)
type: lyon-nlp/mteb-fr-reranking-syntec-s2p
config: default
split: test
revision: b205c5084a0934ce8af14338bf03feb19499c84d
metrics:
- type: map
value: 73.31666666666666
- type: mrr
value: 73.31666666666666
- type: main_score
value: 73.31666666666666
- task:
type: Retrieval
dataset:
name: MTEB SyntecRetrieval (default)
type: lyon-nlp/mteb-fr-retrieval-syntec-s2p
config: default
split: test
revision: 19661ccdca4dfc2d15122d776b61685f48c68ca9
metrics:
- type: main_score
value: 83.851
- type: map_at_1
value: 68.0
- type: map_at_10
value: 79.187
- type: map_at_100
value: 79.32900000000001
- type: map_at_1000
value: 79.32900000000001
- type: map_at_20
value: 79.32900000000001
- type: map_at_3
value: 77.333
- type: map_at_5
value: 78.93299999999999
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 79.18730158730159
- type: mrr_at_100
value: 79.32945845004669
- type: mrr_at_1000
value: 79.32945845004669
- type: mrr_at_20
value: 79.32945845004669
- type: mrr_at_3
value: 77.33333333333333
- type: mrr_at_5
value: 78.93333333333332
- type: nauc_map_at_1000_diff1
value: 63.31103256935259
- type: nauc_map_at_1000_max
value: 11.073749121365623
- type: nauc_map_at_1000_std
value: 7.4973309839738
- type: nauc_map_at_100_diff1
value: 63.31103256935259
- type: nauc_map_at_100_max
value: 11.073749121365623
- type: nauc_map_at_100_std
value: 7.4973309839738
- type: nauc_map_at_10_diff1
value: 62.91585737195978
- type: nauc_map_at_10_max
value: 11.770664508983133
- type: nauc_map_at_10_std
value: 8.179883948527962
- type: nauc_map_at_1_diff1
value: 66.1236265634718
- type: nauc_map_at_1_max
value: 7.000207311173955
- type: nauc_map_at_1_std
value: 6.54412272821497
- type: nauc_map_at_20_diff1
value: 63.31103256935259
- type: nauc_map_at_20_max
value: 11.073749121365623
- type: nauc_map_at_20_std
value: 7.4973309839738
- type: nauc_map_at_3_diff1
value: 62.14039574010254
- type: nauc_map_at_3_max
value: 11.06996398110187
- type: nauc_map_at_3_std
value: 7.288759297085769
- type: nauc_map_at_5_diff1
value: 63.0401271126211
- type: nauc_map_at_5_max
value: 10.779317801858609
- type: nauc_map_at_5_std
value: 6.476660484760681
- type: nauc_mrr_at_1000_diff1
value: 63.31103256935259
- type: nauc_mrr_at_1000_max
value: 11.073749121365623
- type: nauc_mrr_at_1000_std
value: 7.4973309839738
- type: nauc_mrr_at_100_diff1
value: 63.31103256935259
- type: nauc_mrr_at_100_max
value: 11.073749121365623
- type: nauc_mrr_at_100_std
value: 7.4973309839738
- type: nauc_mrr_at_10_diff1
value: 62.91585737195978
- type: nauc_mrr_at_10_max
value: 11.770664508983133
- type: nauc_mrr_at_10_std
value: 8.179883948527962
- type: nauc_mrr_at_1_diff1
value: 66.1236265634718
- type: nauc_mrr_at_1_max
value: 7.000207311173955
- type: nauc_mrr_at_1_std
value: 6.54412272821497
- type: nauc_mrr_at_20_diff1
value: 63.31103256935259
- type: nauc_mrr_at_20_max
value: 11.073749121365623
- type: nauc_mrr_at_20_std
value: 7.4973309839738
- type: nauc_mrr_at_3_diff1
value: 62.14039574010254
- type: nauc_mrr_at_3_max
value: 11.06996398110187
- type: nauc_mrr_at_3_std
value: 7.288759297085769
- type: nauc_mrr_at_5_diff1
value: 63.0401271126211
- type: nauc_mrr_at_5_max
value: 10.779317801858609
- type: nauc_mrr_at_5_std
value: 6.476660484760681
- type: nauc_ndcg_at_1000_diff1
value: 62.9544299483241
- type: nauc_ndcg_at_1000_max
value: 11.577079766964538
- type: nauc_ndcg_at_1000_std
value: 7.703856790100716
- type: nauc_ndcg_at_100_diff1
value: 62.9544299483241
- type: nauc_ndcg_at_100_max
value: 11.577079766964538
- type: nauc_ndcg_at_100_std
value: 7.703856790100716
- type: nauc_ndcg_at_10_diff1
value: 61.29907952217381
- type: nauc_ndcg_at_10_max
value: 14.760627422715425
- type: nauc_ndcg_at_10_std
value: 10.805573898143368
- type: nauc_ndcg_at_1_diff1
value: 66.1236265634718
- type: nauc_ndcg_at_1_max
value: 7.000207311173955
- type: nauc_ndcg_at_1_std
value: 6.54412272821497
- type: nauc_ndcg_at_20_diff1
value: 62.9544299483241
- type: nauc_ndcg_at_20_max
value: 11.577079766964538
- type: nauc_ndcg_at_20_std
value: 7.703856790100716
- type: nauc_ndcg_at_3_diff1
value: 60.25643527856101
- type: nauc_ndcg_at_3_max
value: 12.236302709487546
- type: nauc_ndcg_at_3_std
value: 7.36883189112067
- type: nauc_ndcg_at_5_diff1
value: 61.65220590318238
- type: nauc_ndcg_at_5_max
value: 11.39969101913945
- type: nauc_ndcg_at_5_std
value: 5.406207922379402
- type: nauc_precision_at_1000_diff1
value: .nan
- type: nauc_precision_at_1000_max
value: .nan
- type: nauc_precision_at_1000_std
value: .nan
- type: nauc_precision_at_100_diff1
value: .nan
- type: nauc_precision_at_100_max
value: .nan
- type: nauc_precision_at_100_std
value: .nan
- type: nauc_precision_at_10_diff1
value: 19.14098972922579
- type: nauc_precision_at_10_max
value: 100.0
- type: nauc_precision_at_10_std
value: 93.46405228758135
- type: nauc_precision_at_1_diff1
value: 66.1236265634718
- type: nauc_precision_at_1_max
value: 7.000207311173955
- type: nauc_precision_at_1_std
value: 6.54412272821497
- type: nauc_precision_at_20_diff1
value: 100.0
- type: nauc_precision_at_20_max
value: 100.0
- type: nauc_precision_at_20_std
value: 100.0
- type: nauc_precision_at_3_diff1
value: 50.29636629155561
- type: nauc_precision_at_3_max
value: 18.00532600292076
- type: nauc_precision_at_3_std
value: 7.649686453053768
- type: nauc_precision_at_5_diff1
value: 43.522408963585356
- type: nauc_precision_at_5_max
value: 16.923436041082983
- type: nauc_precision_at_5_std
value: -10.854341736694092
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: .nan
- type: nauc_recall_at_100_max
value: .nan
- type: nauc_recall_at_100_std
value: .nan
- type: nauc_recall_at_10_diff1
value: 19.1409897292252
- type: nauc_recall_at_10_max
value: 100.0
- type: nauc_recall_at_10_std
value: 93.46405228758134
- type: nauc_recall_at_1_diff1
value: 66.1236265634718
- type: nauc_recall_at_1_max
value: 7.000207311173955
- type: nauc_recall_at_1_std
value: 6.54412272821497
- type: nauc_recall_at_20_diff1
value: .nan
- type: nauc_recall_at_20_max
value: .nan
- type: nauc_recall_at_20_std
value: .nan
- type: nauc_recall_at_3_diff1
value: 50.29636629155569
- type: nauc_recall_at_3_max
value: 18.005326002920754
- type: nauc_recall_at_3_std
value: 7.649686453053851
- type: nauc_recall_at_5_diff1
value: 43.5224089635856
- type: nauc_recall_at_5_max
value: 16.92343604108335
- type: nauc_recall_at_5_std
value: -10.854341736694499
- type: ndcg_at_1
value: 68.0
- type: ndcg_at_10
value: 83.851
- type: ndcg_at_100
value: 84.36099999999999
- type: ndcg_at_1000
value: 84.36099999999999
- type: ndcg_at_20
value: 84.36099999999999
- type: ndcg_at_3
value: 80.333
- type: ndcg_at_5
value: 83.21600000000001
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 9.8
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 5.0
- type: precision_at_3
value: 29.666999999999998
- type: precision_at_5
value: 19.2
- type: recall_at_1
value: 68.0
- type: recall_at_10
value: 98.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 100.0
- type: recall_at_3
value: 89.0
- type: recall_at_5
value: 96.0
- task:
type: Reranking
dataset:
name: MTEB T2Reranking (default)
type: C-MTEB/T2Reranking
config: default
split: dev
revision: 76631901a18387f85eaa53e5450019b87ad58ef9
metrics:
- type: map
value: 65.3088203970324
- type: mrr
value: 74.79505862376546
- type: main_score
value: 65.3088203970324
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval (default)
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: main_score
value: 83.163
- type: map_at_1
value: 26.875
- type: map_at_10
value: 75.454
- type: map_at_100
value: 79.036
- type: map_at_1000
value: 79.111
- type: map_at_20
value: 78.145
- type: map_at_3
value: 53.181
- type: map_at_5
value: 65.362
- type: mrr_at_1
value: 88.90057864281957
- type: mrr_at_10
value: 91.53186397301344
- type: mrr_at_100
value: 91.62809075510003
- type: mrr_at_1000
value: 91.63198173030787
- type: mrr_at_20
value: 91.59414668799909
- type: mrr_at_3
value: 91.0792565316499
- type: mrr_at_5
value: 91.35718043135199
- type: nauc_map_at_1000_diff1
value: 12.364843957982409
- type: nauc_map_at_1000_max
value: 52.07043464458799
- type: nauc_map_at_1000_std
value: 16.040095055100494
- type: nauc_map_at_100_diff1
value: 12.370621073823022
- type: nauc_map_at_100_max
value: 51.960738727635636
- type: nauc_map_at_100_std
value: 15.935832440430747
- type: nauc_map_at_10_diff1
value: 16.852819486606585
- type: nauc_map_at_10_max
value: 40.11184760756059
- type: nauc_map_at_10_std
value: 0.9306648364102376
- type: nauc_map_at_1_diff1
value: 52.87356542654683
- type: nauc_map_at_1_max
value: -22.210039746171255
- type: nauc_map_at_1_std
value: -38.11345358035342
- type: nauc_map_at_20_diff1
value: 13.045089059562837
- type: nauc_map_at_20_max
value: 49.591383082160036
- type: nauc_map_at_20_std
value: 12.54330050352008
- type: nauc_map_at_3_diff1
value: 38.08172234377615
- type: nauc_map_at_3_max
value: -6.868621684867697
- type: nauc_map_at_3_std
value: -35.4712388845996
- type: nauc_map_at_5_diff1
value: 29.665551705577474
- type: nauc_map_at_5_max
value: 10.958628576519045
- type: nauc_map_at_5_std
value: -25.113120842097057
- type: nauc_mrr_at_1000_diff1
value: 47.39372999496945
- type: nauc_mrr_at_1000_max
value: 83.11274997493808
- type: nauc_mrr_at_1000_std
value: 39.74195374546631
- type: nauc_mrr_at_100_diff1
value: 47.396678946057676
- type: nauc_mrr_at_100_max
value: 83.1192584274415
- type: nauc_mrr_at_100_std
value: 39.75840860374685
- type: nauc_mrr_at_10_diff1
value: 47.35365644138715
- type: nauc_mrr_at_10_max
value: 83.189165639531
- type: nauc_mrr_at_10_std
value: 39.83653157887758
- type: nauc_mrr_at_1_diff1
value: 47.98740362820094
- type: nauc_mrr_at_1_max
value: 80.32340034580369
- type: nauc_mrr_at_1_std
value: 34.57857131423388
- type: nauc_mrr_at_20_diff1
value: 47.399132055537194
- type: nauc_mrr_at_20_max
value: 83.16329919869686
- type: nauc_mrr_at_20_std
value: 39.84204692042734
- type: nauc_mrr_at_3_diff1
value: 47.09295580511751
- type: nauc_mrr_at_3_max
value: 82.95831045602642
- type: nauc_mrr_at_3_std
value: 38.98036804692351
- type: nauc_mrr_at_5_diff1
value: 47.20100268549764
- type: nauc_mrr_at_5_max
value: 83.16652480381642
- type: nauc_mrr_at_5_std
value: 39.55690491560902
- type: nauc_ndcg_at_1000_diff1
value: 17.201962509184547
- type: nauc_ndcg_at_1000_max
value: 63.75820559259539
- type: nauc_ndcg_at_1000_std
value: 29.28676096486067
- type: nauc_ndcg_at_100_diff1
value: 16.76847216096811
- type: nauc_ndcg_at_100_max
value: 62.646517934470744
- type: nauc_ndcg_at_100_std
value: 28.7441617667637
- type: nauc_ndcg_at_10_diff1
value: 16.559511980751886
- type: nauc_ndcg_at_10_max
value: 54.35027464277944
- type: nauc_ndcg_at_10_std
value: 16.98089333577716
- type: nauc_ndcg_at_1_diff1
value: 47.98740362820094
- type: nauc_ndcg_at_1_max
value: 80.32340034580369
- type: nauc_ndcg_at_1_std
value: 34.57857131423388
- type: nauc_ndcg_at_20_diff1
value: 16.721525245428243
- type: nauc_ndcg_at_20_max
value: 57.683661870555724
- type: nauc_ndcg_at_20_std
value: 21.736044200026853
- type: nauc_ndcg_at_3_diff1
value: 12.488009696556192
- type: nauc_ndcg_at_3_max
value: 69.2365575305502
- type: nauc_ndcg_at_3_std
value: 30.622418945055323
- type: nauc_ndcg_at_5_diff1
value: 12.364114556230609
- type: nauc_ndcg_at_5_max
value: 62.33360746285387
- type: nauc_ndcg_at_5_std
value: 24.898000803570227
- type: nauc_precision_at_1000_diff1
value: -35.14745130154524
- type: nauc_precision_at_1000_max
value: 48.811507982849065
- type: nauc_precision_at_1000_std
value: 62.43036496029399
- type: nauc_precision_at_100_diff1
value: -35.15276411320076
- type: nauc_precision_at_100_max
value: 50.87010333741109
- type: nauc_precision_at_100_std
value: 63.418221030407175
- type: nauc_precision_at_10_diff1
value: -34.84255710936113
- type: nauc_precision_at_10_max
value: 56.588401051428825
- type: nauc_precision_at_10_std
value: 57.4763370653757
- type: nauc_precision_at_1_diff1
value: 47.98740362820094
- type: nauc_precision_at_1_max
value: 80.32340034580369
- type: nauc_precision_at_1_std
value: 34.57857131423388
- type: nauc_precision_at_20_diff1
value: -35.165762365233505
- type: nauc_precision_at_20_max
value: 54.148762449660424
- type: nauc_precision_at_20_std
value: 61.569719669368716
- type: nauc_precision_at_3_diff1
value: -28.63023175340299
- type: nauc_precision_at_3_max
value: 68.69825987618499
- type: nauc_precision_at_3_std
value: 48.15479495755423
- type: nauc_precision_at_5_diff1
value: -34.13811355456687
- type: nauc_precision_at_5_max
value: 62.369363941490604
- type: nauc_precision_at_5_std
value: 52.282904411187914
- type: nauc_recall_at_1000_diff1
value: 8.686444579162663
- type: nauc_recall_at_1000_max
value: 59.58864478011338
- type: nauc_recall_at_1000_std
value: 56.692774954297455
- type: nauc_recall_at_100_diff1
value: 8.820596225758342
- type: nauc_recall_at_100_max
value: 53.15048885657892
- type: nauc_recall_at_100_std
value: 39.78931159236714
- type: nauc_recall_at_10_diff1
value: 16.022301106315027
- type: nauc_recall_at_10_max
value: 29.83242342459543
- type: nauc_recall_at_10_std
value: -4.805965555875844
- type: nauc_recall_at_1_diff1
value: 52.87356542654683
- type: nauc_recall_at_1_max
value: -22.210039746171255
- type: nauc_recall_at_1_std
value: -38.11345358035342
- type: nauc_recall_at_20_diff1
value: 10.35772828627265
- type: nauc_recall_at_20_max
value: 43.06420839754062
- type: nauc_recall_at_20_std
value: 15.040522218235692
- type: nauc_recall_at_3_diff1
value: 36.23953684770224
- type: nauc_recall_at_3_max
value: -11.709269151700374
- type: nauc_recall_at_3_std
value: -38.13943178150384
- type: nauc_recall_at_5_diff1
value: 28.644872415763384
- type: nauc_recall_at_5_max
value: 2.062151266111129
- type: nauc_recall_at_5_std
value: -30.81114034774277
- type: ndcg_at_1
value: 88.901
- type: ndcg_at_10
value: 83.163
- type: ndcg_at_100
value: 86.854
- type: ndcg_at_1000
value: 87.602
- type: ndcg_at_20
value: 84.908
- type: ndcg_at_3
value: 84.848
- type: ndcg_at_5
value: 83.372
- type: precision_at_1
value: 88.901
- type: precision_at_10
value: 41.343
- type: precision_at_100
value: 4.957000000000001
- type: precision_at_1000
value: 0.513
- type: precision_at_20
value: 22.955000000000002
- type: precision_at_3
value: 74.29599999999999
- type: precision_at_5
value: 62.251999999999995
- type: recall_at_1
value: 26.875
- type: recall_at_10
value: 81.902
- type: recall_at_100
value: 93.988
- type: recall_at_1000
value: 97.801
- type: recall_at_20
value: 87.809
- type: recall_at_3
value: 54.869
- type: recall_at_5
value: 68.728
- task:
type: PairClassification
dataset:
name: MTEB TERRa (default)
type: ai-forever/terra-pairclassification
config: default
split: dev
revision: 7b58f24536063837d644aab9a023c62199b2a612
metrics:
- type: cosine_accuracy
value: 60.586319218241044
- type: cosine_accuracy_threshold
value: 82.49806761741638
- type: cosine_ap
value: 58.73198048427448
- type: cosine_f1
value: 67.37967914438502
- type: cosine_f1_threshold
value: 77.46461033821106
- type: cosine_precision
value: 57.01357466063348
- type: cosine_recall
value: 82.35294117647058
- type: dot_accuracy
value: 60.26058631921825
- type: dot_accuracy_threshold
value: 35627.020263671875
- type: dot_ap
value: 57.418783612898224
- type: dot_f1
value: 66.51982378854623
- type: dot_f1_threshold
value: 27620.843505859375
- type: dot_precision
value: 50.16611295681063
- type: dot_recall
value: 98.69281045751634
- type: euclidean_accuracy
value: 60.26058631921825
- type: euclidean_accuracy_threshold
value: 1255.4466247558594
- type: euclidean_ap
value: 58.748656145387955
- type: euclidean_f1
value: 66.99029126213591
- type: euclidean_f1_threshold
value: 1565.1330947875977
- type: euclidean_precision
value: 53.28185328185329
- type: euclidean_recall
value: 90.19607843137256
- type: main_score
value: 58.8479126365766
- type: manhattan_accuracy
value: 59.934853420195445
- type: manhattan_accuracy_threshold
value: 29897.271728515625
- type: manhattan_ap
value: 58.8479126365766
- type: manhattan_f1
value: 66.81318681318683
- type: manhattan_f1_threshold
value: 46291.802978515625
- type: manhattan_precision
value: 50.331125827814574
- type: manhattan_recall
value: 99.34640522875817
- type: max_accuracy
value: 60.586319218241044
- type: max_ap
value: 58.8479126365766
- type: max_f1
value: 67.37967914438502
- type: max_precision
value: 57.01357466063348
- type: max_recall
value: 99.34640522875817
- type: similarity_accuracy
value: 60.586319218241044
- type: similarity_accuracy_threshold
value: 82.49806761741638
- type: similarity_ap
value: 58.73198048427448
- type: similarity_f1
value: 67.37967914438502
- type: similarity_f1_threshold
value: 77.46461033821106
- type: similarity_precision
value: 57.01357466063348
- type: similarity_recall
value: 82.35294117647058
- task:
type: Classification
dataset:
name: MTEB TNews (default)
type: C-MTEB/TNews-classification
config: default
split: validation
revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
metrics:
- type: accuracy
value: 45.967999999999996
- type: f1
value: 44.699306100915706
- type: f1_weighted
value: 46.03730319014832
- type: main_score
value: 45.967999999999996
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID (default)
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.251
- type: map_at_10
value: 1.9480000000000002
- type: map_at_100
value: 11.082
- type: map_at_1000
value: 26.700000000000003
- type: map_at_20
value: 3.3529999999999998
- type: map_at_3
value: 0.679
- type: map_at_5
value: 1.079
- type: mrr_at_1
value: 94.0
- type: mrr_at_10
value: 95.786
- type: mrr_at_100
value: 95.786
- type: mrr_at_1000
value: 95.786
- type: mrr_at_20
value: 95.786
- type: mrr_at_3
value: 95.0
- type: mrr_at_5
value: 95.5
- type: ndcg_at_1
value: 91.0
- type: ndcg_at_10
value: 77.71900000000001
- type: ndcg_at_100
value: 57.726
- type: ndcg_at_1000
value: 52.737
- type: ndcg_at_20
value: 72.54
- type: ndcg_at_3
value: 83.397
- type: ndcg_at_5
value: 80.806
- type: precision_at_1
value: 94.0
- type: precision_at_10
value: 81.0
- type: precision_at_100
value: 59.199999999999996
- type: precision_at_1000
value: 23.244
- type: precision_at_20
value: 75.2
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 84.8
- type: recall_at_1
value: 0.251
- type: recall_at_10
value: 2.1229999999999998
- type: recall_at_100
value: 14.496999999999998
- type: recall_at_1000
value: 50.09
- type: recall_at_20
value: 3.8309999999999995
- type: recall_at_3
value: 0.696
- type: recall_at_5
value: 1.1400000000000001
- type: main_score
value: 77.71900000000001
- task:
type: Clustering
dataset:
name: MTEB TenKGnadClusteringP2P (default)
type: slvnwhrl/tenkgnad-clustering-p2p
config: default
split: test
revision: 5c59e41555244b7e45c9a6be2d720ab4bafae558
metrics:
- type: main_score
value: 43.763609722295215
- type: v_measure
value: 43.763609722295215
- type: v_measure_std
value: 2.8751199473862457
- task:
type: Clustering
dataset:
name: MTEB TenKGnadClusteringS2S (default)
type: slvnwhrl/tenkgnad-clustering-s2s
config: default
split: test
revision: 6cddbe003f12b9b140aec477b583ac4191f01786
metrics:
- type: main_score
value: 39.762424448504355
- type: v_measure
value: 39.762424448504355
- type: v_measure_std
value: 3.30146124979502
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringP2P (default)
type: C-MTEB/ThuNewsClusteringP2P
config: default
split: test
revision: 5798586b105c0434e4f0fe5e767abe619442cf93
metrics:
- type: main_score
value: 63.133819258289456
- type: v_measure
value: 63.133819258289456
- type: v_measure_std
value: 1.8854253356479695
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringS2S (default)
type: C-MTEB/ThuNewsClusteringS2S
config: default
split: test
revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
metrics:
- type: main_score
value: 58.98195851785808
- type: v_measure
value: 58.98195851785808
- type: v_measure_std
value: 1.6237600076393737
- task:
type: Retrieval
dataset:
name: MTEB Touche2020 (default)
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 3.3550000000000004
- type: map_at_10
value: 10.08
- type: map_at_100
value: 16.136
- type: map_at_1000
value: 17.605
- type: map_at_20
value: 12.561
- type: map_at_3
value: 5.641
- type: map_at_5
value: 7.3260000000000005
- type: mrr_at_1
value: 46.939
- type: mrr_at_10
value: 58.152
- type: mrr_at_100
value: 58.594
- type: mrr_at_1000
value: 58.601000000000006
- type: mrr_at_20
value: 58.279
- type: mrr_at_3
value: 55.102
- type: mrr_at_5
value: 56.531
- type: ndcg_at_1
value: 44.897999999999996
- type: ndcg_at_10
value: 26.298
- type: ndcg_at_100
value: 37.596000000000004
- type: ndcg_at_1000
value: 49.424
- type: ndcg_at_20
value: 27.066000000000003
- type: ndcg_at_3
value: 31.528
- type: ndcg_at_5
value: 28.219
- type: precision_at_1
value: 46.939
- type: precision_at_10
value: 22.245
- type: precision_at_100
value: 7.531000000000001
- type: precision_at_1000
value: 1.5350000000000001
- type: precision_at_20
value: 17.041
- type: precision_at_3
value: 30.612000000000002
- type: precision_at_5
value: 26.122
- type: recall_at_1
value: 3.3550000000000004
- type: recall_at_10
value: 16.41
- type: recall_at_100
value: 47.272
- type: recall_at_1000
value: 83.584
- type: recall_at_20
value: 24.091
- type: recall_at_3
value: 6.8180000000000005
- type: recall_at_5
value: 9.677
- type: main_score
value: 26.298
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification (default)
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 91.2890625
- type: ap
value: 33.95547153875715
- type: ap_weighted
value: 33.95547153875715
- type: f1
value: 75.10768597556462
- type: f1_weighted
value: 92.00161208992606
- type: main_score
value: 91.2890625
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification (default)
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 71.3978494623656
- type: f1
value: 71.7194818511814
- type: f1_weighted
value: 71.13860187349744
- type: main_score
value: 71.3978494623656
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering (default)
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: main_score
value: 52.4921688720602
- type: v_measure
value: 52.4921688720602
- type: v_measure_std
value: 0.992768152658908
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015 (default)
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cosine_accuracy
value: 85.11652858079513
- type: cosine_accuracy_threshold
value: 87.90839910507202
- type: cosine_ap
value: 70.90459908851724
- type: cosine_f1
value: 65.66581227877457
- type: cosine_f1_threshold
value: 85.13308763504028
- type: cosine_precision
value: 61.094708153531684
- type: cosine_recall
value: 70.97625329815304
- type: dot_accuracy
value: 83.41181379269239
- type: dot_accuracy_threshold
value: 43110.113525390625
- type: dot_ap
value: 65.64869491143095
- type: dot_f1
value: 62.05308447460914
- type: dot_f1_threshold
value: 41412.542724609375
- type: dot_precision
value: 57.38623626989464
- type: dot_recall
value: 67.54617414248021
- type: euclidean_accuracy
value: 85.15229182809799
- type: euclidean_accuracy_threshold
value: 1043.08500289917
- type: euclidean_ap
value: 70.71204383269375
- type: euclidean_f1
value: 65.20304568527919
- type: euclidean_f1_threshold
value: 1179.2595863342285
- type: euclidean_precision
value: 62.81173594132029
- type: euclidean_recall
value: 67.78364116094987
- type: main_score
value: 70.90459908851724
- type: manhattan_accuracy
value: 85.1820945341837
- type: manhattan_accuracy_threshold
value: 26115.0390625
- type: manhattan_ap
value: 70.66113937117431
- type: manhattan_f1
value: 65.33383628819313
- type: manhattan_f1_threshold
value: 29105.181884765625
- type: manhattan_precision
value: 62.40691808791736
- type: manhattan_recall
value: 68.54881266490766
- type: max_accuracy
value: 85.1820945341837
- type: max_ap
value: 70.90459908851724
- type: max_f1
value: 65.66581227877457
- type: max_precision
value: 62.81173594132029
- type: max_recall
value: 70.97625329815304
- type: similarity_accuracy
value: 85.11652858079513
- type: similarity_accuracy_threshold
value: 87.90839910507202
- type: similarity_ap
value: 70.90459908851724
- type: similarity_f1
value: 65.66581227877457
- type: similarity_f1_threshold
value: 85.13308763504028
- type: similarity_precision
value: 61.094708153531684
- type: similarity_recall
value: 70.97625329815304
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus (default)
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cosine_accuracy
value: 88.10299996119068
- type: cosine_accuracy_threshold
value: 84.34982895851135
- type: cosine_ap
value: 84.13755787769226
- type: cosine_f1
value: 76.0967548076923
- type: cosine_f1_threshold
value: 82.8936219215393
- type: cosine_precision
value: 74.28864769727193
- type: cosine_recall
value: 77.99507237449954
- type: dot_accuracy
value: 86.64182869561843
- type: dot_accuracy_threshold
value: 38794.677734375
- type: dot_ap
value: 80.20301567411457
- type: dot_f1
value: 73.50650291634967
- type: dot_f1_threshold
value: 37447.23205566406
- type: dot_precision
value: 69.41498460485802
- type: dot_recall
value: 78.11056359716662
- type: euclidean_accuracy
value: 87.9361198432103
- type: euclidean_accuracy_threshold
value: 1184.421157836914
- type: euclidean_ap
value: 83.79582690117218
- type: euclidean_f1
value: 75.81431709042175
- type: euclidean_f1_threshold
value: 1258.2727432250977
- type: euclidean_precision
value: 73.39099099099099
- type: euclidean_recall
value: 78.40314136125654
- type: main_score
value: 84.13755787769226
- type: manhattan_accuracy
value: 87.96134590755618
- type: manhattan_accuracy_threshold
value: 29077.291870117188
- type: manhattan_ap
value: 83.79487172269923
- type: manhattan_f1
value: 75.82421603424935
- type: manhattan_f1_threshold
value: 31224.124145507812
- type: manhattan_precision
value: 72.24740255212329
- type: manhattan_recall
value: 79.77363720357253
- type: max_accuracy
value: 88.10299996119068
- type: max_ap
value: 84.13755787769226
- type: max_f1
value: 76.0967548076923
- type: max_precision
value: 74.28864769727193
- type: max_recall
value: 79.77363720357253
- type: similarity_accuracy
value: 88.10299996119068
- type: similarity_accuracy_threshold
value: 84.34982895851135
- type: similarity_ap
value: 84.13755787769226
- type: similarity_f1
value: 76.0967548076923
- type: similarity_f1_threshold
value: 82.8936219215393
- type: similarity_precision
value: 74.28864769727193
- type: similarity_recall
value: 77.99507237449954
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval (default)
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: main_score
value: 70.433
- type: map_at_1
value: 55.7
- type: map_at_10
value: 66.013
- type: map_at_100
value: 66.534
- type: map_at_1000
value: 66.547
- type: map_at_20
value: 66.334
- type: map_at_3
value: 64.2
- type: map_at_5
value: 65.445
- type: mrr_at_1
value: 55.7
- type: mrr_at_10
value: 66.01329365079364
- type: mrr_at_100
value: 66.53350061744233
- type: mrr_at_1000
value: 66.54744831962995
- type: mrr_at_20
value: 66.3335147364675
- type: mrr_at_3
value: 64.2
- type: mrr_at_5
value: 65.44500000000002
- type: nauc_map_at_1000_diff1
value: 76.26428836976245
- type: nauc_map_at_1000_max
value: 35.41847367373575
- type: nauc_map_at_1000_std
value: -33.04639860831992
- type: nauc_map_at_100_diff1
value: 76.25793229023193
- type: nauc_map_at_100_max
value: 35.43663260110076
- type: nauc_map_at_100_std
value: -33.04238139882945
- type: nauc_map_at_10_diff1
value: 76.2108281297711
- type: nauc_map_at_10_max
value: 35.59442419423183
- type: nauc_map_at_10_std
value: -33.32346518997277
- type: nauc_map_at_1_diff1
value: 79.17728405262736
- type: nauc_map_at_1_max
value: 31.880738163589527
- type: nauc_map_at_1_std
value: -30.891888718004584
- type: nauc_map_at_20_diff1
value: 76.2181333410193
- type: nauc_map_at_20_max
value: 35.43448818430876
- type: nauc_map_at_20_std
value: -33.35682442863193
- type: nauc_map_at_3_diff1
value: 76.10046541433466
- type: nauc_map_at_3_max
value: 34.6831278555291
- type: nauc_map_at_3_std
value: -34.030826044831116
- type: nauc_map_at_5_diff1
value: 75.96513023582064
- type: nauc_map_at_5_max
value: 34.66920832438069
- type: nauc_map_at_5_std
value: -33.79799777830796
- type: nauc_mrr_at_1000_diff1
value: 76.26428836976245
- type: nauc_mrr_at_1000_max
value: 35.41847367373575
- type: nauc_mrr_at_1000_std
value: -33.04639860831992
- type: nauc_mrr_at_100_diff1
value: 76.25793229023193
- type: nauc_mrr_at_100_max
value: 35.43663260110076
- type: nauc_mrr_at_100_std
value: -33.04238139882945
- type: nauc_mrr_at_10_diff1
value: 76.2108281297711
- type: nauc_mrr_at_10_max
value: 35.59442419423183
- type: nauc_mrr_at_10_std
value: -33.32346518997277
- type: nauc_mrr_at_1_diff1
value: 79.17728405262736
- type: nauc_mrr_at_1_max
value: 31.880738163589527
- type: nauc_mrr_at_1_std
value: -30.891888718004584
- type: nauc_mrr_at_20_diff1
value: 76.2181333410193
- type: nauc_mrr_at_20_max
value: 35.43448818430876
- type: nauc_mrr_at_20_std
value: -33.35682442863193
- type: nauc_mrr_at_3_diff1
value: 76.10046541433466
- type: nauc_mrr_at_3_max
value: 34.6831278555291
- type: nauc_mrr_at_3_std
value: -34.030826044831116
- type: nauc_mrr_at_5_diff1
value: 75.96513023582064
- type: nauc_mrr_at_5_max
value: 34.66920832438069
- type: nauc_mrr_at_5_std
value: -33.79799777830796
- type: nauc_ndcg_at_1000_diff1
value: 75.68118206798317
- type: nauc_ndcg_at_1000_max
value: 37.12252980787349
- type: nauc_ndcg_at_1000_std
value: -31.457578337430505
- type: nauc_ndcg_at_100_diff1
value: 75.46730761564156
- type: nauc_ndcg_at_100_max
value: 37.549890025544265
- type: nauc_ndcg_at_100_std
value: -31.35066985945112
- type: nauc_ndcg_at_10_diff1
value: 75.09890404887037
- type: nauc_ndcg_at_10_max
value: 38.024147790014204
- type: nauc_ndcg_at_10_std
value: -33.67408368593356
- type: nauc_ndcg_at_1_diff1
value: 79.17728405262736
- type: nauc_ndcg_at_1_max
value: 31.880738163589527
- type: nauc_ndcg_at_1_std
value: -30.891888718004584
- type: nauc_ndcg_at_20_diff1
value: 75.12977548171354
- type: nauc_ndcg_at_20_max
value: 37.524926748917956
- type: nauc_ndcg_at_20_std
value: -33.771344674947485
- type: nauc_ndcg_at_3_diff1
value: 74.94037476984154
- type: nauc_ndcg_at_3_max
value: 35.60345554050552
- type: nauc_ndcg_at_3_std
value: -35.256991346321854
- type: nauc_ndcg_at_5_diff1
value: 74.54265907753783
- type: nauc_ndcg_at_5_max
value: 35.57662819978585
- type: nauc_ndcg_at_5_std
value: -34.879794448418465
- type: nauc_precision_at_1000_diff1
value: 74.52277207179142
- type: nauc_precision_at_1000_max
value: 94.25510945118707
- type: nauc_precision_at_1000_std
value: 91.6874157070222
- type: nauc_precision_at_100_diff1
value: 65.98346655735419
- type: nauc_precision_at_100_max
value: 78.81168727653687
- type: nauc_precision_at_100_std
value: 27.241465691967708
- type: nauc_precision_at_10_diff1
value: 69.55050319096688
- type: nauc_precision_at_10_max
value: 51.827749140893374
- type: nauc_precision_at_10_std
value: -34.60818605792837
- type: nauc_precision_at_1_diff1
value: 79.17728405262736
- type: nauc_precision_at_1_max
value: 31.880738163589527
- type: nauc_precision_at_1_std
value: -30.891888718004584
- type: nauc_precision_at_20_diff1
value: 68.08078305042736
- type: nauc_precision_at_20_max
value: 52.83318878288501
- type: nauc_precision_at_20_std
value: -35.46070292817927
- type: nauc_precision_at_3_diff1
value: 70.76249609881901
- type: nauc_precision_at_3_max
value: 38.86561868624655
- type: nauc_precision_at_3_std
value: -39.68917853446992
- type: nauc_precision_at_5_diff1
value: 68.39110629013278
- type: nauc_precision_at_5_max
value: 39.28677163904683
- type: nauc_precision_at_5_std
value: -39.39101423819562
- type: nauc_recall_at_1000_diff1
value: 74.52277207179175
- type: nauc_recall_at_1000_max
value: 94.25510945118776
- type: nauc_recall_at_1000_std
value: 91.68741570702382
- type: nauc_recall_at_100_diff1
value: 65.9834665573548
- type: nauc_recall_at_100_max
value: 78.81168727653679
- type: nauc_recall_at_100_std
value: 27.241465691967598
- type: nauc_recall_at_10_diff1
value: 69.55050319096708
- type: nauc_recall_at_10_max
value: 51.82774914089347
- type: nauc_recall_at_10_std
value: -34.6081860579283
- type: nauc_recall_at_1_diff1
value: 79.17728405262736
- type: nauc_recall_at_1_max
value: 31.880738163589527
- type: nauc_recall_at_1_std
value: -30.891888718004584
- type: nauc_recall_at_20_diff1
value: 68.08078305042746
- type: nauc_recall_at_20_max
value: 52.833188782885244
- type: nauc_recall_at_20_std
value: -35.46070292817895
- type: nauc_recall_at_3_diff1
value: 70.76249609881896
- type: nauc_recall_at_3_max
value: 38.865618686246464
- type: nauc_recall_at_3_std
value: -39.68917853446999
- type: nauc_recall_at_5_diff1
value: 68.39110629013274
- type: nauc_recall_at_5_max
value: 39.28677163904688
- type: nauc_recall_at_5_std
value: -39.39101423819562
- type: ndcg_at_1
value: 55.7
- type: ndcg_at_10
value: 70.433
- type: ndcg_at_100
value: 72.975
- type: ndcg_at_1000
value: 73.283
- type: ndcg_at_20
value: 71.58
- type: ndcg_at_3
value: 66.83099999999999
- type: ndcg_at_5
value: 69.085
- type: precision_at_1
value: 55.7
- type: precision_at_10
value: 8.4
- type: precision_at_100
value: 0.959
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.425
- type: precision_at_3
value: 24.8
- type: precision_at_5
value: 15.98
- type: recall_at_1
value: 55.7
- type: recall_at_10
value: 84.0
- type: recall_at_100
value: 95.89999999999999
- type: recall_at_1000
value: 98.2
- type: recall_at_20
value: 88.5
- type: recall_at_3
value: 74.4
- type: recall_at_5
value: 79.9
- task:
type: Classification
dataset:
name: MTEB Waimai (default)
type: C-MTEB/waimai-classification
config: default
split: test
revision: 339287def212450dcaa9df8c22bf93e9980c7023
metrics:
- type: accuracy
value: 86.58999999999999
- type: ap
value: 70.02619249927523
- type: ap_weighted
value: 70.02619249927523
- type: f1
value: 84.97572770889423
- type: f1_weighted
value: 86.6865713531272
- type: main_score
value: 86.58999999999999
- task:
type: Retrieval
dataset:
name: MTEB XMarket (en)
type: jinaai/xmarket_ml
config: en
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: main_score
value: 34.772999999999996
- type: map_at_1
value: 7.2620000000000005
- type: map_at_10
value: 17.98
- type: map_at_100
value: 24.828
- type: map_at_1000
value: 26.633000000000003
- type: map_at_20
value: 20.699
- type: map_at_3
value: 12.383
- type: map_at_5
value: 14.871
- type: mrr_at_1
value: 34.718100890207715
- type: mrr_at_10
value: 43.9336827525092
- type: mrr_at_100
value: 44.66474011066837
- type: mrr_at_1000
value: 44.7075592197356
- type: mrr_at_20
value: 44.35984436569346
- type: mrr_at_3
value: 41.73901893981052
- type: mrr_at_5
value: 43.025973550207134
- type: nauc_map_at_1000_diff1
value: 13.899869081196364
- type: nauc_map_at_1000_max
value: 46.60452816386231
- type: nauc_map_at_1000_std
value: 24.87925799401773
- type: nauc_map_at_100_diff1
value: 16.164805650871084
- type: nauc_map_at_100_max
value: 44.720912958558095
- type: nauc_map_at_100_std
value: 20.236734536210477
- type: nauc_map_at_10_diff1
value: 23.58580520913581
- type: nauc_map_at_10_max
value: 31.276151869914216
- type: nauc_map_at_10_std
value: -0.1833326246041355
- type: nauc_map_at_1_diff1
value: 37.02663305598722
- type: nauc_map_at_1_max
value: 14.931071531116528
- type: nauc_map_at_1_std
value: -12.478790028708453
- type: nauc_map_at_20_diff1
value: 20.718297881540593
- type: nauc_map_at_20_max
value: 36.62264094841859
- type: nauc_map_at_20_std
value: 6.658514770057742
- type: nauc_map_at_3_diff1
value: 29.379034581120006
- type: nauc_map_at_3_max
value: 21.387214269548803
- type: nauc_map_at_3_std
value: -9.3404121914247
- type: nauc_map_at_5_diff1
value: 26.627169792839485
- type: nauc_map_at_5_max
value: 25.393331109666388
- type: nauc_map_at_5_std
value: -6.023485287246353
- type: nauc_mrr_at_1000_diff1
value: 12.047232036652295
- type: nauc_mrr_at_1000_max
value: 46.611862580860645
- type: nauc_mrr_at_1000_std
value: 27.89146066442305
- type: nauc_mrr_at_100_diff1
value: 12.05261747449997
- type: nauc_mrr_at_100_max
value: 46.61328535381203
- type: nauc_mrr_at_100_std
value: 27.886145596874535
- type: nauc_mrr_at_10_diff1
value: 12.006935553036941
- type: nauc_mrr_at_10_max
value: 46.53351686240496
- type: nauc_mrr_at_10_std
value: 27.708742470257462
- type: nauc_mrr_at_1_diff1
value: 13.323408127738782
- type: nauc_mrr_at_1_max
value: 43.78884661002012
- type: nauc_mrr_at_1_std
value: 25.164417588165673
- type: nauc_mrr_at_20_diff1
value: 12.036022973968011
- type: nauc_mrr_at_20_max
value: 46.56537838037131
- type: nauc_mrr_at_20_std
value: 27.78189157249635
- type: nauc_mrr_at_3_diff1
value: 11.943896700976381
- type: nauc_mrr_at_3_max
value: 46.33644663073225
- type: nauc_mrr_at_3_std
value: 27.523915405053845
- type: nauc_mrr_at_5_diff1
value: 12.03108009033769
- type: nauc_mrr_at_5_max
value: 46.49103616896692
- type: nauc_mrr_at_5_std
value: 27.630879129863366
- type: nauc_ndcg_at_1000_diff1
value: 9.766823796017324
- type: nauc_ndcg_at_1000_max
value: 52.85844801910602
- type: nauc_ndcg_at_1000_std
value: 36.43271437761207
- type: nauc_ndcg_at_100_diff1
value: 12.035059298282036
- type: nauc_ndcg_at_100_max
value: 50.05520240705682
- type: nauc_ndcg_at_100_std
value: 29.87678724506636
- type: nauc_ndcg_at_10_diff1
value: 10.281893031139424
- type: nauc_ndcg_at_10_max
value: 47.02153679426017
- type: nauc_ndcg_at_10_std
value: 26.624948330369126
- type: nauc_ndcg_at_1_diff1
value: 13.323408127738782
- type: nauc_ndcg_at_1_max
value: 43.78884661002012
- type: nauc_ndcg_at_1_std
value: 25.164417588165673
- type: nauc_ndcg_at_20_diff1
value: 11.463524849646598
- type: nauc_ndcg_at_20_max
value: 47.415073186019704
- type: nauc_ndcg_at_20_std
value: 26.359019620164307
- type: nauc_ndcg_at_3_diff1
value: 9.689199913805394
- type: nauc_ndcg_at_3_max
value: 45.68151849572808
- type: nauc_ndcg_at_3_std
value: 26.559193219799486
- type: nauc_ndcg_at_5_diff1
value: 9.448823370356575
- type: nauc_ndcg_at_5_max
value: 46.19999662690141
- type: nauc_ndcg_at_5_std
value: 26.8411706726069
- type: nauc_precision_at_1000_diff1
value: -20.379065598727024
- type: nauc_precision_at_1000_max
value: 13.162562437268427
- type: nauc_precision_at_1000_std
value: 22.658226157785812
- type: nauc_precision_at_100_diff1
value: -16.458155977309282
- type: nauc_precision_at_100_max
value: 35.97956789169889
- type: nauc_precision_at_100_std
value: 48.878375009979194
- type: nauc_precision_at_10_diff1
value: -7.810992317607771
- type: nauc_precision_at_10_max
value: 49.307339277444754
- type: nauc_precision_at_10_std
value: 42.82533951854582
- type: nauc_precision_at_1_diff1
value: 13.323408127738782
- type: nauc_precision_at_1_max
value: 43.78884661002012
- type: nauc_precision_at_1_std
value: 25.164417588165673
- type: nauc_precision_at_20_diff1
value: -11.43933465149542
- type: nauc_precision_at_20_max
value: 46.93722753460038
- type: nauc_precision_at_20_std
value: 47.36223769029678
- type: nauc_precision_at_3_diff1
value: 1.3230178593599737
- type: nauc_precision_at_3_max
value: 48.49039534395576
- type: nauc_precision_at_3_std
value: 33.161384183129194
- type: nauc_precision_at_5_diff1
value: -3.185516457926519
- type: nauc_precision_at_5_max
value: 49.5814309394308
- type: nauc_precision_at_5_std
value: 37.57637865900281
- type: nauc_recall_at_1000_diff1
value: 7.839499443984168
- type: nauc_recall_at_1000_max
value: 52.67165467640894
- type: nauc_recall_at_1000_std
value: 48.85318316702583
- type: nauc_recall_at_100_diff1
value: 14.117557049589418
- type: nauc_recall_at_100_max
value: 40.59046301348715
- type: nauc_recall_at_100_std
value: 24.379680901739505
- type: nauc_recall_at_10_diff1
value: 20.04536052614054
- type: nauc_recall_at_10_max
value: 25.54148839721574
- type: nauc_recall_at_10_std
value: -1.938182527562211
- type: nauc_recall_at_1_diff1
value: 37.02663305598722
- type: nauc_recall_at_1_max
value: 14.931071531116528
- type: nauc_recall_at_1_std
value: -12.478790028708453
- type: nauc_recall_at_20_diff1
value: 17.959977483235566
- type: nauc_recall_at_20_max
value: 29.88502687870809
- type: nauc_recall_at_20_std
value: 4.26527395196852
- type: nauc_recall_at_3_diff1
value: 26.297810954500456
- type: nauc_recall_at_3_max
value: 18.819406079307402
- type: nauc_recall_at_3_std
value: -10.002237229729081
- type: nauc_recall_at_5_diff1
value: 22.739080899568485
- type: nauc_recall_at_5_max
value: 21.0322968243985
- type: nauc_recall_at_5_std
value: -6.927749435306422
- type: ndcg_at_1
value: 34.717999999999996
- type: ndcg_at_10
value: 34.772999999999996
- type: ndcg_at_100
value: 39.407
- type: ndcg_at_1000
value: 44.830999999999996
- type: ndcg_at_20
value: 35.667
- type: ndcg_at_3
value: 34.332
- type: ndcg_at_5
value: 34.408
- type: precision_at_1
value: 34.717999999999996
- type: precision_at_10
value: 23.430999999999997
- type: precision_at_100
value: 9.31
- type: precision_at_1000
value: 2.259
- type: precision_at_20
value: 18.826999999999998
- type: precision_at_3
value: 30.553
- type: precision_at_5
value: 27.792
- type: recall_at_1
value: 7.2620000000000005
- type: recall_at_10
value: 26.384
- type: recall_at_100
value: 52.506
- type: recall_at_1000
value: 73.38
- type: recall_at_20
value: 34.032000000000004
- type: recall_at_3
value: 14.821000000000002
- type: recall_at_5
value: 19.481
- task:
type: Retrieval
dataset:
name: MTEB XMarket (de)
type: jinaai/xmarket_ml
config: de
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: main_score
value: 28.316000000000003
- type: map_at_1
value: 8.667
- type: map_at_10
value: 17.351
- type: map_at_100
value: 21.02
- type: map_at_1000
value: 21.951
- type: map_at_20
value: 18.994
- type: map_at_3
value: 13.23
- type: map_at_5
value: 15.17
- type: mrr_at_1
value: 27.27272727272727
- type: mrr_at_10
value: 36.10858487561485
- type: mrr_at_100
value: 36.92033814316568
- type: mrr_at_1000
value: 36.972226653870365
- type: mrr_at_20
value: 36.58914906427944
- type: mrr_at_3
value: 33.642969201552305
- type: mrr_at_5
value: 35.13417554289494
- type: nauc_map_at_1000_diff1
value: 23.345116790998063
- type: nauc_map_at_1000_max
value: 44.447240670835725
- type: nauc_map_at_1000_std
value: 18.34636500680144
- type: nauc_map_at_100_diff1
value: 24.458120909292347
- type: nauc_map_at_100_max
value: 43.31851431140378
- type: nauc_map_at_100_std
value: 15.654778355549965
- type: nauc_map_at_10_diff1
value: 29.376508937265044
- type: nauc_map_at_10_max
value: 36.650196725140795
- type: nauc_map_at_10_std
value: 4.682465435374843
- type: nauc_map_at_1_diff1
value: 40.382365672683214
- type: nauc_map_at_1_max
value: 22.894341150096785
- type: nauc_map_at_1_std
value: -5.610725673968323
- type: nauc_map_at_20_diff1
value: 27.197033425732908
- type: nauc_map_at_20_max
value: 39.71672400647207
- type: nauc_map_at_20_std
value: 8.944436813309933
- type: nauc_map_at_3_diff1
value: 34.49739294661502
- type: nauc_map_at_3_max
value: 29.006972420735284
- type: nauc_map_at_3_std
value: -3.0372650571243986
- type: nauc_map_at_5_diff1
value: 32.764901537277105
- type: nauc_map_at_5_max
value: 32.658533295918154
- type: nauc_map_at_5_std
value: 0.029626452286996906
- type: nauc_mrr_at_1000_diff1
value: 19.521229956280603
- type: nauc_mrr_at_1000_max
value: 44.39409866211472
- type: nauc_mrr_at_1000_std
value: 23.580697307036058
- type: nauc_mrr_at_100_diff1
value: 19.51312676591073
- type: nauc_mrr_at_100_max
value: 44.39559153963895
- type: nauc_mrr_at_100_std
value: 23.57913711397437
- type: nauc_mrr_at_10_diff1
value: 19.584635617935145
- type: nauc_mrr_at_10_max
value: 44.44842226236198
- type: nauc_mrr_at_10_std
value: 23.382684909390434
- type: nauc_mrr_at_1_diff1
value: 20.92594790923806
- type: nauc_mrr_at_1_max
value: 40.593939625252816
- type: nauc_mrr_at_1_std
value: 20.37467598073644
- type: nauc_mrr_at_20_diff1
value: 19.590641822115725
- type: nauc_mrr_at_20_max
value: 44.42512299604718
- type: nauc_mrr_at_20_std
value: 23.45564260800024
- type: nauc_mrr_at_3_diff1
value: 20.005307129527232
- type: nauc_mrr_at_3_max
value: 43.68300366192776
- type: nauc_mrr_at_3_std
value: 22.297190480842005
- type: nauc_mrr_at_5_diff1
value: 19.852896386271716
- type: nauc_mrr_at_5_max
value: 44.20641808920062
- type: nauc_mrr_at_5_std
value: 22.966517330852895
- type: nauc_ndcg_at_1000_diff1
value: 17.800116251376103
- type: nauc_ndcg_at_1000_max
value: 50.98332718061365
- type: nauc_ndcg_at_1000_std
value: 31.464484658102577
- type: nauc_ndcg_at_100_diff1
value: 19.555159680541088
- type: nauc_ndcg_at_100_max
value: 48.56377130899141
- type: nauc_ndcg_at_100_std
value: 25.77572748714817
- type: nauc_ndcg_at_10_diff1
value: 20.003008726679415
- type: nauc_ndcg_at_10_max
value: 45.1293725480628
- type: nauc_ndcg_at_10_std
value: 21.149213260765872
- type: nauc_ndcg_at_1_diff1
value: 21.00986278773023
- type: nauc_ndcg_at_1_max
value: 40.524637076774894
- type: nauc_ndcg_at_1_std
value: 20.29682194006685
- type: nauc_ndcg_at_20_diff1
value: 20.659734137312284
- type: nauc_ndcg_at_20_max
value: 45.73108736599869
- type: nauc_ndcg_at_20_std
value: 21.200736170346133
- type: nauc_ndcg_at_3_diff1
value: 19.200120542882544
- type: nauc_ndcg_at_3_max
value: 42.89772612963168
- type: nauc_ndcg_at_3_std
value: 20.713292754978983
- type: nauc_ndcg_at_5_diff1
value: 19.96329647992544
- type: nauc_ndcg_at_5_max
value: 44.296627037787324
- type: nauc_ndcg_at_5_std
value: 21.200135784971973
- type: nauc_precision_at_1000_diff1
value: -11.543221249009427
- type: nauc_precision_at_1000_max
value: 9.132801614448221
- type: nauc_precision_at_1000_std
value: 21.203720655381055
- type: nauc_precision_at_100_diff1
value: -12.510945425786039
- type: nauc_precision_at_100_max
value: 31.42530963666252
- type: nauc_precision_at_100_std
value: 44.99672783467617
- type: nauc_precision_at_10_diff1
value: -4.025802651746804
- type: nauc_precision_at_10_max
value: 47.50967924227793
- type: nauc_precision_at_10_std
value: 41.1558559268985
- type: nauc_precision_at_1_diff1
value: 21.00986278773023
- type: nauc_precision_at_1_max
value: 40.524637076774894
- type: nauc_precision_at_1_std
value: 20.29682194006685
- type: nauc_precision_at_20_diff1
value: -8.059482951110002
- type: nauc_precision_at_20_max
value: 44.28832115946278
- type: nauc_precision_at_20_std
value: 45.2005585353651
- type: nauc_precision_at_3_diff1
value: 8.53530005716248
- type: nauc_precision_at_3_max
value: 46.48353678905102
- type: nauc_precision_at_3_std
value: 28.868791323881972
- type: nauc_precision_at_5_diff1
value: 3.093619954821814
- type: nauc_precision_at_5_max
value: 48.43294475817019
- type: nauc_precision_at_5_std
value: 34.83430452745434
- type: nauc_recall_at_1000_diff1
value: 9.93680206699751
- type: nauc_recall_at_1000_max
value: 52.97840222394363
- type: nauc_recall_at_1000_std
value: 46.370023604436255
- type: nauc_recall_at_100_diff1
value: 14.100542445524972
- type: nauc_recall_at_100_max
value: 42.853775131475224
- type: nauc_recall_at_100_std
value: 26.93029971231028
- type: nauc_recall_at_10_diff1
value: 22.774547475714716
- type: nauc_recall_at_10_max
value: 33.984586405015044
- type: nauc_recall_at_10_std
value: 5.332325172373655
- type: nauc_recall_at_1_diff1
value: 40.382365672683214
- type: nauc_recall_at_1_max
value: 22.894341150096785
- type: nauc_recall_at_1_std
value: -5.610725673968323
- type: nauc_recall_at_20_diff1
value: 19.751060483835936
- type: nauc_recall_at_20_max
value: 36.18774034635102
- type: nauc_recall_at_20_std
value: 10.362242090308577
- type: nauc_recall_at_3_diff1
value: 30.29462372902671
- type: nauc_recall_at_3_max
value: 27.377175450099635
- type: nauc_recall_at_3_std
value: -3.015752705993425
- type: nauc_recall_at_5_diff1
value: 28.096893312615723
- type: nauc_recall_at_5_max
value: 30.485075571512425
- type: nauc_recall_at_5_std
value: 0.09106417003502826
- type: ndcg_at_1
value: 27.248
- type: ndcg_at_10
value: 28.316000000000003
- type: ndcg_at_100
value: 33.419
- type: ndcg_at_1000
value: 38.134
- type: ndcg_at_20
value: 29.707
- type: ndcg_at_3
value: 26.93
- type: ndcg_at_5
value: 27.363
- type: precision_at_1
value: 27.248
- type: precision_at_10
value: 15.073
- type: precision_at_100
value: 5.061
- type: precision_at_1000
value: 1.325
- type: precision_at_20
value: 11.407
- type: precision_at_3
value: 21.823
- type: precision_at_5
value: 18.984
- type: recall_at_1
value: 8.667
- type: recall_at_10
value: 26.984
- type: recall_at_100
value: 49.753
- type: recall_at_1000
value: 70.354
- type: recall_at_20
value: 33.955999999999996
- type: recall_at_3
value: 16.086
- type: recall_at_5
value: 20.544999999999998
- task:
type: Retrieval
dataset:
name: MTEB XMarket (es)
type: jinaai/xmarket_ml
config: es
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: main_score
value: 26.592
- type: map_at_1
value: 8.081000000000001
- type: map_at_10
value: 16.486
- type: map_at_100
value: 19.996
- type: map_at_1000
value: 20.889
- type: map_at_20
value: 18.088
- type: map_at_3
value: 12.864
- type: map_at_5
value: 14.515
- type: mrr_at_1
value: 24.643356643356643
- type: mrr_at_10
value: 33.755599955599926
- type: mrr_at_100
value: 34.55914769326114
- type: mrr_at_1000
value: 34.614384237219745
- type: mrr_at_20
value: 34.228909650276194
- type: mrr_at_3
value: 31.445221445221456
- type: mrr_at_5
value: 32.71375291375297
- type: nauc_map_at_1000_diff1
value: 19.17751654240679
- type: nauc_map_at_1000_max
value: 43.493743561136434
- type: nauc_map_at_1000_std
value: 21.14477911550252
- type: nauc_map_at_100_diff1
value: 20.259227234415395
- type: nauc_map_at_100_max
value: 42.510860292169106
- type: nauc_map_at_100_std
value: 18.63085160442346
- type: nauc_map_at_10_diff1
value: 24.12419385640694
- type: nauc_map_at_10_max
value: 35.99892932069915
- type: nauc_map_at_10_std
value: 8.488520124325058
- type: nauc_map_at_1_diff1
value: 35.09239143996649
- type: nauc_map_at_1_max
value: 23.72498533914286
- type: nauc_map_at_1_std
value: -4.164387883546102
- type: nauc_map_at_20_diff1
value: 22.411418237320817
- type: nauc_map_at_20_max
value: 39.12496266094892
- type: nauc_map_at_20_std
value: 12.371656353894227
- type: nauc_map_at_3_diff1
value: 28.106972376813506
- type: nauc_map_at_3_max
value: 29.57824316865409
- type: nauc_map_at_3_std
value: 1.8928791254813127
- type: nauc_map_at_5_diff1
value: 26.4958239149419
- type: nauc_map_at_5_max
value: 32.45906016649239
- type: nauc_map_at_5_std
value: 4.612735963224018
- type: nauc_mrr_at_1000_diff1
value: 17.614812607094446
- type: nauc_mrr_at_1000_max
value: 41.13031556228715
- type: nauc_mrr_at_1000_std
value: 22.564112871230318
- type: nauc_mrr_at_100_diff1
value: 17.614044568011085
- type: nauc_mrr_at_100_max
value: 41.129436273086796
- type: nauc_mrr_at_100_std
value: 22.566763500658766
- type: nauc_mrr_at_10_diff1
value: 17.61869494452089
- type: nauc_mrr_at_10_max
value: 41.091542329381426
- type: nauc_mrr_at_10_std
value: 22.370473458633594
- type: nauc_mrr_at_1_diff1
value: 20.321421442201913
- type: nauc_mrr_at_1_max
value: 38.36531448180009
- type: nauc_mrr_at_1_std
value: 18.422203207777688
- type: nauc_mrr_at_20_diff1
value: 17.614767736091625
- type: nauc_mrr_at_20_max
value: 41.11221420736687
- type: nauc_mrr_at_20_std
value: 22.44271891522012
- type: nauc_mrr_at_3_diff1
value: 17.98184651584625
- type: nauc_mrr_at_3_max
value: 40.424293610470144
- type: nauc_mrr_at_3_std
value: 21.554750947206706
- type: nauc_mrr_at_5_diff1
value: 17.72088314927416
- type: nauc_mrr_at_5_max
value: 40.662724739072694
- type: nauc_mrr_at_5_std
value: 21.822957528431928
- type: nauc_ndcg_at_1000_diff1
value: 15.310699428328398
- type: nauc_ndcg_at_1000_max
value: 48.83921393349997
- type: nauc_ndcg_at_1000_std
value: 32.22600294110774
- type: nauc_ndcg_at_100_diff1
value: 16.62672763977423
- type: nauc_ndcg_at_100_max
value: 47.36060653537392
- type: nauc_ndcg_at_100_std
value: 27.879865162871575
- type: nauc_ndcg_at_10_diff1
value: 16.436684176028116
- type: nauc_ndcg_at_10_max
value: 43.00026520872974
- type: nauc_ndcg_at_10_std
value: 22.507354939162806
- type: nauc_ndcg_at_1_diff1
value: 20.321421442201913
- type: nauc_ndcg_at_1_max
value: 38.36531448180009
- type: nauc_ndcg_at_1_std
value: 18.422203207777688
- type: nauc_ndcg_at_20_diff1
value: 17.127747123248835
- type: nauc_ndcg_at_20_max
value: 44.57322943752733
- type: nauc_ndcg_at_20_std
value: 23.146541187377036
- type: nauc_ndcg_at_3_diff1
value: 16.372742984728514
- type: nauc_ndcg_at_3_max
value: 40.91938017883993
- type: nauc_ndcg_at_3_std
value: 21.50917089194154
- type: nauc_ndcg_at_5_diff1
value: 16.40486505525073
- type: nauc_ndcg_at_5_max
value: 41.94597203181329
- type: nauc_ndcg_at_5_std
value: 22.068260809047562
- type: nauc_precision_at_1000_diff1
value: -15.9415313729527
- type: nauc_precision_at_1000_max
value: 12.653329948983643
- type: nauc_precision_at_1000_std
value: 26.371820703256173
- type: nauc_precision_at_100_diff1
value: -11.851070166675289
- type: nauc_precision_at_100_max
value: 32.164365923950115
- type: nauc_precision_at_100_std
value: 45.930226426725426
- type: nauc_precision_at_10_diff1
value: -3.1352660378259163
- type: nauc_precision_at_10_max
value: 45.48359878733272
- type: nauc_precision_at_10_std
value: 40.2917038044196
- type: nauc_precision_at_1_diff1
value: 20.321421442201913
- type: nauc_precision_at_1_max
value: 38.36531448180009
- type: nauc_precision_at_1_std
value: 18.422203207777688
- type: nauc_precision_at_20_diff1
value: -7.087513342144751
- type: nauc_precision_at_20_max
value: 43.66272019058357
- type: nauc_precision_at_20_std
value: 44.22863351071686
- type: nauc_precision_at_3_diff1
value: 7.836185032609045
- type: nauc_precision_at_3_max
value: 44.85412904097269
- type: nauc_precision_at_3_std
value: 30.209139149500057
- type: nauc_precision_at_5_diff1
value: 3.028150537253791
- type: nauc_precision_at_5_max
value: 45.73661708882973
- type: nauc_precision_at_5_std
value: 34.65500311185052
- type: nauc_recall_at_1000_diff1
value: 9.526124668370704
- type: nauc_recall_at_1000_max
value: 51.4190208452196
- type: nauc_recall_at_1000_std
value: 45.694891695646426
- type: nauc_recall_at_100_diff1
value: 12.68466215400009
- type: nauc_recall_at_100_max
value: 42.79112054268112
- type: nauc_recall_at_100_std
value: 28.61954251400998
- type: nauc_recall_at_10_diff1
value: 17.95124413416829
- type: nauc_recall_at_10_max
value: 33.1192036755167
- type: nauc_recall_at_10_std
value: 9.3588175959525
- type: nauc_recall_at_1_diff1
value: 35.09239143996649
- type: nauc_recall_at_1_max
value: 23.72498533914286
- type: nauc_recall_at_1_std
value: -4.164387883546102
- type: nauc_recall_at_20_diff1
value: 16.24916980445646
- type: nauc_recall_at_20_max
value: 36.51316122236076
- type: nauc_recall_at_20_std
value: 13.641588062425736
- type: nauc_recall_at_3_diff1
value: 23.263199724138786
- type: nauc_recall_at_3_max
value: 27.67354561610614
- type: nauc_recall_at_3_std
value: 3.103127242654415
- type: nauc_recall_at_5_diff1
value: 20.719704839229635
- type: nauc_recall_at_5_max
value: 29.66480839111333
- type: nauc_recall_at_5_std
value: 5.514884455797986
- type: ndcg_at_1
value: 24.643
- type: ndcg_at_10
value: 26.592
- type: ndcg_at_100
value: 31.887
- type: ndcg_at_1000
value: 36.695
- type: ndcg_at_20
value: 28.166000000000004
- type: ndcg_at_3
value: 25.238
- type: ndcg_at_5
value: 25.545
- type: precision_at_1
value: 24.643
- type: precision_at_10
value: 13.730999999999998
- type: precision_at_100
value: 4.744000000000001
- type: precision_at_1000
value: 1.167
- type: precision_at_20
value: 10.562000000000001
- type: precision_at_3
value: 20.288999999999998
- type: precision_at_5
value: 17.337
- type: recall_at_1
value: 8.081000000000001
- type: recall_at_10
value: 25.911
- type: recall_at_100
value: 48.176
- type: recall_at_1000
value: 69.655
- type: recall_at_20
value: 32.924
- type: recall_at_3
value: 16.125
- type: recall_at_5
value: 19.988
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (deu-deu)
type: jinaai/xpqa
config: deu-deu
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 84.552
- type: map_at_1
value: 59.023
- type: map_at_10
value: 81.051
- type: map_at_100
value: 81.539
- type: map_at_1000
value: 81.54299999999999
- type: map_at_20
value: 81.401
- type: map_at_3
value: 76.969
- type: map_at_5
value: 80.07600000000001
- type: mrr_at_1
value: 77.67624020887729
- type: mrr_at_10
value: 83.30509967259314
- type: mrr_at_100
value: 83.58599391639456
- type: mrr_at_1000
value: 83.58970114722587
- type: mrr_at_20
value: 83.50275980440317
- type: mrr_at_3
value: 82.07136640557006
- type: mrr_at_5
value: 82.94604003481287
- type: nauc_map_at_1000_diff1
value: 63.12885104269942
- type: nauc_map_at_1000_max
value: 57.7017996674959
- type: nauc_map_at_1000_std
value: -24.951068985070513
- type: nauc_map_at_100_diff1
value: 63.12866509393162
- type: nauc_map_at_100_max
value: 57.70176426013332
- type: nauc_map_at_100_std
value: -24.96012290790273
- type: nauc_map_at_10_diff1
value: 62.847709436211204
- type: nauc_map_at_10_max
value: 57.408873624779524
- type: nauc_map_at_10_std
value: -25.635130363219062
- type: nauc_map_at_1_diff1
value: 71.89683981857102
- type: nauc_map_at_1_max
value: 20.204460967432645
- type: nauc_map_at_1_std
value: -23.07894656629493
- type: nauc_map_at_20_diff1
value: 63.00504457011043
- type: nauc_map_at_20_max
value: 57.66009512514262
- type: nauc_map_at_20_std
value: -25.100138593754885
- type: nauc_map_at_3_diff1
value: 63.199874607788274
- type: nauc_map_at_3_max
value: 47.54482033763308
- type: nauc_map_at_3_std
value: -27.714557098916963
- type: nauc_map_at_5_diff1
value: 63.01006523518669
- type: nauc_map_at_5_max
value: 56.501965964288495
- type: nauc_map_at_5_std
value: -25.367825762790925
- type: nauc_mrr_at_1000_diff1
value: 66.24988063948112
- type: nauc_mrr_at_1000_max
value: 63.56921667744273
- type: nauc_mrr_at_1000_std
value: -22.073973768031863
- type: nauc_mrr_at_100_diff1
value: 66.24919554296275
- type: nauc_mrr_at_100_max
value: 63.57382447608361
- type: nauc_mrr_at_100_std
value: -22.084627248538187
- type: nauc_mrr_at_10_diff1
value: 66.0143885124066
- type: nauc_mrr_at_10_max
value: 63.51277586011898
- type: nauc_mrr_at_10_std
value: -22.477523960705454
- type: nauc_mrr_at_1_diff1
value: 68.25415199323474
- type: nauc_mrr_at_1_max
value: 63.069019003272416
- type: nauc_mrr_at_1_std
value: -18.77085924093244
- type: nauc_mrr_at_20_diff1
value: 66.16203167351055
- type: nauc_mrr_at_20_max
value: 63.607477776215845
- type: nauc_mrr_at_20_std
value: -22.15083176017266
- type: nauc_mrr_at_3_diff1
value: 66.39368842782302
- type: nauc_mrr_at_3_max
value: 63.11411066585295
- type: nauc_mrr_at_3_std
value: -22.63174342814071
- type: nauc_mrr_at_5_diff1
value: 66.17932562332354
- type: nauc_mrr_at_5_max
value: 63.70434825329594
- type: nauc_mrr_at_5_std
value: -21.704012812430438
- type: nauc_ndcg_at_1000_diff1
value: 63.958010361549356
- type: nauc_ndcg_at_1000_max
value: 60.516445000134624
- type: nauc_ndcg_at_1000_std
value: -24.264672248289923
- type: nauc_ndcg_at_100_diff1
value: 63.97654644758022
- type: nauc_ndcg_at_100_max
value: 60.62187552803407
- type: nauc_ndcg_at_100_std
value: -24.317149225778312
- type: nauc_ndcg_at_10_diff1
value: 62.505321221321566
- type: nauc_ndcg_at_10_max
value: 59.77891112351258
- type: nauc_ndcg_at_10_std
value: -26.90910005589911
- type: nauc_ndcg_at_1_diff1
value: 68.25415199323474
- type: nauc_ndcg_at_1_max
value: 63.069019003272416
- type: nauc_ndcg_at_1_std
value: -18.77085924093244
- type: nauc_ndcg_at_20_diff1
value: 63.04281805056225
- type: nauc_ndcg_at_20_max
value: 60.600957307444226
- type: nauc_ndcg_at_20_std
value: -24.954862079889203
- type: nauc_ndcg_at_3_diff1
value: 62.970441139740316
- type: nauc_ndcg_at_3_max
value: 57.543715669055295
- type: nauc_ndcg_at_3_std
value: -25.659388431714703
- type: nauc_ndcg_at_5_diff1
value: 62.82652127664541
- type: nauc_ndcg_at_5_max
value: 58.6970443258532
- type: nauc_ndcg_at_5_std
value: -25.66329354851023
- type: nauc_precision_at_1000_diff1
value: -33.38530947486223
- type: nauc_precision_at_1000_max
value: 25.972468024345414
- type: nauc_precision_at_1000_std
value: 17.460222955117978
- type: nauc_precision_at_100_diff1
value: -32.45175999251703
- type: nauc_precision_at_100_max
value: 26.367996120487337
- type: nauc_precision_at_100_std
value: 17.097957946391208
- type: nauc_precision_at_10_diff1
value: -26.97411235289487
- type: nauc_precision_at_10_max
value: 31.504961687240762
- type: nauc_precision_at_10_std
value: 11.125341183874687
- type: nauc_precision_at_1_diff1
value: 68.25415199323474
- type: nauc_precision_at_1_max
value: 63.069019003272416
- type: nauc_precision_at_1_std
value: -18.77085924093244
- type: nauc_precision_at_20_diff1
value: -29.8678078736273
- type: nauc_precision_at_20_max
value: 29.031222186584504
- type: nauc_precision_at_20_std
value: 14.943600563087928
- type: nauc_precision_at_3_diff1
value: -15.92947221299854
- type: nauc_precision_at_3_max
value: 37.73833494235097
- type: nauc_precision_at_3_std
value: 3.1573228443500847
- type: nauc_precision_at_5_diff1
value: -22.269156821101642
- type: nauc_precision_at_5_max
value: 35.65821838116355
- type: nauc_precision_at_5_std
value: 9.265930386198972
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 66.17058859539249
- type: nauc_recall_at_100_max
value: 78.066942935192
- type: nauc_recall_at_100_std
value: -22.213377762074686
- type: nauc_recall_at_10_diff1
value: 50.82149700700275
- type: nauc_recall_at_10_max
value: 56.68053325008221
- type: nauc_recall_at_10_std
value: -41.81657941433277
- type: nauc_recall_at_1_diff1
value: 71.89683981857102
- type: nauc_recall_at_1_max
value: 20.204460967432645
- type: nauc_recall_at_1_std
value: -23.07894656629493
- type: nauc_recall_at_20_diff1
value: 48.28076011857885
- type: nauc_recall_at_20_max
value: 63.29641555519295
- type: nauc_recall_at_20_std
value: -32.953559708819405
- type: nauc_recall_at_3_diff1
value: 58.15516956312558
- type: nauc_recall_at_3_max
value: 42.66315890283056
- type: nauc_recall_at_3_std
value: -32.16572530544806
- type: nauc_recall_at_5_diff1
value: 55.900844052439766
- type: nauc_recall_at_5_max
value: 55.23702018862884
- type: nauc_recall_at_5_std
value: -30.105929528165
- type: ndcg_at_1
value: 77.676
- type: ndcg_at_10
value: 84.552
- type: ndcg_at_100
value: 86.232
- type: ndcg_at_1000
value: 86.33800000000001
- type: ndcg_at_20
value: 85.515
- type: ndcg_at_3
value: 81.112
- type: ndcg_at_5
value: 82.943
- type: precision_at_1
value: 77.676
- type: precision_at_10
value: 15.17
- type: precision_at_100
value: 1.6230000000000002
- type: precision_at_1000
value: 0.163
- type: precision_at_20
value: 7.858999999999999
- type: precision_at_3
value: 42.994
- type: precision_at_5
value: 28.747
- type: recall_at_1
value: 59.023
- type: recall_at_10
value: 92.465
- type: recall_at_100
value: 99.18400000000001
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 95.844
- type: recall_at_3
value: 81.826
- type: recall_at_5
value: 88.22
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (deu-eng)
type: jinaai/xpqa
config: deu-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 82.149
- type: map_at_1
value: 56.277
- type: map_at_10
value: 78.36999999999999
- type: map_at_100
value: 78.94
- type: map_at_1000
value: 78.95
- type: map_at_20
value: 78.818
- type: map_at_3
value: 74.25
- type: map_at_5
value: 77.11099999999999
- type: mrr_at_1
value: 74.28198433420366
- type: mrr_at_10
value: 80.57487877657589
- type: mrr_at_100
value: 80.94025764149008
- type: mrr_at_1000
value: 80.94608738871234
- type: mrr_at_20
value: 80.86240675885023
- type: mrr_at_3
value: 79.4604003481288
- type: mrr_at_5
value: 80.10008703220191
- type: nauc_map_at_1000_diff1
value: 60.44369249057189
- type: nauc_map_at_1000_max
value: 49.822240441830246
- type: nauc_map_at_1000_std
value: -27.34026380762817
- type: nauc_map_at_100_diff1
value: 60.44635668050401
- type: nauc_map_at_100_max
value: 49.838675926660684
- type: nauc_map_at_100_std
value: -27.310365556055583
- type: nauc_map_at_10_diff1
value: 60.18546951726522
- type: nauc_map_at_10_max
value: 49.72075398096832
- type: nauc_map_at_10_std
value: -27.86056102461558
- type: nauc_map_at_1_diff1
value: 71.2906657099758
- type: nauc_map_at_1_max
value: 18.970399251589
- type: nauc_map_at_1_std
value: -27.260776614286602
- type: nauc_map_at_20_diff1
value: 60.3525975566164
- type: nauc_map_at_20_max
value: 49.852487866710646
- type: nauc_map_at_20_std
value: -27.305173830170332
- type: nauc_map_at_3_diff1
value: 60.66803500571236
- type: nauc_map_at_3_max
value: 41.18191941521972
- type: nauc_map_at_3_std
value: -28.71383593401732
- type: nauc_map_at_5_diff1
value: 60.57216514504887
- type: nauc_map_at_5_max
value: 47.99837400446299
- type: nauc_map_at_5_std
value: -28.756183015949986
- type: nauc_mrr_at_1000_diff1
value: 63.77031955602516
- type: nauc_mrr_at_1000_max
value: 54.26907383811417
- type: nauc_mrr_at_1000_std
value: -26.227442087164714
- type: nauc_mrr_at_100_diff1
value: 63.77196650108669
- type: nauc_mrr_at_100_max
value: 54.281801457913126
- type: nauc_mrr_at_100_std
value: -26.216077891830793
- type: nauc_mrr_at_10_diff1
value: 63.50095284903051
- type: nauc_mrr_at_10_max
value: 54.3186301730016
- type: nauc_mrr_at_10_std
value: -26.29570241722173
- type: nauc_mrr_at_1_diff1
value: 65.15855770999057
- type: nauc_mrr_at_1_max
value: 53.213286738515066
- type: nauc_mrr_at_1_std
value: -24.683178252901943
- type: nauc_mrr_at_20_diff1
value: 63.74936550280859
- type: nauc_mrr_at_20_max
value: 54.355343751439065
- type: nauc_mrr_at_20_std
value: -26.197316900009817
- type: nauc_mrr_at_3_diff1
value: 63.912612979082695
- type: nauc_mrr_at_3_max
value: 53.75399024225975
- type: nauc_mrr_at_3_std
value: -27.194143264554675
- type: nauc_mrr_at_5_diff1
value: 63.72491059053639
- type: nauc_mrr_at_5_max
value: 53.66107604019352
- type: nauc_mrr_at_5_std
value: -26.92281560584754
- type: nauc_ndcg_at_1000_diff1
value: 61.304218998714354
- type: nauc_ndcg_at_1000_max
value: 52.409135743660386
- type: nauc_ndcg_at_1000_std
value: -26.539796489464056
- type: nauc_ndcg_at_100_diff1
value: 61.40355045085304
- type: nauc_ndcg_at_100_max
value: 52.79402259608008
- type: nauc_ndcg_at_100_std
value: -25.927273456979965
- type: nauc_ndcg_at_10_diff1
value: 59.93675608684116
- type: nauc_ndcg_at_10_max
value: 52.617848197542706
- type: nauc_ndcg_at_10_std
value: -27.314820020095887
- type: nauc_ndcg_at_1_diff1
value: 65.15855770999057
- type: nauc_ndcg_at_1_max
value: 53.213286738515066
- type: nauc_ndcg_at_1_std
value: -24.683178252901943
- type: nauc_ndcg_at_20_diff1
value: 60.85093704358376
- type: nauc_ndcg_at_20_max
value: 53.14529242671602
- type: nauc_ndcg_at_20_std
value: -25.93187916231906
- type: nauc_ndcg_at_3_diff1
value: 60.42301123518882
- type: nauc_ndcg_at_3_max
value: 49.59021992975956
- type: nauc_ndcg_at_3_std
value: -27.397117967810363
- type: nauc_ndcg_at_5_diff1
value: 60.78655153154219
- type: nauc_ndcg_at_5_max
value: 49.54194799556953
- type: nauc_ndcg_at_5_std
value: -29.467910172913413
- type: nauc_precision_at_1000_diff1
value: -34.35027108027456
- type: nauc_precision_at_1000_max
value: 23.762671066858815
- type: nauc_precision_at_1000_std
value: 16.1704780298982
- type: nauc_precision_at_100_diff1
value: -32.66610016754961
- type: nauc_precision_at_100_max
value: 25.504044603109588
- type: nauc_precision_at_100_std
value: 16.932402988816786
- type: nauc_precision_at_10_diff1
value: -25.720903145017342
- type: nauc_precision_at_10_max
value: 30.37029690599926
- type: nauc_precision_at_10_std
value: 10.560753160200314
- type: nauc_precision_at_1_diff1
value: 65.15855770999057
- type: nauc_precision_at_1_max
value: 53.213286738515066
- type: nauc_precision_at_1_std
value: -24.683178252901943
- type: nauc_precision_at_20_diff1
value: -29.577582332619084
- type: nauc_precision_at_20_max
value: 27.984145595920417
- type: nauc_precision_at_20_std
value: 15.083711704044727
- type: nauc_precision_at_3_diff1
value: -14.736267532892697
- type: nauc_precision_at_3_max
value: 36.12211021824307
- type: nauc_precision_at_3_std
value: 3.068643876519412
- type: nauc_precision_at_5_diff1
value: -19.846707283120825
- type: nauc_precision_at_5_max
value: 33.573804532177896
- type: nauc_precision_at_5_std
value: 5.700545622744924
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 68.24749796604452
- type: nauc_recall_at_100_max
value: 83.30024864929815
- type: nauc_recall_at_100_std
value: 21.23763053711522
- type: nauc_recall_at_10_diff1
value: 50.704049683241436
- type: nauc_recall_at_10_max
value: 57.64578984555556
- type: nauc_recall_at_10_std
value: -26.632759037746073
- type: nauc_recall_at_1_diff1
value: 71.2906657099758
- type: nauc_recall_at_1_max
value: 18.970399251589
- type: nauc_recall_at_1_std
value: -27.260776614286602
- type: nauc_recall_at_20_diff1
value: 54.124480837579505
- type: nauc_recall_at_20_max
value: 66.4641515433479
- type: nauc_recall_at_20_std
value: -14.615911455379393
- type: nauc_recall_at_3_diff1
value: 56.54358788321059
- type: nauc_recall_at_3_max
value: 37.765735322465744
- type: nauc_recall_at_3_std
value: -30.824147408598574
- type: nauc_recall_at_5_diff1
value: 56.392894535029214
- type: nauc_recall_at_5_max
value: 45.959268387521554
- type: nauc_recall_at_5_std
value: -33.58175576925282
- type: ndcg_at_1
value: 74.28200000000001
- type: ndcg_at_10
value: 82.149
- type: ndcg_at_100
value: 84.129
- type: ndcg_at_1000
value: 84.307
- type: ndcg_at_20
value: 83.39999999999999
- type: ndcg_at_3
value: 78.583
- type: ndcg_at_5
value: 80.13900000000001
- type: precision_at_1
value: 74.28200000000001
- type: precision_at_10
value: 14.960999999999999
- type: precision_at_100
value: 1.6119999999999999
- type: precision_at_1000
value: 0.163
- type: precision_at_20
value: 7.813000000000001
- type: precision_at_3
value: 41.819
- type: precision_at_5
value: 27.911
- type: recall_at_1
value: 56.277
- type: recall_at_10
value: 90.729
- type: recall_at_100
value: 98.792
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 95.148
- type: recall_at_3
value: 79.989
- type: recall_at_5
value: 85.603
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-deu)
type: jinaai/xpqa
config: eng-deu
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 60.428000000000004
- type: map_at_1
value: 33.453
- type: map_at_10
value: 54.217000000000006
- type: map_at_100
value: 55.832
- type: map_at_1000
value: 55.884
- type: map_at_20
value: 55.236
- type: map_at_3
value: 48.302
- type: map_at_5
value: 51.902
- type: mrr_at_1
value: 53.916449086161876
- type: mrr_at_10
value: 61.4685647975465
- type: mrr_at_100
value: 62.13718159287348
- type: mrr_at_1000
value: 62.15799113826325
- type: mrr_at_20
value: 61.885388764243544
- type: mrr_at_3
value: 59.44299390774582
- type: mrr_at_5
value: 60.26544821583981
- type: nauc_map_at_1000_diff1
value: 39.824412602121804
- type: nauc_map_at_1000_max
value: 39.49332709959374
- type: nauc_map_at_1000_std
value: -17.27462623749702
- type: nauc_map_at_100_diff1
value: 39.80528910003463
- type: nauc_map_at_100_max
value: 39.51471609156093
- type: nauc_map_at_100_std
value: -17.275536933094937
- type: nauc_map_at_10_diff1
value: 39.28558292349772
- type: nauc_map_at_10_max
value: 38.13220294838968
- type: nauc_map_at_10_std
value: -18.235985574392863
- type: nauc_map_at_1_diff1
value: 43.68892397816937
- type: nauc_map_at_1_max
value: 14.478978190224353
- type: nauc_map_at_1_std
value: -18.435031919225477
- type: nauc_map_at_20_diff1
value: 39.8733530971344
- type: nauc_map_at_20_max
value: 39.30513202591992
- type: nauc_map_at_20_std
value: -17.62362848144766
- type: nauc_map_at_3_diff1
value: 40.31116611188815
- type: nauc_map_at_3_max
value: 31.107314675202165
- type: nauc_map_at_3_std
value: -19.52930881946966
- type: nauc_map_at_5_diff1
value: 39.1241499095765
- type: nauc_map_at_5_max
value: 37.330543901034055
- type: nauc_map_at_5_std
value: -17.893862772447548
- type: nauc_mrr_at_1000_diff1
value: 43.07490530140024
- type: nauc_mrr_at_1000_max
value: 42.28469195779226
- type: nauc_mrr_at_1000_std
value: -15.583217110180737
- type: nauc_mrr_at_100_diff1
value: 43.068836494603886
- type: nauc_mrr_at_100_max
value: 42.29612450479168
- type: nauc_mrr_at_100_std
value: -15.57218089438229
- type: nauc_mrr_at_10_diff1
value: 42.88685919151777
- type: nauc_mrr_at_10_max
value: 41.89944452003811
- type: nauc_mrr_at_10_std
value: -15.909673572763165
- type: nauc_mrr_at_1_diff1
value: 45.67646898532131
- type: nauc_mrr_at_1_max
value: 43.0541870425035
- type: nauc_mrr_at_1_std
value: -15.597124291613563
- type: nauc_mrr_at_20_diff1
value: 43.14141873150977
- type: nauc_mrr_at_20_max
value: 42.33063543184022
- type: nauc_mrr_at_20_std
value: -15.607612016107304
- type: nauc_mrr_at_3_diff1
value: 43.18370928261982
- type: nauc_mrr_at_3_max
value: 42.18529980773961
- type: nauc_mrr_at_3_std
value: -15.900151400673629
- type: nauc_mrr_at_5_diff1
value: 42.43443044877765
- type: nauc_mrr_at_5_max
value: 42.05818605278972
- type: nauc_mrr_at_5_std
value: -15.436502733299893
- type: nauc_ndcg_at_1000_diff1
value: 40.60606676178781
- type: nauc_ndcg_at_1000_max
value: 41.71923393878376
- type: nauc_ndcg_at_1000_std
value: -15.694740326899556
- type: nauc_ndcg_at_100_diff1
value: 40.15270376312309
- type: nauc_ndcg_at_100_max
value: 42.234126305709225
- type: nauc_ndcg_at_100_std
value: -15.436051984708952
- type: nauc_ndcg_at_10_diff1
value: 39.142259831299455
- type: nauc_ndcg_at_10_max
value: 38.61470104273746
- type: nauc_ndcg_at_10_std
value: -18.577452829132742
- type: nauc_ndcg_at_1_diff1
value: 45.67646898532131
- type: nauc_ndcg_at_1_max
value: 43.0541870425035
- type: nauc_ndcg_at_1_std
value: -15.597124291613563
- type: nauc_ndcg_at_20_diff1
value: 40.805159395901306
- type: nauc_ndcg_at_20_max
value: 41.58685629374952
- type: nauc_ndcg_at_20_std
value: -16.862408156222592
- type: nauc_ndcg_at_3_diff1
value: 39.12028215488432
- type: nauc_ndcg_at_3_max
value: 39.70580596343164
- type: nauc_ndcg_at_3_std
value: -16.705546903936213
- type: nauc_ndcg_at_5_diff1
value: 38.42075404927361
- type: nauc_ndcg_at_5_max
value: 38.064219879504385
- type: nauc_ndcg_at_5_std
value: -17.20282111665876
- type: nauc_precision_at_1000_diff1
value: -4.419224540552891
- type: nauc_precision_at_1000_max
value: 35.686022591225246
- type: nauc_precision_at_1000_std
value: 15.023520191032972
- type: nauc_precision_at_100_diff1
value: -2.9027602601603895
- type: nauc_precision_at_100_max
value: 39.99864013028808
- type: nauc_precision_at_100_std
value: 13.863497117255525
- type: nauc_precision_at_10_diff1
value: 5.539104839809501
- type: nauc_precision_at_10_max
value: 42.41625740557432
- type: nauc_precision_at_10_std
value: 1.0894693748662556
- type: nauc_precision_at_1_diff1
value: 45.67646898532131
- type: nauc_precision_at_1_max
value: 43.0541870425035
- type: nauc_precision_at_1_std
value: -15.597124291613563
- type: nauc_precision_at_20_diff1
value: 4.734562571681868
- type: nauc_precision_at_20_max
value: 44.35081213316202
- type: nauc_precision_at_20_std
value: 6.642891478284595
- type: nauc_precision_at_3_diff1
value: 13.936559341472101
- type: nauc_precision_at_3_max
value: 45.426668552497524
- type: nauc_precision_at_3_std
value: -5.219785419247125
- type: nauc_precision_at_5_diff1
value: 8.366706789546015
- type: nauc_precision_at_5_max
value: 46.161942989326896
- type: nauc_precision_at_5_std
value: -0.193140343545876
- type: nauc_recall_at_1000_diff1
value: 45.61785312444842
- type: nauc_recall_at_1000_max
value: 75.68258976531774
- type: nauc_recall_at_1000_std
value: 37.469059422121575
- type: nauc_recall_at_100_diff1
value: 26.798748531805096
- type: nauc_recall_at_100_max
value: 54.72134095197765
- type: nauc_recall_at_100_std
value: -1.5967608233799417
- type: nauc_recall_at_10_diff1
value: 32.13211696200521
- type: nauc_recall_at_10_max
value: 31.13866254975895
- type: nauc_recall_at_10_std
value: -22.31404161136118
- type: nauc_recall_at_1_diff1
value: 43.68892397816937
- type: nauc_recall_at_1_max
value: 14.478978190224353
- type: nauc_recall_at_1_std
value: -18.435031919225477
- type: nauc_recall_at_20_diff1
value: 38.597996930461385
- type: nauc_recall_at_20_max
value: 42.49849027366794
- type: nauc_recall_at_20_std
value: -16.536471900752154
- type: nauc_recall_at_3_diff1
value: 35.343730012759266
- type: nauc_recall_at_3_max
value: 26.898722085043392
- type: nauc_recall_at_3_std
value: -19.4459792273884
- type: nauc_recall_at_5_diff1
value: 31.8310298012186
- type: nauc_recall_at_5_max
value: 32.67800489655844
- type: nauc_recall_at_5_std
value: -16.800929103347283
- type: ndcg_at_1
value: 53.916
- type: ndcg_at_10
value: 60.428000000000004
- type: ndcg_at_100
value: 65.95
- type: ndcg_at_1000
value: 66.88
- type: ndcg_at_20
value: 62.989
- type: ndcg_at_3
value: 55.204
- type: ndcg_at_5
value: 56.42700000000001
- type: precision_at_1
value: 53.916
- type: precision_at_10
value: 14.346999999999998
- type: precision_at_100
value: 1.849
- type: precision_at_1000
value: 0.196
- type: precision_at_20
value: 8.022
- type: precision_at_3
value: 34.552
- type: precision_at_5
value: 24.569
- type: recall_at_1
value: 33.453
- type: recall_at_10
value: 71.07900000000001
- type: recall_at_100
value: 93.207
- type: recall_at_1000
value: 99.60799999999999
- type: recall_at_20
value: 79.482
- type: recall_at_3
value: 53.98
- type: recall_at_5
value: 60.781
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-pol)
type: jinaai/xpqa
config: eng-pol
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 34.042
- type: map_at_1
value: 13.236
- type: map_at_10
value: 27.839999999999996
- type: map_at_100
value: 30.171999999999997
- type: map_at_1000
value: 30.349999999999998
- type: map_at_20
value: 29.044999999999998
- type: map_at_3
value: 22.58
- type: map_at_5
value: 25.83
- type: mrr_at_1
value: 30.318471337579616
- type: mrr_at_10
value: 37.4983823678091
- type: mrr_at_100
value: 38.5784523175009
- type: mrr_at_1000
value: 38.63608698968148
- type: mrr_at_20
value: 38.02996157871825
- type: mrr_at_3
value: 34.798301486199584
- type: mrr_at_5
value: 36.39702760084925
- type: nauc_map_at_1000_diff1
value: 21.07199789609177
- type: nauc_map_at_1000_max
value: 25.959233507893277
- type: nauc_map_at_1000_std
value: -28.011925372852826
- type: nauc_map_at_100_diff1
value: 21.086788412737548
- type: nauc_map_at_100_max
value: 25.8611620203686
- type: nauc_map_at_100_std
value: -28.179239912057515
- type: nauc_map_at_10_diff1
value: 21.23841745922078
- type: nauc_map_at_10_max
value: 25.44290342378288
- type: nauc_map_at_10_std
value: -28.75578689110275
- type: nauc_map_at_1_diff1
value: 28.87454015638211
- type: nauc_map_at_1_max
value: 17.50681123879997
- type: nauc_map_at_1_std
value: -30.382831850562432
- type: nauc_map_at_20_diff1
value: 21.076559713540455
- type: nauc_map_at_20_max
value: 25.538154202494535
- type: nauc_map_at_20_std
value: -28.518764617658555
- type: nauc_map_at_3_diff1
value: 22.159185358766468
- type: nauc_map_at_3_max
value: 23.01652660927249
- type: nauc_map_at_3_std
value: -29.567722713221862
- type: nauc_map_at_5_diff1
value: 21.35578810370897
- type: nauc_map_at_5_max
value: 25.550550437767395
- type: nauc_map_at_5_std
value: -28.7889035461355
- type: nauc_mrr_at_1000_diff1
value: 22.28633009221923
- type: nauc_mrr_at_1000_max
value: 26.920205393136392
- type: nauc_mrr_at_1000_std
value: -25.887791634977642
- type: nauc_mrr_at_100_diff1
value: 22.2754975739755
- type: nauc_mrr_at_100_max
value: 26.90235716615346
- type: nauc_mrr_at_100_std
value: -25.891596020584345
- type: nauc_mrr_at_10_diff1
value: 22.415076305593534
- type: nauc_mrr_at_10_max
value: 26.504643796222222
- type: nauc_mrr_at_10_std
value: -26.6046081215833
- type: nauc_mrr_at_1_diff1
value: 23.406748619244368
- type: nauc_mrr_at_1_max
value: 29.058228240823553
- type: nauc_mrr_at_1_std
value: -26.450169820901078
- type: nauc_mrr_at_20_diff1
value: 22.29233141817678
- type: nauc_mrr_at_20_max
value: 26.69021351064081
- type: nauc_mrr_at_20_std
value: -26.086596227376656
- type: nauc_mrr_at_3_diff1
value: 22.20746187500145
- type: nauc_mrr_at_3_max
value: 27.143725946169457
- type: nauc_mrr_at_3_std
value: -26.7017708594376
- type: nauc_mrr_at_5_diff1
value: 22.71898965233195
- type: nauc_mrr_at_5_max
value: 26.932386658571662
- type: nauc_mrr_at_5_std
value: -26.725541058780234
- type: nauc_ndcg_at_1000_diff1
value: 20.541734305148466
- type: nauc_ndcg_at_1000_max
value: 27.180534238090758
- type: nauc_ndcg_at_1000_std
value: -23.74197745177845
- type: nauc_ndcg_at_100_diff1
value: 20.570052839937468
- type: nauc_ndcg_at_100_max
value: 26.21605034405486
- type: nauc_ndcg_at_100_std
value: -25.359817188805028
- type: nauc_ndcg_at_10_diff1
value: 21.241423075073467
- type: nauc_ndcg_at_10_max
value: 24.599199195239475
- type: nauc_ndcg_at_10_std
value: -28.404540333309008
- type: nauc_ndcg_at_1_diff1
value: 23.406748619244368
- type: nauc_ndcg_at_1_max
value: 29.058228240823553
- type: nauc_ndcg_at_1_std
value: -26.450169820901078
- type: nauc_ndcg_at_20_diff1
value: 20.740460046196873
- type: nauc_ndcg_at_20_max
value: 24.82380195169634
- type: nauc_ndcg_at_20_std
value: -27.376298834244313
- type: nauc_ndcg_at_3_diff1
value: 19.994948682426504
- type: nauc_ndcg_at_3_max
value: 26.153790759405105
- type: nauc_ndcg_at_3_std
value: -27.194548404540885
- type: nauc_ndcg_at_5_diff1
value: 21.48414272096384
- type: nauc_ndcg_at_5_max
value: 25.239652015076373
- type: nauc_ndcg_at_5_std
value: -28.2620160957961
- type: nauc_precision_at_1000_diff1
value: -0.7557639926687744
- type: nauc_precision_at_1000_max
value: 24.265591636994436
- type: nauc_precision_at_1000_std
value: 16.833104654292654
- type: nauc_precision_at_100_diff1
value: 4.647847665941115
- type: nauc_precision_at_100_max
value: 24.42192644844434
- type: nauc_precision_at_100_std
value: 0.2718848568876648
- type: nauc_precision_at_10_diff1
value: 9.465969286722654
- type: nauc_precision_at_10_max
value: 27.448993150448043
- type: nauc_precision_at_10_std
value: -16.519099596502212
- type: nauc_precision_at_1_diff1
value: 23.406748619244368
- type: nauc_precision_at_1_max
value: 29.058228240823553
- type: nauc_precision_at_1_std
value: -26.450169820901078
- type: nauc_precision_at_20_diff1
value: 8.021421615668114
- type: nauc_precision_at_20_max
value: 26.18556481398635
- type: nauc_precision_at_20_std
value: -12.207152108668367
- type: nauc_precision_at_3_diff1
value: 11.783572803634241
- type: nauc_precision_at_3_max
value: 29.259715774978893
- type: nauc_precision_at_3_std
value: -20.407524967717425
- type: nauc_precision_at_5_diff1
value: 10.371728615220821
- type: nauc_precision_at_5_max
value: 30.270642833482864
- type: nauc_precision_at_5_std
value: -18.407334880575494
- type: nauc_recall_at_1000_diff1
value: 6.008969959111555
- type: nauc_recall_at_1000_max
value: 39.79691734058127
- type: nauc_recall_at_1000_std
value: 32.43591825510109
- type: nauc_recall_at_100_diff1
value: 15.2374566058917
- type: nauc_recall_at_100_max
value: 23.058785539503717
- type: nauc_recall_at_100_std
value: -15.962888794058165
- type: nauc_recall_at_10_diff1
value: 19.46184821807753
- type: nauc_recall_at_10_max
value: 19.001003513986866
- type: nauc_recall_at_10_std
value: -27.753332786663876
- type: nauc_recall_at_1_diff1
value: 28.87454015638211
- type: nauc_recall_at_1_max
value: 17.50681123879997
- type: nauc_recall_at_1_std
value: -30.382831850562432
- type: nauc_recall_at_20_diff1
value: 17.237090858517405
- type: nauc_recall_at_20_max
value: 18.42118474134871
- type: nauc_recall_at_20_std
value: -24.862787724031957
- type: nauc_recall_at_3_diff1
value: 18.813019521758577
- type: nauc_recall_at_3_max
value: 19.198572333053544
- type: nauc_recall_at_3_std
value: -28.5644958605618
- type: nauc_recall_at_5_diff1
value: 20.247501986329482
- type: nauc_recall_at_5_max
value: 21.121526202170358
- type: nauc_recall_at_5_std
value: -27.220378617864853
- type: ndcg_at_1
value: 30.318
- type: ndcg_at_10
value: 34.042
- type: ndcg_at_100
value: 42.733
- type: ndcg_at_1000
value: 46.015
- type: ndcg_at_20
value: 37.053999999999995
- type: ndcg_at_3
value: 29.254
- type: ndcg_at_5
value: 30.514000000000003
- type: precision_at_1
value: 30.318
- type: precision_at_10
value: 10.981
- type: precision_at_100
value: 1.889
- type: precision_at_1000
value: 0.234
- type: precision_at_20
value: 6.643000000000001
- type: precision_at_3
value: 22.166
- type: precision_at_5
value: 17.477999999999998
- type: recall_at_1
value: 13.236
- type: recall_at_10
value: 41.461
- type: recall_at_100
value: 75.008
- type: recall_at_1000
value: 96.775
- type: recall_at_20
value: 50.754
- type: recall_at_3
value: 26.081
- type: recall_at_5
value: 33.168
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-cmn)
type: jinaai/xpqa
config: eng-cmn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 37.504
- type: map_at_1
value: 16.019
- type: map_at_10
value: 30.794
- type: map_at_100
value: 33.157
- type: map_at_1000
value: 33.324999999999996
- type: map_at_20
value: 32.161
- type: map_at_3
value: 25.372
- type: map_at_5
value: 28.246
- type: mrr_at_1
value: 30.461165048543688
- type: mrr_at_10
value: 39.393107566651224
- type: mrr_at_100
value: 40.570039540602295
- type: mrr_at_1000
value: 40.6306116407744
- type: mrr_at_20
value: 40.09428159978876
- type: mrr_at_3
value: 37.176375404530745
- type: mrr_at_5
value: 38.09870550161812
- type: nauc_map_at_1000_diff1
value: 30.82306881892873
- type: nauc_map_at_1000_max
value: 5.877636000666466
- type: nauc_map_at_1000_std
value: -30.7140513386797
- type: nauc_map_at_100_diff1
value: 30.85192449151961
- type: nauc_map_at_100_max
value: 5.809195131550909
- type: nauc_map_at_100_std
value: -30.838556702972063
- type: nauc_map_at_10_diff1
value: 30.50359163635058
- type: nauc_map_at_10_max
value: 6.373491595869303
- type: nauc_map_at_10_std
value: -29.89368007827676
- type: nauc_map_at_1_diff1
value: 38.60240510083884
- type: nauc_map_at_1_max
value: 10.407392664609139
- type: nauc_map_at_1_std
value: -17.76327278732833
- type: nauc_map_at_20_diff1
value: 30.897489125753598
- type: nauc_map_at_20_max
value: 5.9303381898248
- type: nauc_map_at_20_std
value: -30.863345188760515
- type: nauc_map_at_3_diff1
value: 32.8150951852729
- type: nauc_map_at_3_max
value: 7.671931402215177
- type: nauc_map_at_3_std
value: -25.654809758216533
- type: nauc_map_at_5_diff1
value: 31.19558194781019
- type: nauc_map_at_5_max
value: 6.426885613116939
- type: nauc_map_at_5_std
value: -28.609027858850016
- type: nauc_mrr_at_1000_diff1
value: 30.7596332048733
- type: nauc_mrr_at_1000_max
value: 1.1970748115580212
- type: nauc_mrr_at_1000_std
value: -34.647570668150216
- type: nauc_mrr_at_100_diff1
value: 30.74693370788581
- type: nauc_mrr_at_100_max
value: 1.1673272262754841
- type: nauc_mrr_at_100_std
value: -34.67761028542745
- type: nauc_mrr_at_10_diff1
value: 30.537820575183076
- type: nauc_mrr_at_10_max
value: 1.0261868725502707
- type: nauc_mrr_at_10_std
value: -34.999990560631204
- type: nauc_mrr_at_1_diff1
value: 35.51868580113285
- type: nauc_mrr_at_1_max
value: 5.117103773147307
- type: nauc_mrr_at_1_std
value: -30.633913466736956
- type: nauc_mrr_at_20_diff1
value: 30.67318175430903
- type: nauc_mrr_at_20_max
value: 1.0979983974981327
- type: nauc_mrr_at_20_std
value: -34.8388339739997
- type: nauc_mrr_at_3_diff1
value: 30.884642006045702
- type: nauc_mrr_at_3_max
value: 1.7970996544095983
- type: nauc_mrr_at_3_std
value: -34.290172894906085
- type: nauc_mrr_at_5_diff1
value: 30.89687518368571
- type: nauc_mrr_at_5_max
value: 1.2123714988495347
- type: nauc_mrr_at_5_std
value: -35.01704580471926
- type: nauc_ndcg_at_1000_diff1
value: 29.214476799077342
- type: nauc_ndcg_at_1000_max
value: 3.6379035546112872
- type: nauc_ndcg_at_1000_std
value: -32.35757522049194
- type: nauc_ndcg_at_100_diff1
value: 29.130004541376298
- type: nauc_ndcg_at_100_max
value: 2.9580589185293045
- type: nauc_ndcg_at_100_std
value: -33.26884643871724
- type: nauc_ndcg_at_10_diff1
value: 28.521001084366393
- type: nauc_ndcg_at_10_max
value: 3.630223957267483
- type: nauc_ndcg_at_10_std
value: -33.14524140940815
- type: nauc_ndcg_at_1_diff1
value: 35.51868580113285
- type: nauc_ndcg_at_1_max
value: 5.117103773147307
- type: nauc_ndcg_at_1_std
value: -30.633913466736956
- type: nauc_ndcg_at_20_diff1
value: 29.194462756848782
- type: nauc_ndcg_at_20_max
value: 2.61162903136461
- type: nauc_ndcg_at_20_std
value: -34.59161403211834
- type: nauc_ndcg_at_3_diff1
value: 30.183555327135203
- type: nauc_ndcg_at_3_max
value: 5.61949040917093
- type: nauc_ndcg_at_3_std
value: -30.350117794058175
- type: nauc_ndcg_at_5_diff1
value: 29.74420394139971
- type: nauc_ndcg_at_5_max
value: 3.952183813937688
- type: nauc_ndcg_at_5_std
value: -31.807833795302038
- type: nauc_precision_at_1000_diff1
value: -5.467049121617333
- type: nauc_precision_at_1000_max
value: -3.993986884198271
- type: nauc_precision_at_1000_std
value: -13.703967324212224
- type: nauc_precision_at_100_diff1
value: 1.5585428307943647
- type: nauc_precision_at_100_max
value: -4.250455723613214
- type: nauc_precision_at_100_std
value: -22.294689856776493
- type: nauc_precision_at_10_diff1
value: 11.076036917255259
- type: nauc_precision_at_10_max
value: -1.5859394644365377
- type: nauc_precision_at_10_std
value: -34.94912594413202
- type: nauc_precision_at_1_diff1
value: 35.51868580113285
- type: nauc_precision_at_1_max
value: 5.117103773147307
- type: nauc_precision_at_1_std
value: -30.633913466736956
- type: nauc_precision_at_20_diff1
value: 9.311484455773828
- type: nauc_precision_at_20_max
value: -3.678383428592432
- type: nauc_precision_at_20_std
value: -33.700002761401635
- type: nauc_precision_at_3_diff1
value: 19.2787260874381
- type: nauc_precision_at_3_max
value: 0.18292109396940018
- type: nauc_precision_at_3_std
value: -35.23939824276542
- type: nauc_precision_at_5_diff1
value: 14.97930592298584
- type: nauc_precision_at_5_max
value: -1.63540635880963
- type: nauc_precision_at_5_std
value: -35.908283558321315
- type: nauc_recall_at_1000_diff1
value: 26.63056473607804
- type: nauc_recall_at_1000_max
value: 62.7304558520689
- type: nauc_recall_at_1000_std
value: 58.12421701377561
- type: nauc_recall_at_100_diff1
value: 21.42127379898579
- type: nauc_recall_at_100_max
value: 1.4748203516921914
- type: nauc_recall_at_100_std
value: -27.56467339041136
- type: nauc_recall_at_10_diff1
value: 21.20479652609812
- type: nauc_recall_at_10_max
value: 1.7394881489709888
- type: nauc_recall_at_10_std
value: -32.15116902585072
- type: nauc_recall_at_1_diff1
value: 38.60240510083884
- type: nauc_recall_at_1_max
value: 10.407392664609139
- type: nauc_recall_at_1_std
value: -17.76327278732833
- type: nauc_recall_at_20_diff1
value: 23.049652721582632
- type: nauc_recall_at_20_max
value: -1.7715787106286838
- type: nauc_recall_at_20_std
value: -36.14203686002867
- type: nauc_recall_at_3_diff1
value: 26.522179829461873
- type: nauc_recall_at_3_max
value: 6.078208732431124
- type: nauc_recall_at_3_std
value: -25.02625711226274
- type: nauc_recall_at_5_diff1
value: 24.19538553561693
- type: nauc_recall_at_5_max
value: 2.4963810785503524
- type: nauc_recall_at_5_std
value: -30.449635496921257
- type: ndcg_at_1
value: 30.461
- type: ndcg_at_10
value: 37.504
- type: ndcg_at_100
value: 46.156000000000006
- type: ndcg_at_1000
value: 48.985
- type: ndcg_at_20
value: 41.025
- type: ndcg_at_3
value: 32.165
- type: ndcg_at_5
value: 33.072
- type: precision_at_1
value: 30.461
- type: precision_at_10
value: 11.032
- type: precision_at_100
value: 1.8870000000000002
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_20
value: 6.833
- type: precision_at_3
value: 22.532
- type: precision_at_5
value: 16.966
- type: recall_at_1
value: 16.019
- type: recall_at_10
value: 47.557
- type: recall_at_100
value: 80.376
- type: recall_at_1000
value: 98.904
- type: recall_at_20
value: 58.48100000000001
- type: recall_at_3
value: 30.682
- type: recall_at_5
value: 36.714999999999996
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-spa)
type: jinaai/xpqa
config: eng-spa
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 53.359
- type: map_at_1
value: 22.892000000000003
- type: map_at_10
value: 45.773
- type: map_at_100
value: 47.778999999999996
- type: map_at_1000
value: 47.882999999999996
- type: map_at_20
value: 46.869
- type: map_at_3
value: 37.643
- type: map_at_5
value: 43.120999999999995
- type: mrr_at_1
value: 47.28877679697352
- type: mrr_at_10
value: 56.95890630316857
- type: mrr_at_100
value: 57.71103367009639
- type: mrr_at_1000
value: 57.73661441948852
- type: mrr_at_20
value: 57.37701091311334
- type: mrr_at_3
value: 54.74989491382929
- type: mrr_at_5
value: 56.08659100462372
- type: nauc_map_at_1000_diff1
value: 27.8347129954991
- type: nauc_map_at_1000_max
value: 38.04300600762859
- type: nauc_map_at_1000_std
value: -18.294653328262868
- type: nauc_map_at_100_diff1
value: 27.818449297770858
- type: nauc_map_at_100_max
value: 38.03533462156633
- type: nauc_map_at_100_std
value: -18.332989980880644
- type: nauc_map_at_10_diff1
value: 27.520664180018358
- type: nauc_map_at_10_max
value: 37.67109855753314
- type: nauc_map_at_10_std
value: -18.496721673888683
- type: nauc_map_at_1_diff1
value: 37.56020148060502
- type: nauc_map_at_1_max
value: 10.298394230150745
- type: nauc_map_at_1_std
value: -20.41359936101547
- type: nauc_map_at_20_diff1
value: 27.615023038189722
- type: nauc_map_at_20_max
value: 37.808525116320254
- type: nauc_map_at_20_std
value: -18.49235775420803
- type: nauc_map_at_3_diff1
value: 30.797347567428424
- type: nauc_map_at_3_max
value: 29.374407828869497
- type: nauc_map_at_3_std
value: -19.75905772914969
- type: nauc_map_at_5_diff1
value: 28.431802888884803
- type: nauc_map_at_5_max
value: 35.57723911610521
- type: nauc_map_at_5_std
value: -19.093588845366824
- type: nauc_mrr_at_1000_diff1
value: 33.263611009054586
- type: nauc_mrr_at_1000_max
value: 40.620639901613664
- type: nauc_mrr_at_1000_std
value: -17.083016011032036
- type: nauc_mrr_at_100_diff1
value: 33.25375012559163
- type: nauc_mrr_at_100_max
value: 40.62376205172005
- type: nauc_mrr_at_100_std
value: -17.091930575226684
- type: nauc_mrr_at_10_diff1
value: 33.05787202690095
- type: nauc_mrr_at_10_max
value: 40.4516362611674
- type: nauc_mrr_at_10_std
value: -17.088910666499892
- type: nauc_mrr_at_1_diff1
value: 36.424151087824555
- type: nauc_mrr_at_1_max
value: 40.955715626650445
- type: nauc_mrr_at_1_std
value: -16.56636409111209
- type: nauc_mrr_at_20_diff1
value: 33.12029456858138
- type: nauc_mrr_at_20_max
value: 40.56409347292635
- type: nauc_mrr_at_20_std
value: -17.102034817242068
- type: nauc_mrr_at_3_diff1
value: 33.52377926814156
- type: nauc_mrr_at_3_max
value: 40.824911575046876
- type: nauc_mrr_at_3_std
value: -16.855935748811092
- type: nauc_mrr_at_5_diff1
value: 33.08646471768442
- type: nauc_mrr_at_5_max
value: 40.59323589955881
- type: nauc_mrr_at_5_std
value: -16.77829710500156
- type: nauc_ndcg_at_1000_diff1
value: 28.741186244590207
- type: nauc_ndcg_at_1000_max
value: 40.0113825410539
- type: nauc_ndcg_at_1000_std
value: -17.15655081742458
- type: nauc_ndcg_at_100_diff1
value: 28.680521359782972
- type: nauc_ndcg_at_100_max
value: 39.94751899984445
- type: nauc_ndcg_at_100_std
value: -17.82813814043932
- type: nauc_ndcg_at_10_diff1
value: 27.22858072673168
- type: nauc_ndcg_at_10_max
value: 38.600188968554725
- type: nauc_ndcg_at_10_std
value: -18.517203924893614
- type: nauc_ndcg_at_1_diff1
value: 36.424151087824555
- type: nauc_ndcg_at_1_max
value: 40.955715626650445
- type: nauc_ndcg_at_1_std
value: -16.56636409111209
- type: nauc_ndcg_at_20_diff1
value: 27.56875900623774
- type: nauc_ndcg_at_20_max
value: 38.95264310199067
- type: nauc_ndcg_at_20_std
value: -18.709973965688445
- type: nauc_ndcg_at_3_diff1
value: 28.682842749851574
- type: nauc_ndcg_at_3_max
value: 38.361215408395964
- type: nauc_ndcg_at_3_std
value: -16.800291231827515
- type: nauc_ndcg_at_5_diff1
value: 28.178239259093484
- type: nauc_ndcg_at_5_max
value: 36.77096292606479
- type: nauc_ndcg_at_5_std
value: -18.718861696641145
- type: nauc_precision_at_1000_diff1
value: -7.3686253252869305
- type: nauc_precision_at_1000_max
value: 31.98896996987639
- type: nauc_precision_at_1000_std
value: 13.125659676392267
- type: nauc_precision_at_100_diff1
value: -2.8239113056969156
- type: nauc_precision_at_100_max
value: 36.95062472971812
- type: nauc_precision_at_100_std
value: 7.230228733647562
- type: nauc_precision_at_10_diff1
value: 2.5515545798843555
- type: nauc_precision_at_10_max
value: 45.46146019314904
- type: nauc_precision_at_10_std
value: -1.3249340536211553
- type: nauc_precision_at_1_diff1
value: 36.424151087824555
- type: nauc_precision_at_1_max
value: 40.955715626650445
- type: nauc_precision_at_1_std
value: -16.56636409111209
- type: nauc_precision_at_20_diff1
value: 0.7202861770489576
- type: nauc_precision_at_20_max
value: 41.9937596214609
- type: nauc_precision_at_20_std
value: 0.2756400069730064
- type: nauc_precision_at_3_diff1
value: 12.89221206929447
- type: nauc_precision_at_3_max
value: 48.57775126381142
- type: nauc_precision_at_3_std
value: -8.042242254131068
- type: nauc_precision_at_5_diff1
value: 7.063616193387763
- type: nauc_precision_at_5_max
value: 47.26496887331675
- type: nauc_precision_at_5_std
value: -4.735805200913049
- type: nauc_recall_at_1000_diff1
value: 2.6650052980682224
- type: nauc_recall_at_1000_max
value: 81.94826279951472
- type: nauc_recall_at_1000_std
value: 48.46012388224573
- type: nauc_recall_at_100_diff1
value: 24.516371948375827
- type: nauc_recall_at_100_max
value: 39.17639620389552
- type: nauc_recall_at_100_std
value: -17.884197602579533
- type: nauc_recall_at_10_diff1
value: 19.93892097640112
- type: nauc_recall_at_10_max
value: 33.079079440022106
- type: nauc_recall_at_10_std
value: -20.22227622801884
- type: nauc_recall_at_1_diff1
value: 37.56020148060502
- type: nauc_recall_at_1_max
value: 10.298394230150745
- type: nauc_recall_at_1_std
value: -20.41359936101547
- type: nauc_recall_at_20_diff1
value: 20.363784035670633
- type: nauc_recall_at_20_max
value: 33.39352971625336
- type: nauc_recall_at_20_std
value: -21.712050932168875
- type: nauc_recall_at_3_diff1
value: 26.220072121604655
- type: nauc_recall_at_3_max
value: 25.853218030218507
- type: nauc_recall_at_3_std
value: -17.830613372910907
- type: nauc_recall_at_5_diff1
value: 22.25850162680252
- type: nauc_recall_at_5_max
value: 30.89620539042785
- type: nauc_recall_at_5_std
value: -19.16786434439169
- type: ndcg_at_1
value: 47.288999999999994
- type: ndcg_at_10
value: 53.359
- type: ndcg_at_100
value: 60.25899999999999
- type: ndcg_at_1000
value: 61.902
- type: ndcg_at_20
value: 56.025000000000006
- type: ndcg_at_3
value: 47.221999999999994
- type: ndcg_at_5
value: 49.333
- type: precision_at_1
value: 47.288999999999994
- type: precision_at_10
value: 16.003
- type: precision_at_100
value: 2.221
- type: precision_at_1000
value: 0.246
- type: precision_at_20
value: 8.985
- type: precision_at_3
value: 34.510000000000005
- type: precision_at_5
value: 26.961000000000002
- type: recall_at_1
value: 22.892000000000003
- type: recall_at_10
value: 62.928
- type: recall_at_100
value: 89.105
- type: recall_at_1000
value: 99.319
- type: recall_at_20
value: 71.387
- type: recall_at_3
value: 43.492999999999995
- type: recall_at_5
value: 53.529
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-fra)
type: jinaai/xpqa
config: eng-fra
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 54.888000000000005
- type: map_at_1
value: 26.079
- type: map_at_10
value: 47.434
- type: map_at_100
value: 49.376
- type: map_at_1000
value: 49.461
- type: map_at_20
value: 48.634
- type: map_at_3
value: 40.409
- type: map_at_5
value: 44.531
- type: mrr_at_1
value: 46.86248331108144
- type: mrr_at_10
value: 56.45506177548896
- type: mrr_at_100
value: 57.20360629445577
- type: mrr_at_1000
value: 57.227004696897986
- type: mrr_at_20
value: 56.905302765737865
- type: mrr_at_3
value: 54.09434801958164
- type: mrr_at_5
value: 55.40943480195811
- type: nauc_map_at_1000_diff1
value: 37.739936045535885
- type: nauc_map_at_1000_max
value: 35.92625003516368
- type: nauc_map_at_1000_std
value: -15.825119611638398
- type: nauc_map_at_100_diff1
value: 37.71697833661983
- type: nauc_map_at_100_max
value: 35.91174068136317
- type: nauc_map_at_100_std
value: -15.838841891589006
- type: nauc_map_at_10_diff1
value: 37.52309268219689
- type: nauc_map_at_10_max
value: 35.4887130483351
- type: nauc_map_at_10_std
value: -16.61132378136234
- type: nauc_map_at_1_diff1
value: 42.705087329207984
- type: nauc_map_at_1_max
value: 12.047671550242974
- type: nauc_map_at_1_std
value: -17.156030827065834
- type: nauc_map_at_20_diff1
value: 37.59446680137666
- type: nauc_map_at_20_max
value: 35.80559546695052
- type: nauc_map_at_20_std
value: -16.158338316249786
- type: nauc_map_at_3_diff1
value: 38.618415267131816
- type: nauc_map_at_3_max
value: 27.030227996183925
- type: nauc_map_at_3_std
value: -18.962500694157857
- type: nauc_map_at_5_diff1
value: 37.980845601534256
- type: nauc_map_at_5_max
value: 32.82374761283266
- type: nauc_map_at_5_std
value: -17.856875825229565
- type: nauc_mrr_at_1000_diff1
value: 40.26059509279346
- type: nauc_mrr_at_1000_max
value: 39.28453752990871
- type: nauc_mrr_at_1000_std
value: -13.306217279524212
- type: nauc_mrr_at_100_diff1
value: 40.23390833398881
- type: nauc_mrr_at_100_max
value: 39.26041461025653
- type: nauc_mrr_at_100_std
value: -13.317700798873153
- type: nauc_mrr_at_10_diff1
value: 40.163737640180145
- type: nauc_mrr_at_10_max
value: 39.27138538165913
- type: nauc_mrr_at_10_std
value: -13.472971360323038
- type: nauc_mrr_at_1_diff1
value: 42.95339241383707
- type: nauc_mrr_at_1_max
value: 40.62982307619158
- type: nauc_mrr_at_1_std
value: -10.429597045942748
- type: nauc_mrr_at_20_diff1
value: 40.23703505923782
- type: nauc_mrr_at_20_max
value: 39.27051308063652
- type: nauc_mrr_at_20_std
value: -13.390197643922038
- type: nauc_mrr_at_3_diff1
value: 40.5721313555661
- type: nauc_mrr_at_3_max
value: 39.254774354468594
- type: nauc_mrr_at_3_std
value: -13.773803807863827
- type: nauc_mrr_at_5_diff1
value: 40.41081287079734
- type: nauc_mrr_at_5_max
value: 39.515241132077335
- type: nauc_mrr_at_5_std
value: -13.306544090087336
- type: nauc_ndcg_at_1000_diff1
value: 38.04772268296103
- type: nauc_ndcg_at_1000_max
value: 38.03364565521176
- type: nauc_ndcg_at_1000_std
value: -14.203182726102263
- type: nauc_ndcg_at_100_diff1
value: 37.51752795463643
- type: nauc_ndcg_at_100_max
value: 37.809671511710604
- type: nauc_ndcg_at_100_std
value: -13.880578225081408
- type: nauc_ndcg_at_10_diff1
value: 36.78438984005559
- type: nauc_ndcg_at_10_max
value: 36.98105155993232
- type: nauc_ndcg_at_10_std
value: -16.886308645939113
- type: nauc_ndcg_at_1_diff1
value: 42.95339241383707
- type: nauc_ndcg_at_1_max
value: 40.62982307619158
- type: nauc_ndcg_at_1_std
value: -10.429597045942748
- type: nauc_ndcg_at_20_diff1
value: 36.94164323893683
- type: nauc_ndcg_at_20_max
value: 37.333583379288285
- type: nauc_ndcg_at_20_std
value: -15.853318071434716
- type: nauc_ndcg_at_3_diff1
value: 36.905604845477384
- type: nauc_ndcg_at_3_max
value: 35.10252586688781
- type: nauc_ndcg_at_3_std
value: -17.128435988977742
- type: nauc_ndcg_at_5_diff1
value: 37.96742463612705
- type: nauc_ndcg_at_5_max
value: 34.65945109443365
- type: nauc_ndcg_at_5_std
value: -17.916428667861183
- type: nauc_precision_at_1000_diff1
value: -3.740861894117653
- type: nauc_precision_at_1000_max
value: 31.993854396874177
- type: nauc_precision_at_1000_std
value: 17.445629474196448
- type: nauc_precision_at_100_diff1
value: -0.4825948747911606
- type: nauc_precision_at_100_max
value: 35.834638448782954
- type: nauc_precision_at_100_std
value: 16.82718796079511
- type: nauc_precision_at_10_diff1
value: 8.285949866268147
- type: nauc_precision_at_10_max
value: 45.3292519726866
- type: nauc_precision_at_10_std
value: 4.5574850748441555
- type: nauc_precision_at_1_diff1
value: 42.95339241383707
- type: nauc_precision_at_1_max
value: 40.62982307619158
- type: nauc_precision_at_1_std
value: -10.429597045942748
- type: nauc_precision_at_20_diff1
value: 4.890590733611442
- type: nauc_precision_at_20_max
value: 41.83051757078859
- type: nauc_precision_at_20_std
value: 9.197347125630467
- type: nauc_precision_at_3_diff1
value: 17.79940075411976
- type: nauc_precision_at_3_max
value: 45.224103632426946
- type: nauc_precision_at_3_std
value: -5.017203435609909
- type: nauc_precision_at_5_diff1
value: 13.548063145911929
- type: nauc_precision_at_5_max
value: 46.84837547409909
- type: nauc_precision_at_5_std
value: -0.8925939386354484
- type: nauc_recall_at_1000_diff1
value: 74.48441717138078
- type: nauc_recall_at_1000_max
value: 74.66717137705027
- type: nauc_recall_at_1000_std
value: 0.24030117471512125
- type: nauc_recall_at_100_diff1
value: 22.553777341988656
- type: nauc_recall_at_100_max
value: 31.67861029246527
- type: nauc_recall_at_100_std
value: 0.2707450517253687
- type: nauc_recall_at_10_diff1
value: 28.490866614443235
- type: nauc_recall_at_10_max
value: 31.722970141434352
- type: nauc_recall_at_10_std
value: -21.97893365028007
- type: nauc_recall_at_1_diff1
value: 42.705087329207984
- type: nauc_recall_at_1_max
value: 12.047671550242974
- type: nauc_recall_at_1_std
value: -17.156030827065834
- type: nauc_recall_at_20_diff1
value: 27.44043454173112
- type: nauc_recall_at_20_max
value: 31.454281772040716
- type: nauc_recall_at_20_std
value: -20.1735695305415
- type: nauc_recall_at_3_diff1
value: 34.08447534706394
- type: nauc_recall_at_3_max
value: 21.793973773840865
- type: nauc_recall_at_3_std
value: -22.753978372378906
- type: nauc_recall_at_5_diff1
value: 33.59686526199479
- type: nauc_recall_at_5_max
value: 29.188889073761302
- type: nauc_recall_at_5_std
value: -21.96156333744562
- type: ndcg_at_1
value: 46.861999999999995
- type: ndcg_at_10
value: 54.888000000000005
- type: ndcg_at_100
value: 61.477000000000004
- type: ndcg_at_1000
value: 62.768
- type: ndcg_at_20
value: 57.812
- type: ndcg_at_3
value: 48.721
- type: ndcg_at_5
value: 50.282000000000004
- type: precision_at_1
value: 46.861999999999995
- type: precision_at_10
value: 15.167
- type: precision_at_100
value: 2.072
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_20
value: 8.672
- type: precision_at_3
value: 33.066
- type: precision_at_5
value: 24.726
- type: recall_at_1
value: 26.079
- type: recall_at_10
value: 66.095
- type: recall_at_100
value: 91.65299999999999
- type: recall_at_1000
value: 99.83999999999999
- type: recall_at_20
value: 75.28
- type: recall_at_3
value: 46.874
- type: recall_at_5
value: 55.062
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (pol-eng)
type: jinaai/xpqa
config: pol-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 50.831
- type: map_at_1
value: 25.549
- type: map_at_10
value: 44.432
- type: map_at_100
value: 46.431
- type: map_at_1000
value: 46.525
- type: map_at_20
value: 45.595
- type: map_at_3
value: 38.574000000000005
- type: map_at_5
value: 42.266999999999996
- type: mrr_at_1
value: 43.5006435006435
- type: mrr_at_10
value: 51.561255132683684
- type: mrr_at_100
value: 52.59912482635216
- type: mrr_at_1000
value: 52.631337587043056
- type: mrr_at_20
value: 52.23234440063273
- type: mrr_at_3
value: 48.97039897039895
- type: mrr_at_5
value: 50.31531531531527
- type: nauc_map_at_1000_diff1
value: 35.907901295900174
- type: nauc_map_at_1000_max
value: 24.573763602041687
- type: nauc_map_at_1000_std
value: -29.524077960309313
- type: nauc_map_at_100_diff1
value: 35.86869121827827
- type: nauc_map_at_100_max
value: 24.532343818487494
- type: nauc_map_at_100_std
value: -29.613979124488864
- type: nauc_map_at_10_diff1
value: 35.90171794022391
- type: nauc_map_at_10_max
value: 23.90914892943268
- type: nauc_map_at_10_std
value: -30.43698820061533
- type: nauc_map_at_1_diff1
value: 50.80313333312038
- type: nauc_map_at_1_max
value: 16.649890421888156
- type: nauc_map_at_1_std
value: -22.323989416471683
- type: nauc_map_at_20_diff1
value: 35.77755470212964
- type: nauc_map_at_20_max
value: 24.199895270297034
- type: nauc_map_at_20_std
value: -30.223411960170647
- type: nauc_map_at_3_diff1
value: 38.964124882315936
- type: nauc_map_at_3_max
value: 21.187432510177167
- type: nauc_map_at_3_std
value: -28.976663506389887
- type: nauc_map_at_5_diff1
value: 36.04644236616672
- type: nauc_map_at_5_max
value: 23.501186429317094
- type: nauc_map_at_5_std
value: -30.068144596060748
- type: nauc_mrr_at_1000_diff1
value: 41.36555452105447
- type: nauc_mrr_at_1000_max
value: 26.376799280402867
- type: nauc_mrr_at_1000_std
value: -30.008603028757424
- type: nauc_mrr_at_100_diff1
value: 41.35523965220727
- type: nauc_mrr_at_100_max
value: 26.402612115967706
- type: nauc_mrr_at_100_std
value: -29.991754627128024
- type: nauc_mrr_at_10_diff1
value: 41.001395127259315
- type: nauc_mrr_at_10_max
value: 26.104860505051384
- type: nauc_mrr_at_10_std
value: -30.38420449487516
- type: nauc_mrr_at_1_diff1
value: 44.882846373248206
- type: nauc_mrr_at_1_max
value: 26.61905322890808
- type: nauc_mrr_at_1_std
value: -28.724565662206153
- type: nauc_mrr_at_20_diff1
value: 41.278009142648834
- type: nauc_mrr_at_20_max
value: 26.284565529087295
- type: nauc_mrr_at_20_std
value: -30.19549140549242
- type: nauc_mrr_at_3_diff1
value: 41.74663893951077
- type: nauc_mrr_at_3_max
value: 26.263048464325884
- type: nauc_mrr_at_3_std
value: -30.676733442965688
- type: nauc_mrr_at_5_diff1
value: 41.11461477846568
- type: nauc_mrr_at_5_max
value: 25.94713927964926
- type: nauc_mrr_at_5_std
value: -30.317066480767817
- type: nauc_ndcg_at_1000_diff1
value: 36.34161052445199
- type: nauc_ndcg_at_1000_max
value: 26.321036033696206
- type: nauc_ndcg_at_1000_std
value: -27.59146917115399
- type: nauc_ndcg_at_100_diff1
value: 35.66557800007035
- type: nauc_ndcg_at_100_max
value: 26.282211208336136
- type: nauc_ndcg_at_100_std
value: -27.905634124461333
- type: nauc_ndcg_at_10_diff1
value: 35.34872687407275
- type: nauc_ndcg_at_10_max
value: 24.018561915792272
- type: nauc_ndcg_at_10_std
value: -31.57712772869015
- type: nauc_ndcg_at_1_diff1
value: 44.882846373248206
- type: nauc_ndcg_at_1_max
value: 26.865602442152554
- type: nauc_ndcg_at_1_std
value: -28.509295454329152
- type: nauc_ndcg_at_20_diff1
value: 35.46177768045546
- type: nauc_ndcg_at_20_max
value: 24.921273675141542
- type: nauc_ndcg_at_20_std
value: -30.84348812979793
- type: nauc_ndcg_at_3_diff1
value: 36.84688489063923
- type: nauc_ndcg_at_3_max
value: 24.088513229463736
- type: nauc_ndcg_at_3_std
value: -30.05640995379297
- type: nauc_ndcg_at_5_diff1
value: 35.623143276796185
- type: nauc_ndcg_at_5_max
value: 23.76654250474061
- type: nauc_ndcg_at_5_std
value: -30.87847710074466
- type: nauc_precision_at_1000_diff1
value: -16.270532533886932
- type: nauc_precision_at_1000_max
value: 17.37365042394671
- type: nauc_precision_at_1000_std
value: 16.27166715693082
- type: nauc_precision_at_100_diff1
value: -13.175264889436313
- type: nauc_precision_at_100_max
value: 19.488571046893963
- type: nauc_precision_at_100_std
value: 9.055429698007798
- type: nauc_precision_at_10_diff1
value: 0.6806938753592942
- type: nauc_precision_at_10_max
value: 21.933083960522616
- type: nauc_precision_at_10_std
value: -18.2147036942157
- type: nauc_precision_at_1_diff1
value: 44.882846373248206
- type: nauc_precision_at_1_max
value: 26.865602442152554
- type: nauc_precision_at_1_std
value: -28.509295454329152
- type: nauc_precision_at_20_diff1
value: -4.318119150162302
- type: nauc_precision_at_20_max
value: 21.089702301041687
- type: nauc_precision_at_20_std
value: -10.333077681479546
- type: nauc_precision_at_3_diff1
value: 11.496076462671107
- type: nauc_precision_at_3_max
value: 23.018301549827008
- type: nauc_precision_at_3_std
value: -23.98652995416454
- type: nauc_precision_at_5_diff1
value: 4.271050668117355
- type: nauc_precision_at_5_max
value: 23.61051327966779
- type: nauc_precision_at_5_std
value: -21.557618503107847
- type: nauc_recall_at_1000_diff1
value: 62.23955911850697
- type: nauc_recall_at_1000_max
value: 83.20491723365542
- type: nauc_recall_at_1000_std
value: 66.5173462601958
- type: nauc_recall_at_100_diff1
value: 20.503778602988177
- type: nauc_recall_at_100_max
value: 29.379026288767506
- type: nauc_recall_at_100_std
value: -16.139120874540573
- type: nauc_recall_at_10_diff1
value: 27.659110249896557
- type: nauc_recall_at_10_max
value: 19.69557968026332
- type: nauc_recall_at_10_std
value: -33.95657132767551
- type: nauc_recall_at_1_diff1
value: 50.80313333312038
- type: nauc_recall_at_1_max
value: 16.649890421888156
- type: nauc_recall_at_1_std
value: -22.323989416471683
- type: nauc_recall_at_20_diff1
value: 27.084453724565176
- type: nauc_recall_at_20_max
value: 21.40080632474994
- type: nauc_recall_at_20_std
value: -32.83683639340239
- type: nauc_recall_at_3_diff1
value: 34.32950941333572
- type: nauc_recall_at_3_max
value: 18.55616615958199
- type: nauc_recall_at_3_std
value: -30.375983327454076
- type: nauc_recall_at_5_diff1
value: 29.44516734974564
- type: nauc_recall_at_5_max
value: 20.630543534300312
- type: nauc_recall_at_5_std
value: -31.30763062499127
- type: ndcg_at_1
value: 43.501
- type: ndcg_at_10
value: 50.831
- type: ndcg_at_100
value: 58.17099999999999
- type: ndcg_at_1000
value: 59.705
- type: ndcg_at_20
value: 54.047999999999995
- type: ndcg_at_3
value: 44.549
- type: ndcg_at_5
value: 46.861000000000004
- type: precision_at_1
value: 43.501
- type: precision_at_10
value: 12.895999999999999
- type: precision_at_100
value: 1.9
- type: precision_at_1000
value: 0.21
- type: precision_at_20
value: 7.593
- type: precision_at_3
value: 29.215000000000003
- type: precision_at_5
value: 21.57
- type: recall_at_1
value: 25.549
- type: recall_at_10
value: 61.795
- type: recall_at_100
value: 90.019
- type: recall_at_1000
value: 99.807
- type: recall_at_20
value: 72.096
- type: recall_at_3
value: 43.836999999999996
- type: recall_at_5
value: 51.714000000000006
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (pol-pol)
type: jinaai/xpqa
config: pol-pol
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 53.70399999999999
- type: map_at_1
value: 27.739000000000004
- type: map_at_10
value: 47.469
- type: map_at_100
value: 49.392
- type: map_at_1000
value: 49.483
- type: map_at_20
value: 48.646
- type: map_at_3
value: 41.467
- type: map_at_5
value: 45.467
- type: mrr_at_1
value: 47.00636942675159
- type: mrr_at_10
value: 54.63699322616519
- type: mrr_at_100
value: 55.54525182833755
- type: mrr_at_1000
value: 55.581331515356155
- type: mrr_at_20
value: 55.22918377451415
- type: mrr_at_3
value: 52.03821656050952
- type: mrr_at_5
value: 53.38216560509549
- type: nauc_map_at_1000_diff1
value: 45.03530825034854
- type: nauc_map_at_1000_max
value: 34.22740272603397
- type: nauc_map_at_1000_std
value: -30.428880484199244
- type: nauc_map_at_100_diff1
value: 44.978704455592805
- type: nauc_map_at_100_max
value: 34.20908357964765
- type: nauc_map_at_100_std
value: -30.47325365059666
- type: nauc_map_at_10_diff1
value: 44.9560579177672
- type: nauc_map_at_10_max
value: 33.70097588985278
- type: nauc_map_at_10_std
value: -31.205563222357885
- type: nauc_map_at_1_diff1
value: 57.94711780881773
- type: nauc_map_at_1_max
value: 21.60278071836319
- type: nauc_map_at_1_std
value: -23.273741268035923
- type: nauc_map_at_20_diff1
value: 44.97859054699532
- type: nauc_map_at_20_max
value: 34.153729150181846
- type: nauc_map_at_20_std
value: -30.97482545902907
- type: nauc_map_at_3_diff1
value: 47.52016138686765
- type: nauc_map_at_3_max
value: 30.176197065298417
- type: nauc_map_at_3_std
value: -29.90628984041898
- type: nauc_map_at_5_diff1
value: 45.36581638257985
- type: nauc_map_at_5_max
value: 33.697200263698036
- type: nauc_map_at_5_std
value: -31.165331120088453
- type: nauc_mrr_at_1000_diff1
value: 53.32889526818364
- type: nauc_mrr_at_1000_max
value: 36.104118340589736
- type: nauc_mrr_at_1000_std
value: -31.321132494516984
- type: nauc_mrr_at_100_diff1
value: 53.30695875258367
- type: nauc_mrr_at_100_max
value: 36.114890079024455
- type: nauc_mrr_at_100_std
value: -31.291749322117447
- type: nauc_mrr_at_10_diff1
value: 53.189084772141435
- type: nauc_mrr_at_10_max
value: 35.939061062282484
- type: nauc_mrr_at_10_std
value: -31.502185884653645
- type: nauc_mrr_at_1_diff1
value: 56.89368291041337
- type: nauc_mrr_at_1_max
value: 36.07581125496313
- type: nauc_mrr_at_1_std
value: -29.703764232519475
- type: nauc_mrr_at_20_diff1
value: 53.23955737199497
- type: nauc_mrr_at_20_max
value: 36.068824838215676
- type: nauc_mrr_at_20_std
value: -31.420039428197594
- type: nauc_mrr_at_3_diff1
value: 53.74385074861207
- type: nauc_mrr_at_3_max
value: 35.57054587735015
- type: nauc_mrr_at_3_std
value: -32.356894834537684
- type: nauc_mrr_at_5_diff1
value: 53.66669556981826
- type: nauc_mrr_at_5_max
value: 36.02102289605049
- type: nauc_mrr_at_5_std
value: -32.030437067359124
- type: nauc_ndcg_at_1000_diff1
value: 46.34900536768847
- type: nauc_ndcg_at_1000_max
value: 35.6314995837715
- type: nauc_ndcg_at_1000_std
value: -28.965103958822624
- type: nauc_ndcg_at_100_diff1
value: 45.1587893788861
- type: nauc_ndcg_at_100_max
value: 35.62430753595297
- type: nauc_ndcg_at_100_std
value: -28.77303405812772
- type: nauc_ndcg_at_10_diff1
value: 44.928781590765965
- type: nauc_ndcg_at_10_max
value: 34.315200006430366
- type: nauc_ndcg_at_10_std
value: -32.05164097076614
- type: nauc_ndcg_at_1_diff1
value: 57.228262350455125
- type: nauc_ndcg_at_1_max
value: 35.645285703387366
- type: nauc_ndcg_at_1_std
value: -29.893553821348718
- type: nauc_ndcg_at_20_diff1
value: 44.959903633039865
- type: nauc_ndcg_at_20_max
value: 35.493022926282755
- type: nauc_ndcg_at_20_std
value: -31.54989291850644
- type: nauc_ndcg_at_3_diff1
value: 46.65266185996905
- type: nauc_ndcg_at_3_max
value: 33.74458119579594
- type: nauc_ndcg_at_3_std
value: -31.493683304534176
- type: nauc_ndcg_at_5_diff1
value: 46.08707037187612
- type: nauc_ndcg_at_5_max
value: 34.7401426055243
- type: nauc_ndcg_at_5_std
value: -32.44390676345172
- type: nauc_precision_at_1000_diff1
value: -12.11355300492561
- type: nauc_precision_at_1000_max
value: 14.490738062121233
- type: nauc_precision_at_1000_std
value: 14.448811005059097
- type: nauc_precision_at_100_diff1
value: -9.742085657181239
- type: nauc_precision_at_100_max
value: 18.030305489251223
- type: nauc_precision_at_100_std
value: 8.213089709529765
- type: nauc_precision_at_10_diff1
value: 5.153466672774969
- type: nauc_precision_at_10_max
value: 27.29412644661678
- type: nauc_precision_at_10_std
value: -15.505053884112355
- type: nauc_precision_at_1_diff1
value: 57.228262350455125
- type: nauc_precision_at_1_max
value: 35.645285703387366
- type: nauc_precision_at_1_std
value: -29.893553821348718
- type: nauc_precision_at_20_diff1
value: -0.6812430761066635
- type: nauc_precision_at_20_max
value: 25.81911286466295
- type: nauc_precision_at_20_std
value: -8.388506222482595
- type: nauc_precision_at_3_diff1
value: 18.263873866510576
- type: nauc_precision_at_3_max
value: 30.879576105862345
- type: nauc_precision_at_3_std
value: -24.0342929870108
- type: nauc_precision_at_5_diff1
value: 10.9905804265327
- type: nauc_precision_at_5_max
value: 30.88468087429045
- type: nauc_precision_at_5_std
value: -20.458684056213507
- type: nauc_recall_at_1000_diff1
value: -64.887668417171
- type: nauc_recall_at_1000_max
value: 52.25501730358092
- type: nauc_recall_at_1000_std
value: 85.13647916200132
- type: nauc_recall_at_100_diff1
value: 18.956777346127655
- type: nauc_recall_at_100_max
value: 36.10473493564588
- type: nauc_recall_at_100_std
value: -10.007474558899949
- type: nauc_recall_at_10_diff1
value: 33.810344497568046
- type: nauc_recall_at_10_max
value: 31.395430183214245
- type: nauc_recall_at_10_std
value: -33.12920524433795
- type: nauc_recall_at_1_diff1
value: 57.94711780881773
- type: nauc_recall_at_1_max
value: 21.60278071836319
- type: nauc_recall_at_1_std
value: -23.273741268035923
- type: nauc_recall_at_20_diff1
value: 31.449657437065397
- type: nauc_recall_at_20_max
value: 34.519574934321945
- type: nauc_recall_at_20_std
value: -33.43406862055647
- type: nauc_recall_at_3_diff1
value: 42.07841848382365
- type: nauc_recall_at_3_max
value: 28.7648772833266
- type: nauc_recall_at_3_std
value: -31.56367736320086
- type: nauc_recall_at_5_diff1
value: 39.21392858246301
- type: nauc_recall_at_5_max
value: 34.28338202081927
- type: nauc_recall_at_5_std
value: -33.725680523721906
- type: ndcg_at_1
value: 46.879
- type: ndcg_at_10
value: 53.70399999999999
- type: ndcg_at_100
value: 60.532
- type: ndcg_at_1000
value: 61.997
- type: ndcg_at_20
value: 56.818999999999996
- type: ndcg_at_3
value: 47.441
- type: ndcg_at_5
value: 49.936
- type: precision_at_1
value: 46.879
- type: precision_at_10
value: 13.376
- type: precision_at_100
value: 1.8980000000000001
- type: precision_at_1000
value: 0.208
- type: precision_at_20
value: 7.771
- type: precision_at_3
value: 30.658
- type: precision_at_5
value: 22.828
- type: recall_at_1
value: 27.739000000000004
- type: recall_at_10
value: 64.197
- type: recall_at_100
value: 90.54100000000001
- type: recall_at_1000
value: 99.90400000000001
- type: recall_at_20
value: 74.178
- type: recall_at_3
value: 46.312
- type: recall_at_5
value: 54.581999999999994
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (cmn-eng)
type: jinaai/xpqa
config: cmn-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 64.64
- type: map_at_1
value: 35.858000000000004
- type: map_at_10
value: 58.547000000000004
- type: map_at_100
value: 60.108
- type: map_at_1000
value: 60.153999999999996
- type: map_at_20
value: 59.528000000000006
- type: map_at_3
value: 51.578
- type: map_at_5
value: 56.206999999999994
- type: mrr_at_1
value: 56.95121951219512
- type: mrr_at_10
value: 64.93975029036001
- type: mrr_at_100
value: 65.63357055718294
- type: mrr_at_1000
value: 65.64844109026834
- type: mrr_at_20
value: 65.41280668715439
- type: mrr_at_3
value: 62.68292682926826
- type: mrr_at_5
value: 64.1585365853658
- type: nauc_map_at_1000_diff1
value: 45.82740870907091
- type: nauc_map_at_1000_max
value: 21.9696540066807
- type: nauc_map_at_1000_std
value: -32.028262356639495
- type: nauc_map_at_100_diff1
value: 45.802053117616396
- type: nauc_map_at_100_max
value: 21.946002070290966
- type: nauc_map_at_100_std
value: -32.06190418866229
- type: nauc_map_at_10_diff1
value: 46.017774155748945
- type: nauc_map_at_10_max
value: 21.876909086095544
- type: nauc_map_at_10_std
value: -32.13913568843985
- type: nauc_map_at_1_diff1
value: 56.34671160956164
- type: nauc_map_at_1_max
value: 17.6796949796236
- type: nauc_map_at_1_std
value: -13.741140688066045
- type: nauc_map_at_20_diff1
value: 46.027469176858716
- type: nauc_map_at_20_max
value: 21.80738432042703
- type: nauc_map_at_20_std
value: -32.430379634015395
- type: nauc_map_at_3_diff1
value: 48.40096725254027
- type: nauc_map_at_3_max
value: 21.15442803574233
- type: nauc_map_at_3_std
value: -26.205850292181417
- type: nauc_map_at_5_diff1
value: 45.77800041356389
- type: nauc_map_at_5_max
value: 22.11718771798752
- type: nauc_map_at_5_std
value: -30.32876338031471
- type: nauc_mrr_at_1000_diff1
value: 49.748274798877944
- type: nauc_mrr_at_1000_max
value: 24.547774167219906
- type: nauc_mrr_at_1000_std
value: -32.728447209433504
- type: nauc_mrr_at_100_diff1
value: 49.734549290377856
- type: nauc_mrr_at_100_max
value: 24.536933315055222
- type: nauc_mrr_at_100_std
value: -32.74076335880697
- type: nauc_mrr_at_10_diff1
value: 49.82827711456392
- type: nauc_mrr_at_10_max
value: 24.536773657485075
- type: nauc_mrr_at_10_std
value: -33.05707547166962
- type: nauc_mrr_at_1_diff1
value: 51.954289992321044
- type: nauc_mrr_at_1_max
value: 26.336255074856886
- type: nauc_mrr_at_1_std
value: -29.042962019692446
- type: nauc_mrr_at_20_diff1
value: 49.70938465628863
- type: nauc_mrr_at_20_max
value: 24.433219849576947
- type: nauc_mrr_at_20_std
value: -32.94123791846049
- type: nauc_mrr_at_3_diff1
value: 50.289486880347134
- type: nauc_mrr_at_3_max
value: 24.978796972860142
- type: nauc_mrr_at_3_std
value: -32.11305594784892
- type: nauc_mrr_at_5_diff1
value: 49.95013396316144
- type: nauc_mrr_at_5_max
value: 24.514452761198303
- type: nauc_mrr_at_5_std
value: -32.865859962984146
- type: nauc_ndcg_at_1000_diff1
value: 45.73806489233998
- type: nauc_ndcg_at_1000_max
value: 22.404941391043867
- type: nauc_ndcg_at_1000_std
value: -33.063445720849685
- type: nauc_ndcg_at_100_diff1
value: 45.1046206923062
- type: nauc_ndcg_at_100_max
value: 22.081133719684658
- type: nauc_ndcg_at_100_std
value: -33.299291459450146
- type: nauc_ndcg_at_10_diff1
value: 46.140608688357496
- type: nauc_ndcg_at_10_max
value: 21.442489279388916
- type: nauc_ndcg_at_10_std
value: -35.115870342856006
- type: nauc_ndcg_at_1_diff1
value: 51.954289992321044
- type: nauc_ndcg_at_1_max
value: 26.336255074856886
- type: nauc_ndcg_at_1_std
value: -29.042962019692446
- type: nauc_ndcg_at_20_diff1
value: 45.966784725457046
- type: nauc_ndcg_at_20_max
value: 21.166632858613145
- type: nauc_ndcg_at_20_std
value: -35.65112890375392
- type: nauc_ndcg_at_3_diff1
value: 46.7404863978999
- type: nauc_ndcg_at_3_max
value: 22.701743709129456
- type: nauc_ndcg_at_3_std
value: -30.907633466983192
- type: nauc_ndcg_at_5_diff1
value: 45.86487199083486
- type: nauc_ndcg_at_5_max
value: 22.088804840002513
- type: nauc_ndcg_at_5_std
value: -32.3853481632832
- type: nauc_precision_at_1000_diff1
value: -25.69710612774455
- type: nauc_precision_at_1000_max
value: 1.3964400247388091
- type: nauc_precision_at_1000_std
value: -8.873947511634814
- type: nauc_precision_at_100_diff1
value: -24.013497191077978
- type: nauc_precision_at_100_max
value: 2.0197725715909343
- type: nauc_precision_at_100_std
value: -11.387423148770633
- type: nauc_precision_at_10_diff1
value: -6.47728645242781
- type: nauc_precision_at_10_max
value: 6.815261443768304
- type: nauc_precision_at_10_std
value: -26.825062292855943
- type: nauc_precision_at_1_diff1
value: 51.954289992321044
- type: nauc_precision_at_1_max
value: 26.336255074856886
- type: nauc_precision_at_1_std
value: -29.042962019692446
- type: nauc_precision_at_20_diff1
value: -12.355232044747511
- type: nauc_precision_at_20_max
value: 4.022126850949725
- type: nauc_precision_at_20_std
value: -23.688935769326772
- type: nauc_precision_at_3_diff1
value: 7.662671665835864
- type: nauc_precision_at_3_max
value: 14.372394760986248
- type: nauc_precision_at_3_std
value: -28.635125665532453
- type: nauc_precision_at_5_diff1
value: -1.4592476425511611
- type: nauc_precision_at_5_max
value: 11.124310161474174
- type: nauc_precision_at_5_std
value: -27.89526669318053
- type: nauc_recall_at_1000_diff1
value: -19.58450046684932
- type: nauc_recall_at_1000_max
value: 70.71661998133165
- type: nauc_recall_at_1000_std
value: 93.05555555556315
- type: nauc_recall_at_100_diff1
value: 15.06356457571853
- type: nauc_recall_at_100_max
value: 14.051414749344806
- type: nauc_recall_at_100_std
value: -29.461874235153008
- type: nauc_recall_at_10_diff1
value: 41.29842726117901
- type: nauc_recall_at_10_max
value: 15.768699673830898
- type: nauc_recall_at_10_std
value: -42.11585661287712
- type: nauc_recall_at_1_diff1
value: 56.34671160956164
- type: nauc_recall_at_1_max
value: 17.6796949796236
- type: nauc_recall_at_1_std
value: -13.741140688066045
- type: nauc_recall_at_20_diff1
value: 38.8078283585263
- type: nauc_recall_at_20_max
value: 12.06816084005326
- type: nauc_recall_at_20_std
value: -48.20956170056591
- type: nauc_recall_at_3_diff1
value: 44.71028758038993
- type: nauc_recall_at_3_max
value: 19.1059093689162
- type: nauc_recall_at_3_std
value: -26.795164453784253
- type: nauc_recall_at_5_diff1
value: 41.06320797773054
- type: nauc_recall_at_5_max
value: 19.117028272530998
- type: nauc_recall_at_5_std
value: -33.985747504612156
- type: ndcg_at_1
value: 56.95099999999999
- type: ndcg_at_10
value: 64.64
- type: ndcg_at_100
value: 70.017
- type: ndcg_at_1000
value: 70.662
- type: ndcg_at_20
value: 67.256
- type: ndcg_at_3
value: 58.269000000000005
- type: ndcg_at_5
value: 60.94199999999999
- type: precision_at_1
value: 56.95099999999999
- type: precision_at_10
value: 15.671
- type: precision_at_100
value: 2.002
- type: precision_at_1000
value: 0.208
- type: precision_at_20
value: 8.689
- type: precision_at_3
value: 36.341
- type: precision_at_5
value: 26.854
- type: recall_at_1
value: 35.858000000000004
- type: recall_at_10
value: 75.02
- type: recall_at_100
value: 95.76
- type: recall_at_1000
value: 99.837
- type: recall_at_20
value: 83.732
- type: recall_at_3
value: 57.093
- type: recall_at_5
value: 66.193
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (cmn-cmn)
type: jinaai/xpqa
config: cmn-cmn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 69.446
- type: map_at_1
value: 39.995999999999995
- type: map_at_10
value: 64.033
- type: map_at_100
value: 65.51599999999999
- type: map_at_1000
value: 65.545
- type: map_at_20
value: 64.958
- type: map_at_3
value: 57.767
- type: map_at_5
value: 61.998
- type: mrr_at_1
value: 63.3495145631068
- type: mrr_at_10
value: 70.21146363075978
- type: mrr_at_100
value: 70.82810974202124
- type: mrr_at_1000
value: 70.83816803303915
- type: mrr_at_20
value: 70.60140248428802
- type: mrr_at_3
value: 68.66909385113267
- type: mrr_at_5
value: 69.56108414239482
- type: nauc_map_at_1000_diff1
value: 51.649897072831465
- type: nauc_map_at_1000_max
value: 38.25222728655331
- type: nauc_map_at_1000_std
value: -39.10327919949334
- type: nauc_map_at_100_diff1
value: 51.644205886401465
- type: nauc_map_at_100_max
value: 38.23611154355255
- type: nauc_map_at_100_std
value: -39.1677073977285
- type: nauc_map_at_10_diff1
value: 51.81444145636039
- type: nauc_map_at_10_max
value: 38.03382104326485
- type: nauc_map_at_10_std
value: -38.999395639812015
- type: nauc_map_at_1_diff1
value: 59.785298201044704
- type: nauc_map_at_1_max
value: 23.273537759937785
- type: nauc_map_at_1_std
value: -17.838712689290194
- type: nauc_map_at_20_diff1
value: 51.680208795601004
- type: nauc_map_at_20_max
value: 38.23334583518634
- type: nauc_map_at_20_std
value: -39.24344495939061
- type: nauc_map_at_3_diff1
value: 52.180913298194056
- type: nauc_map_at_3_max
value: 33.45482478000481
- type: nauc_map_at_3_std
value: -31.682911030586297
- type: nauc_map_at_5_diff1
value: 50.804900676175436
- type: nauc_map_at_5_max
value: 37.68924816012326
- type: nauc_map_at_5_std
value: -36.85016896616712
- type: nauc_mrr_at_1000_diff1
value: 56.371477471577535
- type: nauc_mrr_at_1000_max
value: 42.773877962050086
- type: nauc_mrr_at_1000_std
value: -40.41765081873682
- type: nauc_mrr_at_100_diff1
value: 56.3619751528192
- type: nauc_mrr_at_100_max
value: 42.76298794859916
- type: nauc_mrr_at_100_std
value: -40.44070582448831
- type: nauc_mrr_at_10_diff1
value: 56.33810523477712
- type: nauc_mrr_at_10_max
value: 42.76591937795783
- type: nauc_mrr_at_10_std
value: -40.69339583030244
- type: nauc_mrr_at_1_diff1
value: 58.90399906884378
- type: nauc_mrr_at_1_max
value: 43.38806571165292
- type: nauc_mrr_at_1_std
value: -38.224015285584
- type: nauc_mrr_at_20_diff1
value: 56.32629070537032
- type: nauc_mrr_at_20_max
value: 42.79615263472604
- type: nauc_mrr_at_20_std
value: -40.496777397603076
- type: nauc_mrr_at_3_diff1
value: 55.96989454480743
- type: nauc_mrr_at_3_max
value: 42.49832220744744
- type: nauc_mrr_at_3_std
value: -39.883799467132384
- type: nauc_mrr_at_5_diff1
value: 56.003080766475755
- type: nauc_mrr_at_5_max
value: 42.73308051011805
- type: nauc_mrr_at_5_std
value: -39.87179511166683
- type: nauc_ndcg_at_1000_diff1
value: 52.49054229225255
- type: nauc_ndcg_at_1000_max
value: 39.61644750719859
- type: nauc_ndcg_at_1000_std
value: -40.89845763194674
- type: nauc_ndcg_at_100_diff1
value: 52.33511250864434
- type: nauc_ndcg_at_100_max
value: 39.25530146124452
- type: nauc_ndcg_at_100_std
value: -41.92444498004374
- type: nauc_ndcg_at_10_diff1
value: 52.62031505931842
- type: nauc_ndcg_at_10_max
value: 38.667195545396766
- type: nauc_ndcg_at_10_std
value: -42.59503924641507
- type: nauc_ndcg_at_1_diff1
value: 58.90399906884378
- type: nauc_ndcg_at_1_max
value: 43.38806571165292
- type: nauc_ndcg_at_1_std
value: -38.224015285584
- type: nauc_ndcg_at_20_diff1
value: 52.15061629809436
- type: nauc_ndcg_at_20_max
value: 39.09332400054708
- type: nauc_ndcg_at_20_std
value: -42.80018671618001
- type: nauc_ndcg_at_3_diff1
value: 51.04210728138207
- type: nauc_ndcg_at_3_max
value: 38.19034802567046
- type: nauc_ndcg_at_3_std
value: -38.179821090765216
- type: nauc_ndcg_at_5_diff1
value: 51.04399574045204
- type: nauc_ndcg_at_5_max
value: 38.42492210204548
- type: nauc_ndcg_at_5_std
value: -38.868073241617715
- type: nauc_precision_at_1000_diff1
value: -25.151369907213734
- type: nauc_precision_at_1000_max
value: 9.012549147054989
- type: nauc_precision_at_1000_std
value: -9.319786589947698
- type: nauc_precision_at_100_diff1
value: -23.20945211843088
- type: nauc_precision_at_100_max
value: 9.860701593969862
- type: nauc_precision_at_100_std
value: -13.073877818347231
- type: nauc_precision_at_10_diff1
value: -6.970781124246847
- type: nauc_precision_at_10_max
value: 19.392675322254487
- type: nauc_precision_at_10_std
value: -26.74943490717657
- type: nauc_precision_at_1_diff1
value: 58.90399906884378
- type: nauc_precision_at_1_max
value: 43.38806571165292
- type: nauc_precision_at_1_std
value: -38.224015285584
- type: nauc_precision_at_20_diff1
value: -13.046456108081102
- type: nauc_precision_at_20_max
value: 15.69439950383875
- type: nauc_precision_at_20_std
value: -23.836004512018093
- type: nauc_precision_at_3_diff1
value: 3.5444232965528846
- type: nauc_precision_at_3_max
value: 27.08858445453865
- type: nauc_precision_at_3_std
value: -29.12757283665593
- type: nauc_precision_at_5_diff1
value: -3.6853986353320267
- type: nauc_precision_at_5_max
value: 24.32059689571271
- type: nauc_precision_at_5_std
value: -27.46188072134163
- type: nauc_recall_at_1000_diff1
value: 86.93515141907919
- type: nauc_recall_at_1000_max
value: 100.0
- type: nauc_recall_at_1000_std
value: 100.0
- type: nauc_recall_at_100_diff1
value: 39.7052887613879
- type: nauc_recall_at_100_max
value: 18.40943977796887
- type: nauc_recall_at_100_std
value: -88.74014854144974
- type: nauc_recall_at_10_diff1
value: 48.85342500870892
- type: nauc_recall_at_10_max
value: 32.69617204234419
- type: nauc_recall_at_10_std
value: -51.9937231860804
- type: nauc_recall_at_1_diff1
value: 59.785298201044704
- type: nauc_recall_at_1_max
value: 23.273537759937785
- type: nauc_recall_at_1_std
value: -17.838712689290194
- type: nauc_recall_at_20_diff1
value: 45.40839773314378
- type: nauc_recall_at_20_max
value: 33.02458321493215
- type: nauc_recall_at_20_std
value: -55.97800739448166
- type: nauc_recall_at_3_diff1
value: 47.05565693416531
- type: nauc_recall_at_3_max
value: 28.743850400344297
- type: nauc_recall_at_3_std
value: -32.436470486397475
- type: nauc_recall_at_5_diff1
value: 45.30223758669577
- type: nauc_recall_at_5_max
value: 33.6567274747059
- type: nauc_recall_at_5_std
value: -39.946712017948514
- type: ndcg_at_1
value: 63.349999999999994
- type: ndcg_at_10
value: 69.446
- type: ndcg_at_100
value: 74.439
- type: ndcg_at_1000
value: 74.834
- type: ndcg_at_20
value: 71.763
- type: ndcg_at_3
value: 64.752
- type: ndcg_at_5
value: 66.316
- type: precision_at_1
value: 63.349999999999994
- type: precision_at_10
value: 16.286
- type: precision_at_100
value: 2.024
- type: precision_at_1000
value: 0.207
- type: precision_at_20
value: 8.908000000000001
- type: precision_at_3
value: 40.655
- type: precision_at_5
value: 28.859
- type: recall_at_1
value: 39.995999999999995
- type: recall_at_10
value: 78.107
- type: recall_at_100
value: 97.538
- type: recall_at_1000
value: 99.96000000000001
- type: recall_at_20
value: 85.72
- type: recall_at_3
value: 63.291
- type: recall_at_5
value: 70.625
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (spa-eng)
type: jinaai/xpqa
config: spa-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 68.258
- type: map_at_1
value: 33.06
- type: map_at_10
value: 61.590999999999994
- type: map_at_100
value: 63.341
- type: map_at_1000
value: 63.385999999999996
- type: map_at_20
value: 62.77700000000001
- type: map_at_3
value: 52.547999999999995
- type: map_at_5
value: 58.824
- type: mrr_at_1
value: 63.80832282471627
- type: mrr_at_10
value: 70.76848015372607
- type: mrr_at_100
value: 71.33996704518061
- type: mrr_at_1000
value: 71.35368444388072
- type: mrr_at_20
value: 71.18191741103522
- type: mrr_at_3
value: 68.83144178226142
- type: mrr_at_5
value: 69.88440521227405
- type: nauc_map_at_1000_diff1
value: 41.59255746310511
- type: nauc_map_at_1000_max
value: 42.064075373358065
- type: nauc_map_at_1000_std
value: -25.130730194381723
- type: nauc_map_at_100_diff1
value: 41.56447648820406
- type: nauc_map_at_100_max
value: 42.06711634651607
- type: nauc_map_at_100_std
value: -25.14871585556968
- type: nauc_map_at_10_diff1
value: 41.28968387107058
- type: nauc_map_at_10_max
value: 41.511538272139774
- type: nauc_map_at_10_std
value: -25.99906440164276
- type: nauc_map_at_1_diff1
value: 51.09859596320021
- type: nauc_map_at_1_max
value: 12.406789321338222
- type: nauc_map_at_1_std
value: -18.227486548655076
- type: nauc_map_at_20_diff1
value: 41.39469672947315
- type: nauc_map_at_20_max
value: 41.98309315808902
- type: nauc_map_at_20_std
value: -25.44704720985219
- type: nauc_map_at_3_diff1
value: 43.16164995512842
- type: nauc_map_at_3_max
value: 30.935400935562818
- type: nauc_map_at_3_std
value: -23.53095555148866
- type: nauc_map_at_5_diff1
value: 41.23474352142375
- type: nauc_map_at_5_max
value: 39.03088859147947
- type: nauc_map_at_5_std
value: -26.046526443708366
- type: nauc_mrr_at_1000_diff1
value: 51.79649678213789
- type: nauc_mrr_at_1000_max
value: 50.50340748045259
- type: nauc_mrr_at_1000_std
value: -24.777183703493407
- type: nauc_mrr_at_100_diff1
value: 51.78609028166551
- type: nauc_mrr_at_100_max
value: 50.51732896833555
- type: nauc_mrr_at_100_std
value: -24.760054686874717
- type: nauc_mrr_at_10_diff1
value: 51.705268395036995
- type: nauc_mrr_at_10_max
value: 50.35818415293149
- type: nauc_mrr_at_10_std
value: -25.170367120250404
- type: nauc_mrr_at_1_diff1
value: 53.91475115581825
- type: nauc_mrr_at_1_max
value: 49.122529616282016
- type: nauc_mrr_at_1_std
value: -22.377647552937155
- type: nauc_mrr_at_20_diff1
value: 51.778984221197774
- type: nauc_mrr_at_20_max
value: 50.5070957827813
- type: nauc_mrr_at_20_std
value: -24.908935023607285
- type: nauc_mrr_at_3_diff1
value: 51.82683773090423
- type: nauc_mrr_at_3_max
value: 50.77993196421369
- type: nauc_mrr_at_3_std
value: -24.3925832021831
- type: nauc_mrr_at_5_diff1
value: 51.722232683543034
- type: nauc_mrr_at_5_max
value: 50.334865493961864
- type: nauc_mrr_at_5_std
value: -25.513593495703297
- type: nauc_ndcg_at_1000_diff1
value: 44.21851582991263
- type: nauc_ndcg_at_1000_max
value: 45.73539068637836
- type: nauc_ndcg_at_1000_std
value: -24.716522467580397
- type: nauc_ndcg_at_100_diff1
value: 43.8002401615357
- type: nauc_ndcg_at_100_max
value: 45.801409410061915
- type: nauc_ndcg_at_100_std
value: -24.73171742499903
- type: nauc_ndcg_at_10_diff1
value: 42.540922778755885
- type: nauc_ndcg_at_10_max
value: 44.348836943874595
- type: nauc_ndcg_at_10_std
value: -28.05403666494785
- type: nauc_ndcg_at_1_diff1
value: 53.91475115581825
- type: nauc_ndcg_at_1_max
value: 49.122529616282016
- type: nauc_ndcg_at_1_std
value: -22.377647552937155
- type: nauc_ndcg_at_20_diff1
value: 43.10347921163421
- type: nauc_ndcg_at_20_max
value: 45.53253270265022
- type: nauc_ndcg_at_20_std
value: -26.63902791862846
- type: nauc_ndcg_at_3_diff1
value: 42.41720274782384
- type: nauc_ndcg_at_3_max
value: 42.91778219334943
- type: nauc_ndcg_at_3_std
value: -24.793252033594076
- type: nauc_ndcg_at_5_diff1
value: 42.51515034945093
- type: nauc_ndcg_at_5_max
value: 41.62080576508792
- type: nauc_ndcg_at_5_std
value: -28.209669314955065
- type: nauc_precision_at_1000_diff1
value: -14.89794075433148
- type: nauc_precision_at_1000_max
value: 27.85387929356412
- type: nauc_precision_at_1000_std
value: 10.728618597190849
- type: nauc_precision_at_100_diff1
value: -13.075270046295856
- type: nauc_precision_at_100_max
value: 29.77208946756632
- type: nauc_precision_at_100_std
value: 8.491662697326039
- type: nauc_precision_at_10_diff1
value: -4.0826025188781205
- type: nauc_precision_at_10_max
value: 39.04278085180075
- type: nauc_precision_at_10_std
value: -5.925408651372333
- type: nauc_precision_at_1_diff1
value: 53.91475115581825
- type: nauc_precision_at_1_max
value: 49.122529616282016
- type: nauc_precision_at_1_std
value: -22.377647552937155
- type: nauc_precision_at_20_diff1
value: -7.93186440645135
- type: nauc_precision_at_20_max
value: 35.81281308891365
- type: nauc_precision_at_20_std
value: 0.1241277857515697
- type: nauc_precision_at_3_diff1
value: 7.563562511484409
- type: nauc_precision_at_3_max
value: 43.43738862378524
- type: nauc_precision_at_3_std
value: -11.958059731912615
- type: nauc_precision_at_5_diff1
value: -0.1801152449011624
- type: nauc_precision_at_5_max
value: 41.32486715619513
- type: nauc_precision_at_5_std
value: -10.088699021919552
- type: nauc_recall_at_1000_diff1
value: 86.93359696819986
- type: nauc_recall_at_1000_max
value: 100.0
- type: nauc_recall_at_1000_std
value: 72.21843645604022
- type: nauc_recall_at_100_diff1
value: 29.86050842714198
- type: nauc_recall_at_100_max
value: 48.106658251136245
- type: nauc_recall_at_100_std
value: -14.981886214880035
- type: nauc_recall_at_10_diff1
value: 33.67119240737528
- type: nauc_recall_at_10_max
value: 39.271984859561414
- type: nauc_recall_at_10_std
value: -35.6434883839217
- type: nauc_recall_at_1_diff1
value: 51.09859596320021
- type: nauc_recall_at_1_max
value: 12.406789321338222
- type: nauc_recall_at_1_std
value: -18.227486548655076
- type: nauc_recall_at_20_diff1
value: 33.211979983240724
- type: nauc_recall_at_20_max
value: 43.47676074743184
- type: nauc_recall_at_20_std
value: -33.88107138395349
- type: nauc_recall_at_3_diff1
value: 39.22513750146998
- type: nauc_recall_at_3_max
value: 27.066674083840166
- type: nauc_recall_at_3_std
value: -26.963282529629893
- type: nauc_recall_at_5_diff1
value: 36.53718917129459
- type: nauc_recall_at_5_max
value: 35.40550013169686
- type: nauc_recall_at_5_std
value: -34.209159379410806
- type: ndcg_at_1
value: 63.808
- type: ndcg_at_10
value: 68.258
- type: ndcg_at_100
value: 73.38799999999999
- type: ndcg_at_1000
value: 74.03
- type: ndcg_at_20
value: 70.968
- type: ndcg_at_3
value: 62.33
- type: ndcg_at_5
value: 64.096
- type: precision_at_1
value: 63.808
- type: precision_at_10
value: 19.243
- type: precision_at_100
value: 2.367
- type: precision_at_1000
value: 0.245
- type: precision_at_20
value: 10.599
- type: precision_at_3
value: 44.515
- type: precision_at_5
value: 33.467999999999996
- type: recall_at_1
value: 33.06
- type: recall_at_10
value: 77.423
- type: recall_at_100
value: 95.923
- type: recall_at_1000
value: 99.874
- type: recall_at_20
value: 85.782
- type: recall_at_3
value: 57.098000000000006
- type: recall_at_5
value: 67.472
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (spa-spa)
type: jinaai/xpqa
config: spa-spa
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 72.004
- type: map_at_1
value: 36.248000000000005
- type: map_at_10
value: 65.679
- type: map_at_100
value: 67.22399999999999
- type: map_at_1000
value: 67.264
- type: map_at_20
value: 66.705
- type: map_at_3
value: 56.455
- type: map_at_5
value: 62.997
- type: mrr_at_1
value: 67.71752837326608
- type: mrr_at_10
value: 74.59782021257429
- type: mrr_at_100
value: 75.0640960767943
- type: mrr_at_1000
value: 75.07324799466076
- type: mrr_at_20
value: 74.9323963386884
- type: mrr_at_3
value: 72.95081967213115
- type: mrr_at_5
value: 73.82723833543506
- type: nauc_map_at_1000_diff1
value: 43.111810717567714
- type: nauc_map_at_1000_max
value: 44.835247208972476
- type: nauc_map_at_1000_std
value: -32.798405973931985
- type: nauc_map_at_100_diff1
value: 43.090223482932764
- type: nauc_map_at_100_max
value: 44.83392441557943
- type: nauc_map_at_100_std
value: -32.81149166676563
- type: nauc_map_at_10_diff1
value: 42.87841934951979
- type: nauc_map_at_10_max
value: 43.9838653389494
- type: nauc_map_at_10_std
value: -33.588084643627084
- type: nauc_map_at_1_diff1
value: 54.509245848379095
- type: nauc_map_at_1_max
value: 10.05921648322742
- type: nauc_map_at_1_std
value: -24.652326014826762
- type: nauc_map_at_20_diff1
value: 43.07468612984794
- type: nauc_map_at_20_max
value: 44.75663122615032
- type: nauc_map_at_20_std
value: -33.11788887878321
- type: nauc_map_at_3_diff1
value: 44.63272828938906
- type: nauc_map_at_3_max
value: 32.1584369869227
- type: nauc_map_at_3_std
value: -30.761662210142944
- type: nauc_map_at_5_diff1
value: 42.77296997803048
- type: nauc_map_at_5_max
value: 41.78894616737652
- type: nauc_map_at_5_std
value: -33.56459774477362
- type: nauc_mrr_at_1000_diff1
value: 53.097544131833494
- type: nauc_mrr_at_1000_max
value: 50.61134979184588
- type: nauc_mrr_at_1000_std
value: -35.6221191487669
- type: nauc_mrr_at_100_diff1
value: 53.096609856182106
- type: nauc_mrr_at_100_max
value: 50.61951585642645
- type: nauc_mrr_at_100_std
value: -35.62396157508327
- type: nauc_mrr_at_10_diff1
value: 52.771534471912304
- type: nauc_mrr_at_10_max
value: 50.430863224435726
- type: nauc_mrr_at_10_std
value: -36.027992076620365
- type: nauc_mrr_at_1_diff1
value: 55.05316238884337
- type: nauc_mrr_at_1_max
value: 49.461858515275196
- type: nauc_mrr_at_1_std
value: -31.87492636319712
- type: nauc_mrr_at_20_diff1
value: 53.083253469629746
- type: nauc_mrr_at_20_max
value: 50.62156424256193
- type: nauc_mrr_at_20_std
value: -35.879153692447154
- type: nauc_mrr_at_3_diff1
value: 52.98283109188415
- type: nauc_mrr_at_3_max
value: 50.83561260429378
- type: nauc_mrr_at_3_std
value: -35.30839538038797
- type: nauc_mrr_at_5_diff1
value: 52.93270510879709
- type: nauc_mrr_at_5_max
value: 50.54595596761199
- type: nauc_mrr_at_5_std
value: -35.84059376434395
- type: nauc_ndcg_at_1000_diff1
value: 45.343685089209416
- type: nauc_ndcg_at_1000_max
value: 47.801141576669465
- type: nauc_ndcg_at_1000_std
value: -33.512958862879195
- type: nauc_ndcg_at_100_diff1
value: 45.255590461515894
- type: nauc_ndcg_at_100_max
value: 47.99240031881967
- type: nauc_ndcg_at_100_std
value: -33.614465006695205
- type: nauc_ndcg_at_10_diff1
value: 43.93472511731019
- type: nauc_ndcg_at_10_max
value: 45.92599752897053
- type: nauc_ndcg_at_10_std
value: -36.43629114491574
- type: nauc_ndcg_at_1_diff1
value: 55.05316238884337
- type: nauc_ndcg_at_1_max
value: 49.461858515275196
- type: nauc_ndcg_at_1_std
value: -31.87492636319712
- type: nauc_ndcg_at_20_diff1
value: 44.93534591273201
- type: nauc_ndcg_at_20_max
value: 47.55153940713458
- type: nauc_ndcg_at_20_std
value: -35.56392448745206
- type: nauc_ndcg_at_3_diff1
value: 43.17916122133396
- type: nauc_ndcg_at_3_max
value: 45.603634205103276
- type: nauc_ndcg_at_3_std
value: -32.473227507181214
- type: nauc_ndcg_at_5_diff1
value: 44.10242961669216
- type: nauc_ndcg_at_5_max
value: 43.61666669031808
- type: nauc_ndcg_at_5_std
value: -35.98808321497782
- type: nauc_precision_at_1000_diff1
value: -23.264714449991146
- type: nauc_precision_at_1000_max
value: 28.505729576735465
- type: nauc_precision_at_1000_std
value: 11.987379232920926
- type: nauc_precision_at_100_diff1
value: -21.156119174614627
- type: nauc_precision_at_100_max
value: 30.711646221646255
- type: nauc_precision_at_100_std
value: 9.650486536340322
- type: nauc_precision_at_10_diff1
value: -10.98001328477502
- type: nauc_precision_at_10_max
value: 39.25638073760597
- type: nauc_precision_at_10_std
value: -4.3456859257488
- type: nauc_precision_at_1_diff1
value: 55.05316238884337
- type: nauc_precision_at_1_max
value: 49.461858515275196
- type: nauc_precision_at_1_std
value: -31.87492636319712
- type: nauc_precision_at_20_diff1
value: -14.97565390664424
- type: nauc_precision_at_20_max
value: 36.383835295942355
- type: nauc_precision_at_20_std
value: 1.525158880381114
- type: nauc_precision_at_3_diff1
value: 1.0448345623903483
- type: nauc_precision_at_3_max
value: 45.69772060667404
- type: nauc_precision_at_3_std
value: -13.002685018948293
- type: nauc_precision_at_5_diff1
value: -5.434185597628904
- type: nauc_precision_at_5_max
value: 42.99162431099203
- type: nauc_precision_at_5_std
value: -9.789308817624534
- type: nauc_recall_at_1000_diff1
value: 12.309303236094845
- type: nauc_recall_at_1000_max
value: 100.0
- type: nauc_recall_at_1000_std
value: 86.93359696819986
- type: nauc_recall_at_100_diff1
value: 39.093544920901415
- type: nauc_recall_at_100_max
value: 55.62814395062938
- type: nauc_recall_at_100_std
value: -22.6919033301514
- type: nauc_recall_at_10_diff1
value: 35.50100141633622
- type: nauc_recall_at_10_max
value: 39.25750019586647
- type: nauc_recall_at_10_std
value: -43.01273078031791
- type: nauc_recall_at_1_diff1
value: 54.509245848379095
- type: nauc_recall_at_1_max
value: 10.05921648322742
- type: nauc_recall_at_1_std
value: -24.652326014826762
- type: nauc_recall_at_20_diff1
value: 38.1281707132327
- type: nauc_recall_at_20_max
value: 43.97950642900301
- type: nauc_recall_at_20_std
value: -44.049952771307574
- type: nauc_recall_at_3_diff1
value: 40.01986938242728
- type: nauc_recall_at_3_max
value: 27.517114421061173
- type: nauc_recall_at_3_std
value: -32.99056780232045
- type: nauc_recall_at_5_diff1
value: 38.52035606499483
- type: nauc_recall_at_5_max
value: 37.05834604678859
- type: nauc_recall_at_5_std
value: -39.86196378897912
- type: ndcg_at_1
value: 67.718
- type: ndcg_at_10
value: 72.004
- type: ndcg_at_100
value: 76.554
- type: ndcg_at_1000
value: 77.07300000000001
- type: ndcg_at_20
value: 74.37899999999999
- type: ndcg_at_3
value: 66.379
- type: ndcg_at_5
value: 68.082
- type: precision_at_1
value: 67.718
- type: precision_at_10
value: 19.849
- type: precision_at_100
value: 2.3800000000000003
- type: precision_at_1000
value: 0.245
- type: precision_at_20
value: 10.813
- type: precision_at_3
value: 46.574
- type: precision_at_5
value: 34.83
- type: recall_at_1
value: 36.248000000000005
- type: recall_at_10
value: 80.252
- type: recall_at_100
value: 96.73
- type: recall_at_1000
value: 99.874
- type: recall_at_20
value: 87.703
- type: recall_at_3
value: 60.815
- type: recall_at_5
value: 71.16
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fra-eng)
type: jinaai/xpqa
config: fra-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 73.729
- type: map_at_1
value: 43.964999999999996
- type: map_at_10
value: 67.803
- type: map_at_100
value: 69.188
- type: map_at_1000
value: 69.21000000000001
- type: map_at_20
value: 68.747
- type: map_at_3
value: 60.972
- type: map_at_5
value: 65.39399999999999
- type: mrr_at_1
value: 68.4913217623498
- type: mrr_at_10
value: 75.2600822260368
- type: mrr_at_100
value: 75.6599169808848
- type: mrr_at_1000
value: 75.66720883727534
- type: mrr_at_20
value: 75.52375865860405
- type: mrr_at_3
value: 73.54250111259452
- type: mrr_at_5
value: 74.51713395638626
- type: nauc_map_at_1000_diff1
value: 46.81533703002097
- type: nauc_map_at_1000_max
value: 46.30794757084772
- type: nauc_map_at_1000_std
value: -14.953470500312335
- type: nauc_map_at_100_diff1
value: 46.82464740277745
- type: nauc_map_at_100_max
value: 46.32852879948254
- type: nauc_map_at_100_std
value: -14.950035098066172
- type: nauc_map_at_10_diff1
value: 46.31406143369831
- type: nauc_map_at_10_max
value: 45.337593270786634
- type: nauc_map_at_10_std
value: -16.011789445907876
- type: nauc_map_at_1_diff1
value: 57.097134715065835
- type: nauc_map_at_1_max
value: 21.93931500350721
- type: nauc_map_at_1_std
value: -15.134457251301637
- type: nauc_map_at_20_diff1
value: 46.47030891134173
- type: nauc_map_at_20_max
value: 46.29169960276292
- type: nauc_map_at_20_std
value: -15.14241106541829
- type: nauc_map_at_3_diff1
value: 50.27064228648596
- type: nauc_map_at_3_max
value: 39.43058773971639
- type: nauc_map_at_3_std
value: -16.16545993089126
- type: nauc_map_at_5_diff1
value: 46.974867679747426
- type: nauc_map_at_5_max
value: 44.31091104855002
- type: nauc_map_at_5_std
value: -16.50175337658926
- type: nauc_mrr_at_1000_diff1
value: 55.20294005110399
- type: nauc_mrr_at_1000_max
value: 51.947725719119966
- type: nauc_mrr_at_1000_std
value: -14.586112939597232
- type: nauc_mrr_at_100_diff1
value: 55.20426251109304
- type: nauc_mrr_at_100_max
value: 51.95648725402534
- type: nauc_mrr_at_100_std
value: -14.579769236539143
- type: nauc_mrr_at_10_diff1
value: 54.93870506205835
- type: nauc_mrr_at_10_max
value: 51.89312772900638
- type: nauc_mrr_at_10_std
value: -14.692635010092939
- type: nauc_mrr_at_1_diff1
value: 56.54945935175171
- type: nauc_mrr_at_1_max
value: 51.28134504197991
- type: nauc_mrr_at_1_std
value: -12.909042186563061
- type: nauc_mrr_at_20_diff1
value: 55.10667018041461
- type: nauc_mrr_at_20_max
value: 51.98236870783707
- type: nauc_mrr_at_20_std
value: -14.599377575198025
- type: nauc_mrr_at_3_diff1
value: 55.67124311746892
- type: nauc_mrr_at_3_max
value: 51.77903236246767
- type: nauc_mrr_at_3_std
value: -14.94452633860763
- type: nauc_mrr_at_5_diff1
value: 55.42849172366371
- type: nauc_mrr_at_5_max
value: 51.76902965753959
- type: nauc_mrr_at_5_std
value: -15.357993534727072
- type: nauc_ndcg_at_1000_diff1
value: 48.736844959280326
- type: nauc_ndcg_at_1000_max
value: 48.92891159935398
- type: nauc_ndcg_at_1000_std
value: -13.983968675611056
- type: nauc_ndcg_at_100_diff1
value: 48.73859328503975
- type: nauc_ndcg_at_100_max
value: 49.31867149556439
- type: nauc_ndcg_at_100_std
value: -13.72387564912742
- type: nauc_ndcg_at_10_diff1
value: 46.50313862975287
- type: nauc_ndcg_at_10_max
value: 47.13599793554596
- type: nauc_ndcg_at_10_std
value: -16.317919977400113
- type: nauc_ndcg_at_1_diff1
value: 56.54945935175171
- type: nauc_ndcg_at_1_max
value: 51.28134504197991
- type: nauc_ndcg_at_1_std
value: -12.909042186563061
- type: nauc_ndcg_at_20_diff1
value: 47.01727117133912
- type: nauc_ndcg_at_20_max
value: 49.121366036709105
- type: nauc_ndcg_at_20_std
value: -14.411078677638775
- type: nauc_ndcg_at_3_diff1
value: 49.229581145458276
- type: nauc_ndcg_at_3_max
value: 47.427609717032
- type: nauc_ndcg_at_3_std
value: -16.52066627289908
- type: nauc_ndcg_at_5_diff1
value: 48.0152514127505
- type: nauc_ndcg_at_5_max
value: 46.12152407850816
- type: nauc_ndcg_at_5_std
value: -17.613295491954656
- type: nauc_precision_at_1000_diff1
value: -25.959006032642463
- type: nauc_precision_at_1000_max
value: 12.81002362947137
- type: nauc_precision_at_1000_std
value: 12.575312826061513
- type: nauc_precision_at_100_diff1
value: -24.35413527283394
- type: nauc_precision_at_100_max
value: 14.878359236477303
- type: nauc_precision_at_100_std
value: 12.384426050018428
- type: nauc_precision_at_10_diff1
value: -17.93220761770618
- type: nauc_precision_at_10_max
value: 23.523485811847294
- type: nauc_precision_at_10_std
value: 4.424456968716939
- type: nauc_precision_at_1_diff1
value: 56.54945935175171
- type: nauc_precision_at_1_max
value: 51.28134504197991
- type: nauc_precision_at_1_std
value: -12.909042186563061
- type: nauc_precision_at_20_diff1
value: -21.776871398686936
- type: nauc_precision_at_20_max
value: 21.18436338264366
- type: nauc_precision_at_20_std
value: 9.937274986573321
- type: nauc_precision_at_3_diff1
value: -1.2411845580934435
- type: nauc_precision_at_3_max
value: 34.962281941875
- type: nauc_precision_at_3_std
value: -2.447892908501237
- type: nauc_precision_at_5_diff1
value: -11.134164534114085
- type: nauc_precision_at_5_max
value: 30.22079740070525
- type: nauc_precision_at_5_std
value: -0.24232594421765946
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 43.3647412452869
- type: nauc_recall_at_100_max
value: 63.50094950500327
- type: nauc_recall_at_100_std
value: 2.3911909633714044
- type: nauc_recall_at_10_diff1
value: 33.993445071666855
- type: nauc_recall_at_10_max
value: 41.38694129134144
- type: nauc_recall_at_10_std
value: -19.308698266099096
- type: nauc_recall_at_1_diff1
value: 57.097134715065835
- type: nauc_recall_at_1_max
value: 21.93931500350721
- type: nauc_recall_at_1_std
value: -15.134457251301637
- type: nauc_recall_at_20_diff1
value: 32.03888531880772
- type: nauc_recall_at_20_max
value: 49.660787482562085
- type: nauc_recall_at_20_std
value: -12.641456758778382
- type: nauc_recall_at_3_diff1
value: 47.94527082900579
- type: nauc_recall_at_3_max
value: 36.51733131437679
- type: nauc_recall_at_3_std
value: -18.65511713247495
- type: nauc_recall_at_5_diff1
value: 42.04545772092305
- type: nauc_recall_at_5_max
value: 41.21440912972303
- type: nauc_recall_at_5_std
value: -21.47386527081128
- type: ndcg_at_1
value: 68.491
- type: ndcg_at_10
value: 73.729
- type: ndcg_at_100
value: 77.684
- type: ndcg_at_1000
value: 78.084
- type: ndcg_at_20
value: 75.795
- type: ndcg_at_3
value: 68.568
- type: ndcg_at_5
value: 70.128
- type: precision_at_1
value: 68.491
- type: precision_at_10
value: 16.996
- type: precision_at_100
value: 2.023
- type: precision_at_1000
value: 0.207
- type: precision_at_20
value: 9.246
- type: precision_at_3
value: 41.923
- type: precision_at_5
value: 29.826000000000004
- type: recall_at_1
value: 43.964999999999996
- type: recall_at_10
value: 82.777
- type: recall_at_100
value: 97.287
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 89.183
- type: recall_at_3
value: 65.803
- type: recall_at_5
value: 74.119
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fr)
type: jinaai/xpqa
config: fra-fra
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: main_score
value: 77.581
- type: map_at_1
value: 46.444
- type: map_at_10
value: 72.084
- type: map_at_100
value: 73.175
- type: map_at_1000
value: 73.193
- type: map_at_20
value: 72.77799999999999
- type: map_at_3
value: 65.242
- type: map_at_5
value: 69.926
- type: mrr_at_1
value: 71.82910547396529
- type: mrr_at_10
value: 78.66594612923046
- type: mrr_at_100
value: 78.97334934049613
- type: mrr_at_1000
value: 78.97687021803557
- type: mrr_at_20
value: 78.85701141744282
- type: mrr_at_3
value: 76.96929238985311
- type: mrr_at_5
value: 77.99732977303067
- type: nauc_map_at_1000_diff1
value: 49.090956807097804
- type: nauc_map_at_1000_max
value: 52.01095354889508
- type: nauc_map_at_1000_std
value: -12.182870421711026
- type: nauc_map_at_100_diff1
value: 49.091664766684566
- type: nauc_map_at_100_max
value: 52.017499797253755
- type: nauc_map_at_100_std
value: -12.188342487271528
- type: nauc_map_at_10_diff1
value: 48.6619338205362
- type: nauc_map_at_10_max
value: 50.93591260329888
- type: nauc_map_at_10_std
value: -12.899399261673365
- type: nauc_map_at_1_diff1
value: 61.89699552471587
- type: nauc_map_at_1_max
value: 22.387748207421946
- type: nauc_map_at_1_std
value: -17.139518194308437
- type: nauc_map_at_20_diff1
value: 48.72828404686453
- type: nauc_map_at_20_max
value: 51.781074586075434
- type: nauc_map_at_20_std
value: -12.174270605093136
- type: nauc_map_at_3_diff1
value: 53.11509580126934
- type: nauc_map_at_3_max
value: 42.1768380145106
- type: nauc_map_at_3_std
value: -14.98340833032363
- type: nauc_map_at_5_diff1
value: 49.60521390803235
- type: nauc_map_at_5_max
value: 49.80360562029127
- type: nauc_map_at_5_std
value: -13.900652140457618
- type: nauc_mrr_at_1000_diff1
value: 58.10782478654255
- type: nauc_mrr_at_1000_max
value: 61.31083013535486
- type: nauc_mrr_at_1000_std
value: -9.624904298545921
- type: nauc_mrr_at_100_diff1
value: 58.11041683306092
- type: nauc_mrr_at_100_max
value: 61.31590199755797
- type: nauc_mrr_at_100_std
value: -9.625991053580865
- type: nauc_mrr_at_10_diff1
value: 57.883701815695375
- type: nauc_mrr_at_10_max
value: 61.36276126424689
- type: nauc_mrr_at_10_std
value: -9.495072468420386
- type: nauc_mrr_at_1_diff1
value: 60.18176977079093
- type: nauc_mrr_at_1_max
value: 59.697615236642555
- type: nauc_mrr_at_1_std
value: -9.396133077966779
- type: nauc_mrr_at_20_diff1
value: 57.964817434006754
- type: nauc_mrr_at_20_max
value: 61.34073539502932
- type: nauc_mrr_at_20_std
value: -9.602378876645131
- type: nauc_mrr_at_3_diff1
value: 58.44338049427257
- type: nauc_mrr_at_3_max
value: 60.92272989411293
- type: nauc_mrr_at_3_std
value: -9.928970439416162
- type: nauc_mrr_at_5_diff1
value: 58.01513016866578
- type: nauc_mrr_at_5_max
value: 61.46805302986586
- type: nauc_mrr_at_5_std
value: -9.842227002440984
- type: nauc_ndcg_at_1000_diff1
value: 50.99293152828167
- type: nauc_ndcg_at_1000_max
value: 56.14232784664811
- type: nauc_ndcg_at_1000_std
value: -10.529213072410288
- type: nauc_ndcg_at_100_diff1
value: 50.99385944312529
- type: nauc_ndcg_at_100_max
value: 56.34825518954588
- type: nauc_ndcg_at_100_std
value: -10.398943874846047
- type: nauc_ndcg_at_10_diff1
value: 48.51273364357823
- type: nauc_ndcg_at_10_max
value: 53.77871849486298
- type: nauc_ndcg_at_10_std
value: -11.82105972112472
- type: nauc_ndcg_at_1_diff1
value: 60.18176977079093
- type: nauc_ndcg_at_1_max
value: 59.697615236642555
- type: nauc_ndcg_at_1_std
value: -9.396133077966779
- type: nauc_ndcg_at_20_diff1
value: 49.04268319033412
- type: nauc_ndcg_at_20_max
value: 55.47011381097071
- type: nauc_ndcg_at_20_std
value: -10.486452945493042
- type: nauc_ndcg_at_3_diff1
value: 50.95112745400584
- type: nauc_ndcg_at_3_max
value: 53.45473828705577
- type: nauc_ndcg_at_3_std
value: -13.420699384045728
- type: nauc_ndcg_at_5_diff1
value: 50.313156212000074
- type: nauc_ndcg_at_5_max
value: 52.78539129309866
- type: nauc_ndcg_at_5_std
value: -13.586274096509122
- type: nauc_precision_at_1000_diff1
value: -31.13772049254778
- type: nauc_precision_at_1000_max
value: 17.2847598361294
- type: nauc_precision_at_1000_std
value: 15.497531773816887
- type: nauc_precision_at_100_diff1
value: -29.98812263553739
- type: nauc_precision_at_100_max
value: 19.048620003227654
- type: nauc_precision_at_100_std
value: 15.38499952171958
- type: nauc_precision_at_10_diff1
value: -25.33028097412579
- type: nauc_precision_at_10_max
value: 26.077919168306853
- type: nauc_precision_at_10_std
value: 11.35352933466097
- type: nauc_precision_at_1_diff1
value: 60.18176977079093
- type: nauc_precision_at_1_max
value: 59.697615236642555
- type: nauc_precision_at_1_std
value: -9.396133077966779
- type: nauc_precision_at_20_diff1
value: -28.417606311068905
- type: nauc_precision_at_20_max
value: 23.958679828637692
- type: nauc_precision_at_20_std
value: 14.442021499194205
- type: nauc_precision_at_3_diff1
value: -8.127396049790482
- type: nauc_precision_at_3_max
value: 37.348067982957076
- type: nauc_precision_at_3_std
value: 4.747913619596849
- type: nauc_precision_at_5_diff1
value: -16.902418446058395
- type: nauc_precision_at_5_max
value: 32.73583852552014
- type: nauc_precision_at_5_std
value: 7.031446423850052
- type: nauc_recall_at_1000_diff1
value: -14.485978369112514
- type: nauc_recall_at_1000_max
value: 78.59123887333172
- type: nauc_recall_at_1000_std
value: 90.7384575424963
- type: nauc_recall_at_100_diff1
value: 41.47842281590715
- type: nauc_recall_at_100_max
value: 67.47271545727422
- type: nauc_recall_at_100_std
value: 14.555561992253999
- type: nauc_recall_at_10_diff1
value: 33.05308907973924
- type: nauc_recall_at_10_max
value: 45.49878918493155
- type: nauc_recall_at_10_std
value: -11.560069806810926
- type: nauc_recall_at_1_diff1
value: 61.89699552471587
- type: nauc_recall_at_1_max
value: 22.387748207421946
- type: nauc_recall_at_1_std
value: -17.139518194308437
- type: nauc_recall_at_20_diff1
value: 31.305721376453754
- type: nauc_recall_at_20_max
value: 51.24817763724019
- type: nauc_recall_at_20_std
value: -5.0809908162023145
- type: nauc_recall_at_3_diff1
value: 49.27109038342917
- type: nauc_recall_at_3_max
value: 37.69188317998447
- type: nauc_recall_at_3_std
value: -17.119900758664336
- type: nauc_recall_at_5_diff1
value: 42.74501803377967
- type: nauc_recall_at_5_max
value: 46.877008503354844
- type: nauc_recall_at_5_std
value: -15.704892082115975
- type: ndcg_at_1
value: 71.829
- type: ndcg_at_10
value: 77.581
- type: ndcg_at_100
value: 80.75
- type: ndcg_at_1000
value: 81.026
- type: ndcg_at_20
value: 79.092
- type: ndcg_at_3
value: 72.81
- type: ndcg_at_5
value: 74.22999999999999
- type: precision_at_1
value: 71.829
- type: precision_at_10
value: 17.717
- type: precision_at_100
value: 2.031
- type: precision_at_1000
value: 0.207
- type: precision_at_20
value: 9.399000000000001
- type: precision_at_3
value: 44.458999999999996
- type: precision_at_5
value: 31.535000000000004
- type: recall_at_1
value: 46.444
- type: recall_at_10
value: 86.275
- type: recall_at_100
value: 98.017
- type: recall_at_1000
value: 99.8
- type: recall_at_20
value: 90.935
- type: recall_at_3
value: 70.167
- type: recall_at_5
value: 78.2
---
<br><br>
<p align="center">
<img src="https://aeiljuispo.cloudimg.io/v7/https://cdn-uploads.huggingface.co/production/uploads/603763514de52ff951d89793/AFoybzd5lpBQXEBrQHuTt.png?w=200&h=200&f=face" alt="Finetuner logo: Finetuner helps you to create experiments in order to improve embeddings on search tasks. It accompanies you to deliver the last mile of performance-tuning for neural search applications." width="150px">
</p>
<p align="center">
<b>The embedding model trained by <a href="https://jina.ai/"><b>Jina AI</b></a>.</b>
</p>
<p align="center">
<b>jina-embeddings-v3: Multilingual Embeddings With Task LoRA</b>
</p>
## Quick Start
[Blog](https://jina.ai/news/jina-embeddings-v3-a-frontier-multilingual-embedding-model/#parameter-dimensions) | [Azure](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/jinaai.jina-embeddings-v3) | [AWS SageMaker](https://aws.amazon.com/marketplace/pp/prodview-kdi3xkt62lo32) | [API](https://jina.ai/embeddings)
## Intended Usage & Model Info
`jina-embeddings-v3` is a **multilingual multi-task text embedding model** designed for a variety of NLP applications.
Based on the [Jina-XLM-RoBERTa architecture](https://huggingface.co/jinaai/xlm-roberta-flash-implementation),
this model supports Rotary Position Embeddings to handle long input sequences up to **8192 tokens**.
Additionally, it features 5 LoRA adapters to generate task-specific embeddings efficiently.
### Key Features:
- **Extended Sequence Length:** Supports up to 8192 tokens with RoPE.
- **Task-Specific Embedding:** Customize embeddings through the `task` argument with the following options:
- `retrieval.query`: Used for query embeddings in asymmetric retrieval tasks
- `retrieval.passage`: Used for passage embeddings in asymmetric retrieval tasks
- `separation`: Used for embeddings in clustering and re-ranking applications
- `classification`: Used for embeddings in classification tasks
- `text-matching`: Used for embeddings in tasks that quantify similarity between two texts, such as STS or symmetric retrieval tasks
- **Matryoshka Embeddings**: Supports flexible embedding sizes (`32, 64, 128, 256, 512, 768, 1024`), allowing for truncating embeddings to fit your application.
### Supported Languages:
While the foundation model supports 100 languages, we've focused our tuning efforts on the following 30 languages:
**Arabic, Bengali, Chinese, Danish, Dutch, English, Finnish, French, Georgian, German, Greek,
Hindi, Indonesian, Italian, Japanese, Korean, Latvian, Norwegian, Polish, Portuguese, Romanian,
Russian, Slovak, Spanish, Swedish, Thai, Turkish, Ukrainian, Urdu,** and **Vietnamese.**
## Usage
**<details><summary>Apply mean pooling when integrating the model.</summary>**
<p>
### Why Use Mean Pooling?
Mean pooling takes all token embeddings from the model's output and averages them at the sentence or paragraph level.
This approach has been shown to produce high-quality sentence embeddings.
We provide an `encode` function that handles this for you automatically.
However, if you're working with the model directly, outside of the `encode` function,
you'll need to apply mean pooling manually. Here's how you can do it:
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0]
input_mask_expanded = (
attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
)
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(
input_mask_expanded.sum(1), min=1e-9
)
sentences = ["How is the weather today?", "What is the current weather like today?"]
tokenizer = AutoTokenizer.from_pretrained("jinaai/jina-embeddings-v3")
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
task = 'retrieval.query'
task_id = model._adaptation_map[task]
adapter_mask = torch.full((len(sentences),), task_id, dtype=torch.int32)
with torch.no_grad():
model_output = model(**encoded_input, adapter_mask=adapter_mask)
embeddings = mean_pooling(model_output, encoded_input["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)
```
</p>
</details>
The easiest way to start using `jina-embeddings-v3` is with the [Jina Embedding API](https://jina.ai/embeddings/).
Alternatively, you can use `jina-embeddings-v3` directly via the Transformers package:
```bash
!pip install transformers torch einops
!pip install 'numpy<2'
```
If you run the model on a GPU that supports [FlashAttention-2](https://github.com/Dao-AILab/flash-attention), you can additionally install `flash-attn` for faster inference. As of 2024-09-12, FlashAttention-2 supports Ampere, Ada, and Hopper GPUs (e.g., A100, RTX 3090, RTX 4090, H100):
```bash
!pip install flash-attn --no-build-isolation
```
```python
from transformers import AutoModel
# Initialize the model
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)
texts = [
"Follow the white rabbit.", # English
"Sigue al conejo blanco.", # Spanish
"Suis le lapin blanc.", # French
"跟着白兔走。", # Chinese
"اتبع الأرنب الأبيض.", # Arabic
"Folge dem weißen Kaninchen.", # German
]
# When calling the `encode` function, you can choose a `task` based on the use case:
# 'retrieval.query', 'retrieval.passage', 'separation', 'classification', 'text-matching'
# Alternatively, you can choose not to pass a `task`, and no specific LoRA adapter will be used.
embeddings = model.encode(texts, task="text-matching")
# Compute similarities
print(embeddings[0] @ embeddings[1].T)
```
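For asymmetric retrieval, encode queries and documents with their respective adapters. The snippet below is a minimal sketch that reuses the `model` loaded above; the query and document strings are made-up placeholders:
```python
# Hypothetical query/document pairs for illustration only.
queries = ["What is the weather like in Berlin today?"]
documents = ["Berlin will be mostly sunny with highs around 22°C."]

# Encode each side with the adapter intended for it.
query_embeddings = model.encode(queries, task="retrieval.query")
document_embeddings = model.encode(documents, task="retrieval.passage")

# Score documents against each query via dot product.
scores = query_embeddings @ document_embeddings.T
print(scores)
```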
By default, the model supports a maximum sequence length of 8192 tokens.
However, if you want to truncate your input texts to a shorter length, you can pass the `max_length` parameter to the `encode` function:
```python
embeddings = model.encode(["Very long ... document"], max_length=2048)
```
If you want to use **Matryoshka embeddings** with a smaller dimension,
pass the `truncate_dim` parameter to the `encode` function:
```python
embeddings = model.encode(['Sample text'], truncate_dim=256)
```
The latest version (3.1.0) of [SentenceTransformers](https://github.com/UKPLab/sentence-transformers) also supports `jina-embeddings-v3`:
```bash
!pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
task = "retrieval.query"
embeddings = model.encode(
["What is the weather like in Berlin today?"],
task=task,
prompt_name=task,
)
```
You can fine-tune `jina-embeddings-v3` using [SentenceTransformerTrainer](https://sbert.net/docs/package_reference/sentence_transformer/trainer.html).
To fine-tune for a specific task, you should set the task before passing the model to the ST Trainer, either during initialization:
```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'default_task': 'classification'})
```
Or afterwards:
```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
model[0].default_task = 'classification'
```
This way you can fine-tune the LoRA adapter for the chosen task.
However, if you want to fine-tune the entire model, make sure the main parameters are set as trainable when loading it:
```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'lora_main_params_trainable': True})
```
This will allow fine-tuning the whole model instead of just the LoRA adapters.
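As a rough sketch of what a task-specific fine-tuning run can look like with `SentenceTransformerTrainer` (the toy dataset, chosen adapter, output directory, and hyperparameters below are placeholders for illustration, not recommendations):
```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Load the model with the adapter you want to fine-tune as the default task.
model = SentenceTransformer(
    "jinaai/jina-embeddings-v3",
    trust_remote_code=True,
    model_kwargs={"default_task": "retrieval.query"},
)

# Toy (anchor, positive) pairs; replace with your own training data.
train_dataset = Dataset.from_dict({
    "anchor": ["What is the capital of France?", "How do I reset my password?"],
    "positive": ["Paris is the capital of France.", "Click 'Forgot password' on the login page."],
})

loss = MultipleNegativesRankingLoss(model)
args = SentenceTransformerTrainingArguments(output_dir="jina-v3-finetuned", num_train_epochs=1)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```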
**<details><summary>ONNX Inference.</summary>**
<p>
You can use ONNX for efficient inference with `jina-embeddings-v3`:
```python
import onnxruntime
import numpy as np
from transformers import AutoTokenizer, PretrainedConfig
# Load tokenizer and model config
tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v3')
config = PretrainedConfig.from_pretrained('jinaai/jina-embeddings-v3')
# Tokenize input
input_text = tokenizer('sample text', return_tensors='np')
# ONNX session
model_path = 'jina-embeddings-v3/onnx/model.onnx'
session = onnxruntime.InferenceSession(model_path)
# Prepare inputs for ONNX model
task_type = 'text-matching'
task_id = np.array(config.lora_adaptations.index(task_type), dtype=np.int64)
inputs = {
'input_ids': input_text['input_ids'],
'attention_mask': input_text['attention_mask'],
'task_id': task_id
}
# Run model
outputs = session.run(None, inputs)
```
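The raw ONNX outputs are token-level embeddings. If you need sentence embeddings, you can apply the same mean pooling and normalization as above in NumPy; a minimal sketch, assuming `outputs[0]` holds the token embeddings:
```python
# Mean-pool token embeddings (assumed to be outputs[0]) into a sentence embedding.
token_embeddings = outputs[0]
mask = np.expand_dims(input_text["attention_mask"], axis=-1).astype(np.float32)
embedding = (token_embeddings * mask).sum(axis=1) / np.clip(mask.sum(axis=1), a_min=1e-9, a_max=None)
# L2-normalize so that dot products correspond to cosine similarity.
embedding = embedding / np.linalg.norm(embedding, axis=1, keepdims=True)
```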
</p>
</details>
## Contact
Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
## License
`jina-embeddings-v3` is listed on AWS & Azure. If you need to use it beyond those platforms or on-premises within your company, note that the model is licensed under CC BY-NC 4.0. For commercial usage inquiries, feel free to [contact us](https://jina.ai/contact-sales/).
## Citation
If you find `jina-embeddings-v3` useful in your research, please cite the following paper:
```bibtex
@misc{sturua2024jinaembeddingsv3multilingualembeddingstask,
title={jina-embeddings-v3: Multilingual Embeddings With Task LoRA},
  author={Saba Sturua and Isabelle Mohr and Mohammad Kalim Akram and Michael Günther and Bo Wang and Markus Krimmel and Feng Wang and Georgios Mastrapas and Andreas Koukounas and Nan Wang and Han Xiao},
year={2024},
eprint={2409.10173},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2409.10173},
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
khoa-klaytn/bge-base-en-v1.5-angle | khoa-klaytn | feature-extraction | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"mteb",
"en",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-01-10T03:25:15 | 2024-01-10T03:25:20 | 745 | 2 | ---
language:
- en
license: mit
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- mteb
model-index:
- name: bge-base-en-v1.5
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.14925373134328
- type: ap
value: 39.32336517995478
- type: f1
value: 70.16902252611425
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 93.386825
- type: ap
value: 90.21276917991995
- type: f1
value: 93.37741030006174
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.846000000000004
- type: f1
value: 48.14646269778261
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.754000000000005
- type: map_at_10
value: 55.761
- type: map_at_100
value: 56.330999999999996
- type: map_at_1000
value: 56.333999999999996
- type: map_at_3
value: 51.92
- type: map_at_5
value: 54.010999999999996
- type: mrr_at_1
value: 41.181
- type: mrr_at_10
value: 55.967999999999996
- type: mrr_at_100
value: 56.538
- type: mrr_at_1000
value: 56.542
- type: mrr_at_3
value: 51.980000000000004
- type: mrr_at_5
value: 54.208999999999996
- type: ndcg_at_1
value: 40.754000000000005
- type: ndcg_at_10
value: 63.605000000000004
- type: ndcg_at_100
value: 66.05199999999999
- type: ndcg_at_1000
value: 66.12
- type: ndcg_at_3
value: 55.708
- type: ndcg_at_5
value: 59.452000000000005
- type: precision_at_1
value: 40.754000000000005
- type: precision_at_10
value: 8.841000000000001
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 22.238
- type: precision_at_5
value: 15.149000000000001
- type: recall_at_1
value: 40.754000000000005
- type: recall_at_10
value: 88.407
- type: recall_at_100
value: 99.14699999999999
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 66.714
- type: recall_at_5
value: 75.747
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.74884539679369
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 42.8075893810716
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.128470519187736
- type: mrr
value: 74.28065778481289
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 89.24629081484655
- type: cos_sim_spearman
value: 86.93752309911496
- type: euclidean_pearson
value: 87.58589628573816
- type: euclidean_spearman
value: 88.05622328825284
- type: manhattan_pearson
value: 87.5594959805773
- type: manhattan_spearman
value: 88.19658793233961
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 86.9512987012987
- type: f1
value: 86.92515357973708
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 39.10263762928872
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.69711517426737
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.327
- type: map_at_10
value: 44.099
- type: map_at_100
value: 45.525
- type: map_at_1000
value: 45.641999999999996
- type: map_at_3
value: 40.47
- type: map_at_5
value: 42.36
- type: mrr_at_1
value: 39.199
- type: mrr_at_10
value: 49.651
- type: mrr_at_100
value: 50.29
- type: mrr_at_1000
value: 50.329
- type: mrr_at_3
value: 46.924
- type: mrr_at_5
value: 48.548
- type: ndcg_at_1
value: 39.199
- type: ndcg_at_10
value: 50.773
- type: ndcg_at_100
value: 55.67999999999999
- type: ndcg_at_1000
value: 57.495
- type: ndcg_at_3
value: 45.513999999999996
- type: ndcg_at_5
value: 47.703
- type: precision_at_1
value: 39.199
- type: precision_at_10
value: 9.914000000000001
- type: precision_at_100
value: 1.5310000000000001
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 21.984
- type: precision_at_5
value: 15.737000000000002
- type: recall_at_1
value: 32.327
- type: recall_at_10
value: 63.743
- type: recall_at_100
value: 84.538
- type: recall_at_1000
value: 96.089
- type: recall_at_3
value: 48.065000000000005
- type: recall_at_5
value: 54.519
- type: map_at_1
value: 32.671
- type: map_at_10
value: 42.954
- type: map_at_100
value: 44.151
- type: map_at_1000
value: 44.287
- type: map_at_3
value: 39.912
- type: map_at_5
value: 41.798
- type: mrr_at_1
value: 41.465
- type: mrr_at_10
value: 49.351
- type: mrr_at_100
value: 49.980000000000004
- type: mrr_at_1000
value: 50.016000000000005
- type: mrr_at_3
value: 47.144000000000005
- type: mrr_at_5
value: 48.592999999999996
- type: ndcg_at_1
value: 41.465
- type: ndcg_at_10
value: 48.565999999999995
- type: ndcg_at_100
value: 52.76499999999999
- type: ndcg_at_1000
value: 54.749
- type: ndcg_at_3
value: 44.57
- type: ndcg_at_5
value: 46.759
- type: precision_at_1
value: 41.465
- type: precision_at_10
value: 9.107999999999999
- type: precision_at_100
value: 1.433
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 21.423000000000002
- type: precision_at_5
value: 15.414
- type: recall_at_1
value: 32.671
- type: recall_at_10
value: 57.738
- type: recall_at_100
value: 75.86500000000001
- type: recall_at_1000
value: 88.36
- type: recall_at_3
value: 45.626
- type: recall_at_5
value: 51.812000000000005
- type: map_at_1
value: 41.185
- type: map_at_10
value: 53.929
- type: map_at_100
value: 54.92
- type: map_at_1000
value: 54.967999999999996
- type: map_at_3
value: 50.70400000000001
- type: map_at_5
value: 52.673
- type: mrr_at_1
value: 47.398
- type: mrr_at_10
value: 57.303000000000004
- type: mrr_at_100
value: 57.959
- type: mrr_at_1000
value: 57.985
- type: mrr_at_3
value: 54.932
- type: mrr_at_5
value: 56.464999999999996
- type: ndcg_at_1
value: 47.398
- type: ndcg_at_10
value: 59.653
- type: ndcg_at_100
value: 63.627
- type: ndcg_at_1000
value: 64.596
- type: ndcg_at_3
value: 54.455
- type: ndcg_at_5
value: 57.245000000000005
- type: precision_at_1
value: 47.398
- type: precision_at_10
value: 9.524000000000001
- type: precision_at_100
value: 1.243
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 24.389
- type: precision_at_5
value: 16.752
- type: recall_at_1
value: 41.185
- type: recall_at_10
value: 73.193
- type: recall_at_100
value: 90.357
- type: recall_at_1000
value: 97.253
- type: recall_at_3
value: 59.199999999999996
- type: recall_at_5
value: 66.118
- type: map_at_1
value: 27.27
- type: map_at_10
value: 36.223
- type: map_at_100
value: 37.218
- type: map_at_1000
value: 37.293
- type: map_at_3
value: 33.503
- type: map_at_5
value: 35.097
- type: mrr_at_1
value: 29.492
- type: mrr_at_10
value: 38.352000000000004
- type: mrr_at_100
value: 39.188
- type: mrr_at_1000
value: 39.247
- type: mrr_at_3
value: 35.876000000000005
- type: mrr_at_5
value: 37.401
- type: ndcg_at_1
value: 29.492
- type: ndcg_at_10
value: 41.239
- type: ndcg_at_100
value: 46.066
- type: ndcg_at_1000
value: 47.992000000000004
- type: ndcg_at_3
value: 36.11
- type: ndcg_at_5
value: 38.772
- type: precision_at_1
value: 29.492
- type: precision_at_10
value: 6.260000000000001
- type: precision_at_100
value: 0.914
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 15.104000000000001
- type: precision_at_5
value: 10.644
- type: recall_at_1
value: 27.27
- type: recall_at_10
value: 54.589
- type: recall_at_100
value: 76.70700000000001
- type: recall_at_1000
value: 91.158
- type: recall_at_3
value: 40.974
- type: recall_at_5
value: 47.327000000000005
- type: map_at_1
value: 17.848
- type: map_at_10
value: 26.207
- type: map_at_100
value: 27.478
- type: map_at_1000
value: 27.602
- type: map_at_3
value: 23.405
- type: map_at_5
value: 24.98
- type: mrr_at_1
value: 21.891
- type: mrr_at_10
value: 31.041999999999998
- type: mrr_at_100
value: 32.092
- type: mrr_at_1000
value: 32.151999999999994
- type: mrr_at_3
value: 28.358
- type: mrr_at_5
value: 29.969
- type: ndcg_at_1
value: 21.891
- type: ndcg_at_10
value: 31.585
- type: ndcg_at_100
value: 37.531
- type: ndcg_at_1000
value: 40.256
- type: ndcg_at_3
value: 26.508
- type: ndcg_at_5
value: 28.894
- type: precision_at_1
value: 21.891
- type: precision_at_10
value: 5.795999999999999
- type: precision_at_100
value: 0.9990000000000001
- type: precision_at_1000
value: 0.13799999999999998
- type: precision_at_3
value: 12.769
- type: precision_at_5
value: 9.279
- type: recall_at_1
value: 17.848
- type: recall_at_10
value: 43.452
- type: recall_at_100
value: 69.216
- type: recall_at_1000
value: 88.102
- type: recall_at_3
value: 29.18
- type: recall_at_5
value: 35.347
- type: map_at_1
value: 30.94
- type: map_at_10
value: 41.248000000000005
- type: map_at_100
value: 42.495
- type: map_at_1000
value: 42.602000000000004
- type: map_at_3
value: 37.939
- type: map_at_5
value: 39.924
- type: mrr_at_1
value: 37.824999999999996
- type: mrr_at_10
value: 47.041
- type: mrr_at_100
value: 47.83
- type: mrr_at_1000
value: 47.878
- type: mrr_at_3
value: 44.466
- type: mrr_at_5
value: 46.111999999999995
- type: ndcg_at_1
value: 37.824999999999996
- type: ndcg_at_10
value: 47.223
- type: ndcg_at_100
value: 52.394
- type: ndcg_at_1000
value: 54.432
- type: ndcg_at_3
value: 42.032000000000004
- type: ndcg_at_5
value: 44.772
- type: precision_at_1
value: 37.824999999999996
- type: precision_at_10
value: 8.393
- type: precision_at_100
value: 1.2890000000000001
- type: precision_at_1000
value: 0.164
- type: precision_at_3
value: 19.698
- type: precision_at_5
value: 14.013
- type: recall_at_1
value: 30.94
- type: recall_at_10
value: 59.316
- type: recall_at_100
value: 80.783
- type: recall_at_1000
value: 94.15400000000001
- type: recall_at_3
value: 44.712
- type: recall_at_5
value: 51.932
- type: map_at_1
value: 27.104
- type: map_at_10
value: 36.675999999999995
- type: map_at_100
value: 38.076
- type: map_at_1000
value: 38.189
- type: map_at_3
value: 33.733999999999995
- type: map_at_5
value: 35.287
- type: mrr_at_1
value: 33.904
- type: mrr_at_10
value: 42.55
- type: mrr_at_100
value: 43.434
- type: mrr_at_1000
value: 43.494
- type: mrr_at_3
value: 40.126
- type: mrr_at_5
value: 41.473
- type: ndcg_at_1
value: 33.904
- type: ndcg_at_10
value: 42.414
- type: ndcg_at_100
value: 48.203
- type: ndcg_at_1000
value: 50.437
- type: ndcg_at_3
value: 37.633
- type: ndcg_at_5
value: 39.67
- type: precision_at_1
value: 33.904
- type: precision_at_10
value: 7.82
- type: precision_at_100
value: 1.2409999999999999
- type: precision_at_1000
value: 0.159
- type: precision_at_3
value: 17.884
- type: precision_at_5
value: 12.648000000000001
- type: recall_at_1
value: 27.104
- type: recall_at_10
value: 53.563
- type: recall_at_100
value: 78.557
- type: recall_at_1000
value: 93.533
- type: recall_at_3
value: 39.92
- type: recall_at_5
value: 45.457
- type: map_at_1
value: 27.707749999999997
- type: map_at_10
value: 36.961
- type: map_at_100
value: 38.158833333333334
- type: map_at_1000
value: 38.270333333333326
- type: map_at_3
value: 34.07183333333334
- type: map_at_5
value: 35.69533333333334
- type: mrr_at_1
value: 32.81875
- type: mrr_at_10
value: 41.293
- type: mrr_at_100
value: 42.116499999999995
- type: mrr_at_1000
value: 42.170249999999996
- type: mrr_at_3
value: 38.83983333333333
- type: mrr_at_5
value: 40.29775
- type: ndcg_at_1
value: 32.81875
- type: ndcg_at_10
value: 42.355
- type: ndcg_at_100
value: 47.41374999999999
- type: ndcg_at_1000
value: 49.5805
- type: ndcg_at_3
value: 37.52825
- type: ndcg_at_5
value: 39.83266666666667
- type: precision_at_1
value: 32.81875
- type: precision_at_10
value: 7.382416666666666
- type: precision_at_100
value: 1.1640833333333334
- type: precision_at_1000
value: 0.15383333333333335
- type: precision_at_3
value: 17.134166666666665
- type: precision_at_5
value: 12.174833333333336
- type: recall_at_1
value: 27.707749999999997
- type: recall_at_10
value: 53.945
- type: recall_at_100
value: 76.191
- type: recall_at_1000
value: 91.101
- type: recall_at_3
value: 40.39083333333334
- type: recall_at_5
value: 46.40083333333333
- type: map_at_1
value: 26.482
- type: map_at_10
value: 33.201
- type: map_at_100
value: 34.107
- type: map_at_1000
value: 34.197
- type: map_at_3
value: 31.174000000000003
- type: map_at_5
value: 32.279
- type: mrr_at_1
value: 29.908
- type: mrr_at_10
value: 36.235
- type: mrr_at_100
value: 37.04
- type: mrr_at_1000
value: 37.105
- type: mrr_at_3
value: 34.355999999999995
- type: mrr_at_5
value: 35.382999999999996
- type: ndcg_at_1
value: 29.908
- type: ndcg_at_10
value: 37.325
- type: ndcg_at_100
value: 41.795
- type: ndcg_at_1000
value: 44.105
- type: ndcg_at_3
value: 33.555
- type: ndcg_at_5
value: 35.266999999999996
- type: precision_at_1
value: 29.908
- type: precision_at_10
value: 5.721
- type: precision_at_100
value: 0.8630000000000001
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 14.008000000000001
- type: precision_at_5
value: 9.754999999999999
- type: recall_at_1
value: 26.482
- type: recall_at_10
value: 47.072
- type: recall_at_100
value: 67.27
- type: recall_at_1000
value: 84.371
- type: recall_at_3
value: 36.65
- type: recall_at_5
value: 40.774
- type: map_at_1
value: 18.815
- type: map_at_10
value: 26.369999999999997
- type: map_at_100
value: 27.458
- type: map_at_1000
value: 27.588
- type: map_at_3
value: 23.990000000000002
- type: map_at_5
value: 25.345000000000002
- type: mrr_at_1
value: 22.953000000000003
- type: mrr_at_10
value: 30.342999999999996
- type: mrr_at_100
value: 31.241000000000003
- type: mrr_at_1000
value: 31.319000000000003
- type: mrr_at_3
value: 28.16
- type: mrr_at_5
value: 29.406
- type: ndcg_at_1
value: 22.953000000000003
- type: ndcg_at_10
value: 31.151
- type: ndcg_at_100
value: 36.309000000000005
- type: ndcg_at_1000
value: 39.227000000000004
- type: ndcg_at_3
value: 26.921
- type: ndcg_at_5
value: 28.938000000000002
- type: precision_at_1
value: 22.953000000000003
- type: precision_at_10
value: 5.602
- type: precision_at_100
value: 0.9530000000000001
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 12.606
- type: precision_at_5
value: 9.119
- type: recall_at_1
value: 18.815
- type: recall_at_10
value: 41.574
- type: recall_at_100
value: 64.84400000000001
- type: recall_at_1000
value: 85.406
- type: recall_at_3
value: 29.694
- type: recall_at_5
value: 34.935
- type: map_at_1
value: 27.840999999999998
- type: map_at_10
value: 36.797999999999995
- type: map_at_100
value: 37.993
- type: map_at_1000
value: 38.086999999999996
- type: map_at_3
value: 34.050999999999995
- type: map_at_5
value: 35.379
- type: mrr_at_1
value: 32.649
- type: mrr_at_10
value: 41.025
- type: mrr_at_100
value: 41.878
- type: mrr_at_1000
value: 41.929
- type: mrr_at_3
value: 38.573
- type: mrr_at_5
value: 39.715
- type: ndcg_at_1
value: 32.649
- type: ndcg_at_10
value: 42.142
- type: ndcg_at_100
value: 47.558
- type: ndcg_at_1000
value: 49.643
- type: ndcg_at_3
value: 37.12
- type: ndcg_at_5
value: 38.983000000000004
- type: precision_at_1
value: 32.649
- type: precision_at_10
value: 7.08
- type: precision_at_100
value: 1.1039999999999999
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 16.698
- type: precision_at_5
value: 11.511000000000001
- type: recall_at_1
value: 27.840999999999998
- type: recall_at_10
value: 54.245
- type: recall_at_100
value: 77.947
- type: recall_at_1000
value: 92.36999999999999
- type: recall_at_3
value: 40.146
- type: recall_at_5
value: 44.951
- type: map_at_1
value: 26.529000000000003
- type: map_at_10
value: 35.010000000000005
- type: map_at_100
value: 36.647
- type: map_at_1000
value: 36.857
- type: map_at_3
value: 31.968000000000004
- type: map_at_5
value: 33.554
- type: mrr_at_1
value: 31.818
- type: mrr_at_10
value: 39.550999999999995
- type: mrr_at_100
value: 40.54
- type: mrr_at_1000
value: 40.596
- type: mrr_at_3
value: 36.726
- type: mrr_at_5
value: 38.416
- type: ndcg_at_1
value: 31.818
- type: ndcg_at_10
value: 40.675
- type: ndcg_at_100
value: 46.548
- type: ndcg_at_1000
value: 49.126
- type: ndcg_at_3
value: 35.829
- type: ndcg_at_5
value: 38.0
- type: precision_at_1
value: 31.818
- type: precision_at_10
value: 7.826
- type: precision_at_100
value: 1.538
- type: precision_at_1000
value: 0.24
- type: precision_at_3
value: 16.601
- type: precision_at_5
value: 12.095
- type: recall_at_1
value: 26.529000000000003
- type: recall_at_10
value: 51.03
- type: recall_at_100
value: 77.556
- type: recall_at_1000
value: 93.804
- type: recall_at_3
value: 36.986000000000004
- type: recall_at_5
value: 43.096000000000004
- type: map_at_1
value: 23.480999999999998
- type: map_at_10
value: 30.817
- type: map_at_100
value: 31.838
- type: map_at_1000
value: 31.932
- type: map_at_3
value: 28.011999999999997
- type: map_at_5
value: 29.668
- type: mrr_at_1
value: 25.323
- type: mrr_at_10
value: 33.072
- type: mrr_at_100
value: 33.926
- type: mrr_at_1000
value: 33.993
- type: mrr_at_3
value: 30.436999999999998
- type: mrr_at_5
value: 32.092
- type: ndcg_at_1
value: 25.323
- type: ndcg_at_10
value: 35.514
- type: ndcg_at_100
value: 40.489000000000004
- type: ndcg_at_1000
value: 42.908
- type: ndcg_at_3
value: 30.092000000000002
- type: ndcg_at_5
value: 32.989000000000004
- type: precision_at_1
value: 25.323
- type: precision_at_10
value: 5.545
- type: precision_at_100
value: 0.861
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 12.446
- type: precision_at_5
value: 9.131
- type: recall_at_1
value: 23.480999999999998
- type: recall_at_10
value: 47.825
- type: recall_at_100
value: 70.652
- type: recall_at_1000
value: 88.612
- type: recall_at_3
value: 33.537
- type: recall_at_5
value: 40.542
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 13.333999999999998
- type: map_at_10
value: 22.524
- type: map_at_100
value: 24.506
- type: map_at_1000
value: 24.715
- type: map_at_3
value: 19.022
- type: map_at_5
value: 20.693
- type: mrr_at_1
value: 29.186
- type: mrr_at_10
value: 41.22
- type: mrr_at_100
value: 42.16
- type: mrr_at_1000
value: 42.192
- type: mrr_at_3
value: 38.013000000000005
- type: mrr_at_5
value: 39.704
- type: ndcg_at_1
value: 29.186
- type: ndcg_at_10
value: 31.167
- type: ndcg_at_100
value: 38.879000000000005
- type: ndcg_at_1000
value: 42.376000000000005
- type: ndcg_at_3
value: 25.817
- type: ndcg_at_5
value: 27.377000000000002
- type: precision_at_1
value: 29.186
- type: precision_at_10
value: 9.693999999999999
- type: precision_at_100
value: 1.8030000000000002
- type: precision_at_1000
value: 0.246
- type: precision_at_3
value: 19.11
- type: precision_at_5
value: 14.344999999999999
- type: recall_at_1
value: 13.333999999999998
- type: recall_at_10
value: 37.092000000000006
- type: recall_at_100
value: 63.651
- type: recall_at_1000
value: 83.05
- type: recall_at_3
value: 23.74
- type: recall_at_5
value: 28.655
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.151
- type: map_at_10
value: 19.653000000000002
- type: map_at_100
value: 28.053
- type: map_at_1000
value: 29.709000000000003
- type: map_at_3
value: 14.191
- type: map_at_5
value: 16.456
- type: mrr_at_1
value: 66.25
- type: mrr_at_10
value: 74.4
- type: mrr_at_100
value: 74.715
- type: mrr_at_1000
value: 74.726
- type: mrr_at_3
value: 72.417
- type: mrr_at_5
value: 73.667
- type: ndcg_at_1
value: 54.25
- type: ndcg_at_10
value: 40.77
- type: ndcg_at_100
value: 46.359
- type: ndcg_at_1000
value: 54.193000000000005
- type: ndcg_at_3
value: 44.832
- type: ndcg_at_5
value: 42.63
- type: precision_at_1
value: 66.25
- type: precision_at_10
value: 32.175
- type: precision_at_100
value: 10.668
- type: precision_at_1000
value: 2.067
- type: precision_at_3
value: 47.667
- type: precision_at_5
value: 41.3
- type: recall_at_1
value: 9.151
- type: recall_at_10
value: 25.003999999999998
- type: recall_at_100
value: 52.976
- type: recall_at_1000
value: 78.315
- type: recall_at_3
value: 15.487
- type: recall_at_5
value: 18.999
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 51.89999999999999
- type: f1
value: 46.47777925067403
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 73.706
- type: map_at_10
value: 82.423
- type: map_at_100
value: 82.67999999999999
- type: map_at_1000
value: 82.694
- type: map_at_3
value: 81.328
- type: map_at_5
value: 82.001
- type: mrr_at_1
value: 79.613
- type: mrr_at_10
value: 87.07000000000001
- type: mrr_at_100
value: 87.169
- type: mrr_at_1000
value: 87.17
- type: mrr_at_3
value: 86.404
- type: mrr_at_5
value: 86.856
- type: ndcg_at_1
value: 79.613
- type: ndcg_at_10
value: 86.289
- type: ndcg_at_100
value: 87.201
- type: ndcg_at_1000
value: 87.428
- type: ndcg_at_3
value: 84.625
- type: ndcg_at_5
value: 85.53699999999999
- type: precision_at_1
value: 79.613
- type: precision_at_10
value: 10.399
- type: precision_at_100
value: 1.1079999999999999
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 32.473
- type: precision_at_5
value: 20.132
- type: recall_at_1
value: 73.706
- type: recall_at_10
value: 93.559
- type: recall_at_100
value: 97.188
- type: recall_at_1000
value: 98.555
- type: recall_at_3
value: 88.98700000000001
- type: recall_at_5
value: 91.373
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 19.841
- type: map_at_10
value: 32.643
- type: map_at_100
value: 34.575
- type: map_at_1000
value: 34.736
- type: map_at_3
value: 28.317999999999998
- type: map_at_5
value: 30.964000000000002
- type: mrr_at_1
value: 39.660000000000004
- type: mrr_at_10
value: 48.620000000000005
- type: mrr_at_100
value: 49.384
- type: mrr_at_1000
value: 49.415
- type: mrr_at_3
value: 45.988
- type: mrr_at_5
value: 47.361
- type: ndcg_at_1
value: 39.660000000000004
- type: ndcg_at_10
value: 40.646
- type: ndcg_at_100
value: 47.657
- type: ndcg_at_1000
value: 50.428
- type: ndcg_at_3
value: 36.689
- type: ndcg_at_5
value: 38.211
- type: precision_at_1
value: 39.660000000000004
- type: precision_at_10
value: 11.235000000000001
- type: precision_at_100
value: 1.8530000000000002
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 24.587999999999997
- type: precision_at_5
value: 18.395
- type: recall_at_1
value: 19.841
- type: recall_at_10
value: 48.135
- type: recall_at_100
value: 74.224
- type: recall_at_1000
value: 90.826
- type: recall_at_3
value: 33.536
- type: recall_at_5
value: 40.311
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 40.358
- type: map_at_10
value: 64.497
- type: map_at_100
value: 65.362
- type: map_at_1000
value: 65.41900000000001
- type: map_at_3
value: 61.06700000000001
- type: map_at_5
value: 63.317
- type: mrr_at_1
value: 80.716
- type: mrr_at_10
value: 86.10799999999999
- type: mrr_at_100
value: 86.265
- type: mrr_at_1000
value: 86.27
- type: mrr_at_3
value: 85.271
- type: mrr_at_5
value: 85.82499999999999
- type: ndcg_at_1
value: 80.716
- type: ndcg_at_10
value: 72.597
- type: ndcg_at_100
value: 75.549
- type: ndcg_at_1000
value: 76.61
- type: ndcg_at_3
value: 67.874
- type: ndcg_at_5
value: 70.655
- type: precision_at_1
value: 80.716
- type: precision_at_10
value: 15.148
- type: precision_at_100
value: 1.745
- type: precision_at_1000
value: 0.188
- type: precision_at_3
value: 43.597
- type: precision_at_5
value: 28.351
- type: recall_at_1
value: 40.358
- type: recall_at_10
value: 75.739
- type: recall_at_100
value: 87.259
- type: recall_at_1000
value: 94.234
- type: recall_at_3
value: 65.39500000000001
- type: recall_at_5
value: 70.878
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 90.80799999999998
- type: ap
value: 86.81350378180757
- type: f1
value: 90.79901248314215
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 22.096
- type: map_at_10
value: 34.384
- type: map_at_100
value: 35.541
- type: map_at_1000
value: 35.589999999999996
- type: map_at_3
value: 30.496000000000002
- type: map_at_5
value: 32.718
- type: mrr_at_1
value: 22.750999999999998
- type: mrr_at_10
value: 35.024
- type: mrr_at_100
value: 36.125
- type: mrr_at_1000
value: 36.168
- type: mrr_at_3
value: 31.225
- type: mrr_at_5
value: 33.416000000000004
- type: ndcg_at_1
value: 22.750999999999998
- type: ndcg_at_10
value: 41.351
- type: ndcg_at_100
value: 46.92
- type: ndcg_at_1000
value: 48.111
- type: ndcg_at_3
value: 33.439
- type: ndcg_at_5
value: 37.407000000000004
- type: precision_at_1
value: 22.750999999999998
- type: precision_at_10
value: 6.564
- type: precision_at_100
value: 0.935
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.288
- type: precision_at_5
value: 10.581999999999999
- type: recall_at_1
value: 22.096
- type: recall_at_10
value: 62.771
- type: recall_at_100
value: 88.529
- type: recall_at_1000
value: 97.55
- type: recall_at_3
value: 41.245
- type: recall_at_5
value: 50.788
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 94.16780665754673
- type: f1
value: 93.96331194859894
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.90606475148198
- type: f1
value: 58.58344986604187
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.14660390047075
- type: f1
value: 74.31533923533614
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 80.16139878950908
- type: f1
value: 80.18532656824924
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.949880906135085
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 31.56300351524862
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.196521894371315
- type: mrr
value: 32.22644231694389
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.783
- type: map_at_10
value: 14.549000000000001
- type: map_at_100
value: 18.433
- type: map_at_1000
value: 19.949
- type: map_at_3
value: 10.936
- type: map_at_5
value: 12.514
- type: mrr_at_1
value: 47.368
- type: mrr_at_10
value: 56.42
- type: mrr_at_100
value: 56.908
- type: mrr_at_1000
value: 56.95
- type: mrr_at_3
value: 54.283
- type: mrr_at_5
value: 55.568
- type: ndcg_at_1
value: 45.666000000000004
- type: ndcg_at_10
value: 37.389
- type: ndcg_at_100
value: 34.253
- type: ndcg_at_1000
value: 43.059999999999995
- type: ndcg_at_3
value: 42.725
- type: ndcg_at_5
value: 40.193
- type: precision_at_1
value: 47.368
- type: precision_at_10
value: 27.988000000000003
- type: precision_at_100
value: 8.672
- type: precision_at_1000
value: 2.164
- type: precision_at_3
value: 40.248
- type: precision_at_5
value: 34.737
- type: recall_at_1
value: 6.783
- type: recall_at_10
value: 17.838
- type: recall_at_100
value: 33.672000000000004
- type: recall_at_1000
value: 66.166
- type: recall_at_3
value: 11.849
- type: recall_at_5
value: 14.205000000000002
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.698999999999998
- type: map_at_10
value: 46.556
- type: map_at_100
value: 47.652
- type: map_at_1000
value: 47.68
- type: map_at_3
value: 42.492000000000004
- type: map_at_5
value: 44.763999999999996
- type: mrr_at_1
value: 35.747
- type: mrr_at_10
value: 49.242999999999995
- type: mrr_at_100
value: 50.052
- type: mrr_at_1000
value: 50.068
- type: mrr_at_3
value: 45.867000000000004
- type: mrr_at_5
value: 47.778999999999996
- type: ndcg_at_1
value: 35.717999999999996
- type: ndcg_at_10
value: 54.14600000000001
- type: ndcg_at_100
value: 58.672999999999995
- type: ndcg_at_1000
value: 59.279
- type: ndcg_at_3
value: 46.407
- type: ndcg_at_5
value: 50.181
- type: precision_at_1
value: 35.717999999999996
- type: precision_at_10
value: 8.844000000000001
- type: precision_at_100
value: 1.139
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 20.993000000000002
- type: precision_at_5
value: 14.791000000000002
- type: recall_at_1
value: 31.698999999999998
- type: recall_at_10
value: 74.693
- type: recall_at_100
value: 94.15299999999999
- type: recall_at_1000
value: 98.585
- type: recall_at_3
value: 54.388999999999996
- type: recall_at_5
value: 63.08200000000001
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.283
- type: map_at_10
value: 85.24000000000001
- type: map_at_100
value: 85.882
- type: map_at_1000
value: 85.897
- type: map_at_3
value: 82.326
- type: map_at_5
value: 84.177
- type: mrr_at_1
value: 82.21000000000001
- type: mrr_at_10
value: 88.228
- type: mrr_at_100
value: 88.32
- type: mrr_at_1000
value: 88.32
- type: mrr_at_3
value: 87.323
- type: mrr_at_5
value: 87.94800000000001
- type: ndcg_at_1
value: 82.17999999999999
- type: ndcg_at_10
value: 88.9
- type: ndcg_at_100
value: 90.079
- type: ndcg_at_1000
value: 90.158
- type: ndcg_at_3
value: 86.18299999999999
- type: ndcg_at_5
value: 87.71799999999999
- type: precision_at_1
value: 82.17999999999999
- type: precision_at_10
value: 13.464
- type: precision_at_100
value: 1.533
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.693
- type: precision_at_5
value: 24.792
- type: recall_at_1
value: 71.283
- type: recall_at_10
value: 95.742
- type: recall_at_100
value: 99.67200000000001
- type: recall_at_1000
value: 99.981
- type: recall_at_3
value: 87.888
- type: recall_at_5
value: 92.24
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.24267063669042
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 62.88056988932578
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.903
- type: map_at_10
value: 13.202
- type: map_at_100
value: 15.5
- type: map_at_1000
value: 15.870999999999999
- type: map_at_3
value: 9.407
- type: map_at_5
value: 11.238
- type: mrr_at_1
value: 24.2
- type: mrr_at_10
value: 35.867
- type: mrr_at_100
value: 37.001
- type: mrr_at_1000
value: 37.043
- type: mrr_at_3
value: 32.5
- type: mrr_at_5
value: 34.35
- type: ndcg_at_1
value: 24.2
- type: ndcg_at_10
value: 21.731
- type: ndcg_at_100
value: 30.7
- type: ndcg_at_1000
value: 36.618
- type: ndcg_at_3
value: 20.72
- type: ndcg_at_5
value: 17.954
- type: precision_at_1
value: 24.2
- type: precision_at_10
value: 11.33
- type: precision_at_100
value: 2.4410000000000003
- type: precision_at_1000
value: 0.386
- type: precision_at_3
value: 19.667
- type: precision_at_5
value: 15.86
- type: recall_at_1
value: 4.903
- type: recall_at_10
value: 22.962
- type: recall_at_100
value: 49.563
- type: recall_at_1000
value: 78.238
- type: recall_at_3
value: 11.953
- type: recall_at_5
value: 16.067999999999998
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.12694254604078
- type: cos_sim_spearman
value: 80.30141815181918
- type: euclidean_pearson
value: 81.34015449877128
- type: euclidean_spearman
value: 80.13984197010849
- type: manhattan_pearson
value: 81.31767068124086
- type: manhattan_spearman
value: 80.11720513114103
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 86.13112984010417
- type: cos_sim_spearman
value: 78.03063573402875
- type: euclidean_pearson
value: 83.51928418844804
- type: euclidean_spearman
value: 78.4045235411144
- type: manhattan_pearson
value: 83.49981637388689
- type: manhattan_spearman
value: 78.4042575139372
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 82.50327987379504
- type: cos_sim_spearman
value: 84.18556767756205
- type: euclidean_pearson
value: 82.69684424327679
- type: euclidean_spearman
value: 83.5368106038335
- type: manhattan_pearson
value: 82.57967581007374
- type: manhattan_spearman
value: 83.43009053133697
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.50756863007814
- type: cos_sim_spearman
value: 82.27204331279108
- type: euclidean_pearson
value: 81.39535251429741
- type: euclidean_spearman
value: 81.84386626336239
- type: manhattan_pearson
value: 81.34281737280695
- type: manhattan_spearman
value: 81.81149375673166
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.8727714856726
- type: cos_sim_spearman
value: 87.95738287792312
- type: euclidean_pearson
value: 86.62920602795887
- type: euclidean_spearman
value: 87.05207355381243
- type: manhattan_pearson
value: 86.53587918472225
- type: manhattan_spearman
value: 86.95382961029586
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.52240359769479
- type: cos_sim_spearman
value: 85.47685776238286
- type: euclidean_pearson
value: 84.25815333483058
- type: euclidean_spearman
value: 85.27415639683198
- type: manhattan_pearson
value: 84.29127757025637
- type: manhattan_spearman
value: 85.30226224917351
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 86.42501708915708
- type: cos_sim_spearman
value: 86.42276182795041
- type: euclidean_pearson
value: 86.5408207354761
- type: euclidean_spearman
value: 85.46096321750838
- type: manhattan_pearson
value: 86.54177303026881
- type: manhattan_spearman
value: 85.50313151916117
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 64.86521089250766
- type: cos_sim_spearman
value: 65.94868540323003
- type: euclidean_pearson
value: 67.16569626533084
- type: euclidean_spearman
value: 66.37667004134917
- type: manhattan_pearson
value: 67.1482365102333
- type: manhattan_spearman
value: 66.53240122580029
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.64746265365318
- type: cos_sim_spearman
value: 86.41888825906786
- type: euclidean_pearson
value: 85.27453642725811
- type: euclidean_spearman
value: 85.94095796602544
- type: manhattan_pearson
value: 85.28643660505334
- type: manhattan_spearman
value: 85.95028003260744
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.48903153618527
- type: mrr
value: 96.41081503826601
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 58.594
- type: map_at_10
value: 69.296
- type: map_at_100
value: 69.782
- type: map_at_1000
value: 69.795
- type: map_at_3
value: 66.23
- type: map_at_5
value: 68.293
- type: mrr_at_1
value: 61.667
- type: mrr_at_10
value: 70.339
- type: mrr_at_100
value: 70.708
- type: mrr_at_1000
value: 70.722
- type: mrr_at_3
value: 68.0
- type: mrr_at_5
value: 69.56700000000001
- type: ndcg_at_1
value: 61.667
- type: ndcg_at_10
value: 74.039
- type: ndcg_at_100
value: 76.103
- type: ndcg_at_1000
value: 76.47800000000001
- type: ndcg_at_3
value: 68.967
- type: ndcg_at_5
value: 71.96900000000001
- type: precision_at_1
value: 61.667
- type: precision_at_10
value: 9.866999999999999
- type: precision_at_100
value: 1.097
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 27.111
- type: precision_at_5
value: 18.2
- type: recall_at_1
value: 58.594
- type: recall_at_10
value: 87.422
- type: recall_at_100
value: 96.667
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 74.217
- type: recall_at_5
value: 81.539
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.85049504950496
- type: cos_sim_ap
value: 96.33111544137081
- type: cos_sim_f1
value: 92.35443037974684
- type: cos_sim_precision
value: 93.53846153846153
- type: cos_sim_recall
value: 91.2
- type: dot_accuracy
value: 99.82376237623762
- type: dot_ap
value: 95.38082527310888
- type: dot_f1
value: 90.90909090909092
- type: dot_precision
value: 92.90187891440502
- type: dot_recall
value: 89.0
- type: euclidean_accuracy
value: 99.84851485148515
- type: euclidean_ap
value: 96.32316003996347
- type: euclidean_f1
value: 92.2071392659628
- type: euclidean_precision
value: 92.71991911021233
- type: euclidean_recall
value: 91.7
- type: manhattan_accuracy
value: 99.84851485148515
- type: manhattan_ap
value: 96.3655668249217
- type: manhattan_f1
value: 92.18356026222895
- type: manhattan_precision
value: 92.98067141403867
- type: manhattan_recall
value: 91.4
- type: max_accuracy
value: 99.85049504950496
- type: max_ap
value: 96.3655668249217
- type: max_f1
value: 92.35443037974684
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 65.94861371629051
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.009430451385
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 54.61164066427969
- type: mrr
value: 55.49710603938544
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.622620124907662
- type: cos_sim_spearman
value: 31.0678351356163
- type: dot_pearson
value: 30.863727693306814
- type: dot_spearman
value: 31.230306567021255
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22
- type: map_at_10
value: 2.011
- type: map_at_100
value: 10.974
- type: map_at_1000
value: 25.819
- type: map_at_3
value: 0.6649999999999999
- type: map_at_5
value: 1.076
- type: mrr_at_1
value: 86.0
- type: mrr_at_10
value: 91.8
- type: mrr_at_100
value: 91.8
- type: mrr_at_1000
value: 91.8
- type: mrr_at_3
value: 91.0
- type: mrr_at_5
value: 91.8
- type: ndcg_at_1
value: 82.0
- type: ndcg_at_10
value: 78.07300000000001
- type: ndcg_at_100
value: 58.231
- type: ndcg_at_1000
value: 51.153000000000006
- type: ndcg_at_3
value: 81.123
- type: ndcg_at_5
value: 81.059
- type: precision_at_1
value: 86.0
- type: precision_at_10
value: 83.0
- type: precision_at_100
value: 59.38
- type: precision_at_1000
value: 22.55
- type: precision_at_3
value: 87.333
- type: precision_at_5
value: 86.8
- type: recall_at_1
value: 0.22
- type: recall_at_10
value: 2.2079999999999997
- type: recall_at_100
value: 14.069
- type: recall_at_1000
value: 47.678
- type: recall_at_3
value: 0.7040000000000001
- type: recall_at_5
value: 1.161
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.809
- type: map_at_10
value: 10.394
- type: map_at_100
value: 16.598
- type: map_at_1000
value: 18.142
- type: map_at_3
value: 5.572
- type: map_at_5
value: 7.1370000000000005
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 46.564
- type: mrr_at_100
value: 47.469
- type: mrr_at_1000
value: 47.469
- type: mrr_at_3
value: 42.177
- type: mrr_at_5
value: 44.524
- type: ndcg_at_1
value: 30.612000000000002
- type: ndcg_at_10
value: 25.701
- type: ndcg_at_100
value: 37.532
- type: ndcg_at_1000
value: 48.757
- type: ndcg_at_3
value: 28.199999999999996
- type: ndcg_at_5
value: 25.987
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 23.469
- type: precision_at_100
value: 7.9799999999999995
- type: precision_at_1000
value: 1.5350000000000001
- type: precision_at_3
value: 29.932
- type: precision_at_5
value: 26.122
- type: recall_at_1
value: 2.809
- type: recall_at_10
value: 16.887
- type: recall_at_100
value: 48.67
- type: recall_at_1000
value: 82.89699999999999
- type: recall_at_3
value: 6.521000000000001
- type: recall_at_5
value: 9.609
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.57860000000001
- type: ap
value: 13.82629211536393
- type: f1
value: 54.59860966183956
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 59.38030560271647
- type: f1
value: 59.69685552567865
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 51.4736717043405
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.92853311080646
- type: cos_sim_ap
value: 77.67872502591382
- type: cos_sim_f1
value: 70.33941236068895
- type: cos_sim_precision
value: 67.63273258645884
- type: cos_sim_recall
value: 73.27176781002639
- type: dot_accuracy
value: 85.79603027954938
- type: dot_ap
value: 73.73786190233379
- type: dot_f1
value: 67.3437901774235
- type: dot_precision
value: 65.67201604814443
- type: dot_recall
value: 69.10290237467018
- type: euclidean_accuracy
value: 86.94045419324074
- type: euclidean_ap
value: 77.6687791535167
- type: euclidean_f1
value: 70.47209214023542
- type: euclidean_precision
value: 67.7207492094381
- type: euclidean_recall
value: 73.45646437994723
- type: manhattan_accuracy
value: 86.87488823985218
- type: manhattan_ap
value: 77.63373392430728
- type: manhattan_f1
value: 70.40920716112532
- type: manhattan_precision
value: 68.31265508684864
- type: manhattan_recall
value: 72.63852242744063
- type: max_accuracy
value: 86.94045419324074
- type: max_ap
value: 77.67872502591382
- type: max_f1
value: 70.47209214023542
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.67155664221679
- type: cos_sim_ap
value: 85.64591703003417
- type: cos_sim_f1
value: 77.59531005352656
- type: cos_sim_precision
value: 73.60967184801382
- type: cos_sim_recall
value: 82.03726516784724
- type: dot_accuracy
value: 88.41541506578181
- type: dot_ap
value: 84.6482788957769
- type: dot_f1
value: 77.04748541466657
- type: dot_precision
value: 74.02440754931176
- type: dot_recall
value: 80.3279950723745
- type: euclidean_accuracy
value: 88.63080684596576
- type: euclidean_ap
value: 85.44570045321562
- type: euclidean_f1
value: 77.28769403336106
- type: euclidean_precision
value: 72.90600040958427
- type: euclidean_recall
value: 82.22975053895904
- type: manhattan_accuracy
value: 88.59393798269105
- type: manhattan_ap
value: 85.40271361038187
- type: manhattan_f1
value: 77.17606419344392
- type: manhattan_precision
value: 72.4447747078295
- type: manhattan_recall
value: 82.5685247921158
- type: max_accuracy
value: 88.67155664221679
- type: max_ap
value: 85.64591703003417
- type: max_f1
value: 77.59531005352656
---
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
</p>
</h4>
For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search.
It can also be used in vector databases for LLMs.
************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
- **update embedding model**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*` (short for BAAI General Embedding) models, which **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results.
All models have been uploaded to the Hugging Face Hub; you can find them at https://huggingface.co/BAAI.
If you cannot access the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity, and it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models with contrastive learning at a temperature of 0.01,
the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
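As a minimal sketch of such threshold-based filtering (the sentences and the 0.85 threshold below are placeholders; tune the threshold on the score distribution of your own data):
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
sentences = ["样例数据-1", "样例数据-2", "样例数据-3"]
# normalize_embeddings=True makes the dot product equal to cosine similarity
embeddings = model.encode(sentences, normalize_embeddings=True)
scores = embeddings @ embeddings.T

threshold = 0.85  # placeholder value; choose it from the similarity distribution on your data
similar_pairs = [(i, j) for i in range(len(sentences))
                 for j in range(i + 1, len(sentences))
                 if scores[i][j] >= threshold]
print(similar_pairs)
```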
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used.
Using no instruction causes only a slight degradation in retrieval performance compared with using one,
so for convenience you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add the instruction to these short queries.
**The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.**
In all cases, no instruction needs to be added to the documents/passages.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# For an s2p (short query to long passage) retrieval task, use encode_queries(), which automatically adds the instruction to each query.
# The corpus can still use encode() or encode_corpus(), since passages don't need the instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
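For illustration, a minimal sketch (the GPU ids are placeholders; the environment variable must be set before the model is created):
```python
import os

# Use only GPU 0 and GPU 1 for encoding (placeholder ids; adjust to your machine).
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
# os.environ["CUDA_VISIBLE_DEVICES"] = ""  # uncomment to make all GPUs unavailable (CPU only)

from FlagEmbedding import FlagModel
model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)
embeddings = model.encode(["样例数据-1", "样例数据-2"])
```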
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task,
each short query should start with an instruction (see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions),
but the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
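Continuing the snippet above, a short usage sketch (the query and documents are placeholder strings):
```python
# Embed a query and a few documents with the wrapper defined above.
query_embedding = model.embed_query("样例数据-1")
doc_embeddings = model.embed_documents(["样例文档-1", "样例文档-2"])
print(len(query_embedding), len(doc_embeddings))
```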
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: First, you pass your input through the transformer model, then you select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For an s2p (short query to long passage) retrieval task, add an instruction to each query (do not add an instruction to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
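Since the embeddings above are L2-normalized, cosine similarity reduces to a dot product; a minimal continuation of the snippet above:
```python
# Continuing the snippet above: with normalized embeddings, the inner product is the cosine similarity.
similarity = sentence_embeddings @ sentence_embeddings.T
print(similarity)
```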
### Usage for Reranker
Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using HuggingFace Transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    # Tokenize the (query, passage) pairs and score them; higher logits indicate higher relevance.
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
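Putting the pieces above together, here is a minimal retrieve-then-rerank sketch using the FlagEmbedding classes shown earlier. The corpus, query, model choices, and top-k values are illustrative placeholders, not part of the original examples:
```python
from FlagEmbedding import FlagModel, FlagReranker

# Illustrative toy corpus and query; replace with your own data.
corpus = ["The giant panda is a bear species endemic to China.",
          "Paris is the capital of France.",
          "Pandas mainly eat bamboo."]
query = "what is panda?"

# Step 1: retrieve candidates with the bi-encoder (embedding model).
embedder = FlagModel('BAAI/bge-large-en-v1.5',
                     query_instruction_for_retrieval="Represent this sentence for searching relevant passages: ",
                     use_fp16=True)
q_emb = embedder.encode_queries([query])
p_emb = embedder.encode(corpus)
scores = (q_emb @ p_emb.T)[0]
top_k = scores.argsort()[::-1][:2]  # keep the top-2 candidates (placeholder cutoff)

# Step 2: re-rank the candidates with the cross-encoder (reranker).
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)
rerank_scores = reranker.compute_score([[query, corpus[i]] for i in top_k])
reranked = sorted(zip(top_k, rerank_scores), key=lambda x: x[1], reverse=True)
print(reranked)
```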
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and then train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details of bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
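For orientation, the fine-tuning data used in the linked examples is, to our understanding, a JSONL file in which each line pairs a query with positive and (optionally hard-negative) passages. The snippet below writes one illustrative, placeholder line; the linked [fine-tune example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) remains the authoritative reference for the exact schema:
```python
import json

# One illustrative training example (placeholder values); see the fine-tune example for the exact schema.
example = {
    "query": "what is panda?",
    "pos": ["The giant panda is a bear species endemic to China."],
    "neg": ["Paris is the capital of France."],
}
with open("toy_finetune_data.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(example, ensure_ascii=False) + "\n")
```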
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).
## Contact
If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao ([email protected]) and Zheng Liu ([email protected]).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation.
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
| [
"SEMANTIC_SIMILARITY",
"SUMMARIZATION"
] | [
"BEAR",
"BIOSSES",
"SCIFACT"
] |
corto-ai/nomic-embed-text-v1 | corto-ai | sentence-similarity | [
"sentence-transformers",
"pytorch",
"onnx",
"safetensors",
"nomic_bert",
"feature-extraction",
"sentence-similarity",
"mteb",
"transformers",
"transformers.js",
"custom_code",
"en",
"arxiv:2402.01613",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-05-06T05:04:53 | 2024-05-06T05:18:56 | 740 | 2 | ---
language:
- en
library_name: sentence-transformers
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- feature-extraction
- sentence-similarity
- mteb
- transformers
- transformers.js
model-index:
- name: epoch_0_model
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.8507462686567
- type: ap
value: 40.592189159090495
- type: f1
value: 71.01634655512476
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 91.51892500000001
- type: ap
value: 88.50346762975335
- type: f1
value: 91.50342077459624
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.364
- type: f1
value: 46.72708080922794
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.178
- type: map_at_10
value: 40.244
- type: map_at_100
value: 41.321999999999996
- type: map_at_1000
value: 41.331
- type: map_at_3
value: 35.016999999999996
- type: map_at_5
value: 37.99
- type: mrr_at_1
value: 25.605
- type: mrr_at_10
value: 40.422000000000004
- type: mrr_at_100
value: 41.507
- type: mrr_at_1000
value: 41.516
- type: mrr_at_3
value: 35.23
- type: mrr_at_5
value: 38.15
- type: ndcg_at_1
value: 25.178
- type: ndcg_at_10
value: 49.258
- type: ndcg_at_100
value: 53.776
- type: ndcg_at_1000
value: 53.995000000000005
- type: ndcg_at_3
value: 38.429
- type: ndcg_at_5
value: 43.803
- type: precision_at_1
value: 25.178
- type: precision_at_10
value: 7.831
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.121
- type: precision_at_5
value: 12.29
- type: recall_at_1
value: 25.178
- type: recall_at_10
value: 78.307
- type: recall_at_100
value: 97.866
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 48.364000000000004
- type: recall_at_5
value: 61.451
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 45.93034494751465
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.64579480054327
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.601310529222054
- type: mrr
value: 75.04484896451656
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 88.57797718095814
- type: cos_sim_spearman
value: 86.47064499110101
- type: euclidean_pearson
value: 87.4559602783142
- type: euclidean_spearman
value: 86.47064499110101
- type: manhattan_pearson
value: 87.7232764230245
- type: manhattan_spearman
value: 86.91222131777742
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.5422077922078
- type: f1
value: 84.47657456950589
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.48953561974464
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 32.75995857510105
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.008000000000003
- type: map_at_10
value: 39.51
- type: map_at_100
value: 40.841
- type: map_at_1000
value: 40.973
- type: map_at_3
value: 36.248999999999995
- type: map_at_5
value: 38.096999999999994
- type: mrr_at_1
value: 36.481
- type: mrr_at_10
value: 44.818000000000005
- type: mrr_at_100
value: 45.64
- type: mrr_at_1000
value: 45.687
- type: mrr_at_3
value: 42.036
- type: mrr_at_5
value: 43.782
- type: ndcg_at_1
value: 36.481
- type: ndcg_at_10
value: 45.152
- type: ndcg_at_100
value: 50.449
- type: ndcg_at_1000
value: 52.76499999999999
- type: ndcg_at_3
value: 40.161
- type: ndcg_at_5
value: 42.577999999999996
- type: precision_at_1
value: 36.481
- type: precision_at_10
value: 8.369
- type: precision_at_100
value: 1.373
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 18.693
- type: precision_at_5
value: 13.533999999999999
- type: recall_at_1
value: 30.008000000000003
- type: recall_at_10
value: 56.108999999999995
- type: recall_at_100
value: 78.55499999999999
- type: recall_at_1000
value: 93.659
- type: recall_at_3
value: 41.754999999999995
- type: recall_at_5
value: 48.296
- type: map_at_1
value: 30.262
- type: map_at_10
value: 40.139
- type: map_at_100
value: 41.394
- type: map_at_1000
value: 41.526
- type: map_at_3
value: 37.155
- type: map_at_5
value: 38.785
- type: mrr_at_1
value: 38.153
- type: mrr_at_10
value: 46.369
- type: mrr_at_100
value: 47.072
- type: mrr_at_1000
value: 47.111999999999995
- type: mrr_at_3
value: 44.268
- type: mrr_at_5
value: 45.389
- type: ndcg_at_1
value: 38.153
- type: ndcg_at_10
value: 45.925
- type: ndcg_at_100
value: 50.394000000000005
- type: ndcg_at_1000
value: 52.37500000000001
- type: ndcg_at_3
value: 41.754000000000005
- type: ndcg_at_5
value: 43.574
- type: precision_at_1
value: 38.153
- type: precision_at_10
value: 8.796
- type: precision_at_100
value: 1.432
- type: precision_at_1000
value: 0.189
- type: precision_at_3
value: 20.318
- type: precision_at_5
value: 14.395
- type: recall_at_1
value: 30.262
- type: recall_at_10
value: 55.72200000000001
- type: recall_at_100
value: 74.97500000000001
- type: recall_at_1000
value: 87.342
- type: recall_at_3
value: 43.129
- type: recall_at_5
value: 48.336
- type: map_at_1
value: 39.951
- type: map_at_10
value: 51.248000000000005
- type: map_at_100
value: 52.188
- type: map_at_1000
value: 52.247
- type: map_at_3
value: 48.211
- type: map_at_5
value: 49.797000000000004
- type: mrr_at_1
value: 45.329
- type: mrr_at_10
value: 54.749
- type: mrr_at_100
value: 55.367999999999995
- type: mrr_at_1000
value: 55.400000000000006
- type: mrr_at_3
value: 52.382
- type: mrr_at_5
value: 53.649
- type: ndcg_at_1
value: 45.329
- type: ndcg_at_10
value: 56.847
- type: ndcg_at_100
value: 60.738
- type: ndcg_at_1000
value: 61.976
- type: ndcg_at_3
value: 51.59
- type: ndcg_at_5
value: 53.915
- type: precision_at_1
value: 45.329
- type: precision_at_10
value: 8.959
- type: precision_at_100
value: 1.187
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 22.612
- type: precision_at_5
value: 15.273
- type: recall_at_1
value: 39.951
- type: recall_at_10
value: 70.053
- type: recall_at_100
value: 86.996
- type: recall_at_1000
value: 95.707
- type: recall_at_3
value: 56.032000000000004
- type: recall_at_5
value: 61.629999999999995
- type: map_at_1
value: 25.566
- type: map_at_10
value: 33.207
- type: map_at_100
value: 34.166000000000004
- type: map_at_1000
value: 34.245
- type: map_at_3
value: 30.94
- type: map_at_5
value: 32.01
- type: mrr_at_1
value: 27.345000000000002
- type: mrr_at_10
value: 35.193000000000005
- type: mrr_at_100
value: 35.965
- type: mrr_at_1000
value: 36.028999999999996
- type: mrr_at_3
value: 32.806000000000004
- type: mrr_at_5
value: 34.021
- type: ndcg_at_1
value: 27.345000000000002
- type: ndcg_at_10
value: 37.891999999999996
- type: ndcg_at_100
value: 42.664
- type: ndcg_at_1000
value: 44.757000000000005
- type: ndcg_at_3
value: 33.123000000000005
- type: ndcg_at_5
value: 35.035
- type: precision_at_1
value: 27.345000000000002
- type: precision_at_10
value: 5.763
- type: precision_at_100
value: 0.859
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 13.71
- type: precision_at_5
value: 9.401
- type: recall_at_1
value: 25.566
- type: recall_at_10
value: 50.563
- type: recall_at_100
value: 72.86399999999999
- type: recall_at_1000
value: 88.68599999999999
- type: recall_at_3
value: 37.43
- type: recall_at_5
value: 41.894999999999996
- type: map_at_1
value: 16.663
- type: map_at_10
value: 23.552
- type: map_at_100
value: 24.538
- type: map_at_1000
value: 24.661
- type: map_at_3
value: 21.085
- type: map_at_5
value: 22.391
- type: mrr_at_1
value: 20.025000000000002
- type: mrr_at_10
value: 27.643
- type: mrr_at_100
value: 28.499999999999996
- type: mrr_at_1000
value: 28.582
- type: mrr_at_3
value: 25.083
- type: mrr_at_5
value: 26.544
- type: ndcg_at_1
value: 20.025000000000002
- type: ndcg_at_10
value: 28.272000000000002
- type: ndcg_at_100
value: 33.353
- type: ndcg_at_1000
value: 36.454
- type: ndcg_at_3
value: 23.579
- type: ndcg_at_5
value: 25.685000000000002
- type: precision_at_1
value: 20.025000000000002
- type: precision_at_10
value: 5.187
- type: precision_at_100
value: 0.897
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 10.987
- type: precision_at_5
value: 8.06
- type: recall_at_1
value: 16.663
- type: recall_at_10
value: 38.808
- type: recall_at_100
value: 61.305
- type: recall_at_1000
value: 83.571
- type: recall_at_3
value: 25.907999999999998
- type: recall_at_5
value: 31.214
- type: map_at_1
value: 27.695999999999998
- type: map_at_10
value: 37.018
- type: map_at_100
value: 38.263000000000005
- type: map_at_1000
value: 38.371
- type: map_at_3
value: 34.226
- type: map_at_5
value: 35.809999999999995
- type: mrr_at_1
value: 32.916000000000004
- type: mrr_at_10
value: 42.067
- type: mrr_at_100
value: 42.925000000000004
- type: mrr_at_1000
value: 42.978
- type: mrr_at_3
value: 39.637
- type: mrr_at_5
value: 41.134
- type: ndcg_at_1
value: 32.916000000000004
- type: ndcg_at_10
value: 42.539
- type: ndcg_at_100
value: 47.873
- type: ndcg_at_1000
value: 50.08200000000001
- type: ndcg_at_3
value: 37.852999999999994
- type: ndcg_at_5
value: 40.201
- type: precision_at_1
value: 32.916000000000004
- type: precision_at_10
value: 7.5840000000000005
- type: precision_at_100
value: 1.199
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 17.485
- type: precision_at_5
value: 12.512
- type: recall_at_1
value: 27.695999999999998
- type: recall_at_10
value: 53.638
- type: recall_at_100
value: 76.116
- type: recall_at_1000
value: 91.069
- type: recall_at_3
value: 41.13
- type: recall_at_5
value: 46.872
- type: map_at_1
value: 24.108
- type: map_at_10
value: 33.372
- type: map_at_100
value: 34.656
- type: map_at_1000
value: 34.768
- type: map_at_3
value: 30.830999999999996
- type: map_at_5
value: 32.204
- type: mrr_at_1
value: 29.110000000000003
- type: mrr_at_10
value: 37.979
- type: mrr_at_100
value: 38.933
- type: mrr_at_1000
value: 38.988
- type: mrr_at_3
value: 35.731
- type: mrr_at_5
value: 36.963
- type: ndcg_at_1
value: 29.110000000000003
- type: ndcg_at_10
value: 38.635000000000005
- type: ndcg_at_100
value: 44.324999999999996
- type: ndcg_at_1000
value: 46.747
- type: ndcg_at_3
value: 34.37
- type: ndcg_at_5
value: 36.228
- type: precision_at_1
value: 29.110000000000003
- type: precision_at_10
value: 6.963
- type: precision_at_100
value: 1.146
- type: precision_at_1000
value: 0.152
- type: precision_at_3
value: 16.400000000000002
- type: precision_at_5
value: 11.552999999999999
- type: recall_at_1
value: 24.108
- type: recall_at_10
value: 49.597
- type: recall_at_100
value: 73.88900000000001
- type: recall_at_1000
value: 90.62400000000001
- type: recall_at_3
value: 37.662
- type: recall_at_5
value: 42.565
- type: map_at_1
value: 25.00791666666667
- type: map_at_10
value: 33.287749999999996
- type: map_at_100
value: 34.41141666666667
- type: map_at_1000
value: 34.52583333333333
- type: map_at_3
value: 30.734416666666668
- type: map_at_5
value: 32.137166666666666
- type: mrr_at_1
value: 29.305666666666664
- type: mrr_at_10
value: 37.22966666666666
- type: mrr_at_100
value: 38.066583333333334
- type: mrr_at_1000
value: 38.12616666666667
- type: mrr_at_3
value: 34.92275
- type: mrr_at_5
value: 36.23333333333334
- type: ndcg_at_1
value: 29.305666666666664
- type: ndcg_at_10
value: 38.25533333333333
- type: ndcg_at_100
value: 43.25266666666666
- type: ndcg_at_1000
value: 45.63583333333334
- type: ndcg_at_3
value: 33.777166666666666
- type: ndcg_at_5
value: 35.85
- type: precision_at_1
value: 29.305666666666664
- type: precision_at_10
value: 6.596416666666667
- type: precision_at_100
value: 1.0784166666666668
- type: precision_at_1000
value: 0.14666666666666664
- type: precision_at_3
value: 15.31075
- type: precision_at_5
value: 10.830916666666667
- type: recall_at_1
value: 25.00791666666667
- type: recall_at_10
value: 49.10933333333333
- type: recall_at_100
value: 71.09216666666667
- type: recall_at_1000
value: 87.77725000000001
- type: recall_at_3
value: 36.660916666666665
- type: recall_at_5
value: 41.94149999999999
- type: map_at_1
value: 23.521
- type: map_at_10
value: 30.043
- type: map_at_100
value: 30.936000000000003
- type: map_at_1000
value: 31.022
- type: map_at_3
value: 27.926000000000002
- type: map_at_5
value: 29.076999999999998
- type: mrr_at_1
value: 26.227
- type: mrr_at_10
value: 32.822
- type: mrr_at_100
value: 33.61
- type: mrr_at_1000
value: 33.672000000000004
- type: mrr_at_3
value: 30.776999999999997
- type: mrr_at_5
value: 31.866
- type: ndcg_at_1
value: 26.227
- type: ndcg_at_10
value: 34.041
- type: ndcg_at_100
value: 38.394
- type: ndcg_at_1000
value: 40.732
- type: ndcg_at_3
value: 30.037999999999997
- type: ndcg_at_5
value: 31.845000000000002
- type: precision_at_1
value: 26.227
- type: precision_at_10
value: 5.244999999999999
- type: precision_at_100
value: 0.808
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 12.679000000000002
- type: precision_at_5
value: 8.773
- type: recall_at_1
value: 23.521
- type: recall_at_10
value: 43.633
- type: recall_at_100
value: 63.126000000000005
- type: recall_at_1000
value: 80.765
- type: recall_at_3
value: 32.614
- type: recall_at_5
value: 37.15
- type: map_at_1
value: 16.236
- type: map_at_10
value: 22.898
- type: map_at_100
value: 23.878
- type: map_at_1000
value: 24.009
- type: map_at_3
value: 20.87
- type: map_at_5
value: 22.025
- type: mrr_at_1
value: 19.339000000000002
- type: mrr_at_10
value: 26.382
- type: mrr_at_100
value: 27.245
- type: mrr_at_1000
value: 27.33
- type: mrr_at_3
value: 24.386
- type: mrr_at_5
value: 25.496000000000002
- type: ndcg_at_1
value: 19.339000000000002
- type: ndcg_at_10
value: 27.139999999999997
- type: ndcg_at_100
value: 31.944
- type: ndcg_at_1000
value: 35.077999999999996
- type: ndcg_at_3
value: 23.424
- type: ndcg_at_5
value: 25.188
- type: precision_at_1
value: 19.339000000000002
- type: precision_at_10
value: 4.8309999999999995
- type: precision_at_100
value: 0.845
- type: precision_at_1000
value: 0.128
- type: precision_at_3
value: 10.874
- type: precision_at_5
value: 7.825
- type: recall_at_1
value: 16.236
- type: recall_at_10
value: 36.513
- type: recall_at_100
value: 57.999
- type: recall_at_1000
value: 80.512
- type: recall_at_3
value: 26.179999999999996
- type: recall_at_5
value: 30.712
- type: map_at_1
value: 24.11
- type: map_at_10
value: 31.566
- type: map_at_100
value: 32.647
- type: map_at_1000
value: 32.753
- type: map_at_3
value: 29.24
- type: map_at_5
value: 30.564999999999998
- type: mrr_at_1
value: 28.265
- type: mrr_at_10
value: 35.504000000000005
- type: mrr_at_100
value: 36.436
- type: mrr_at_1000
value: 36.503
- type: mrr_at_3
value: 33.349000000000004
- type: mrr_at_5
value: 34.622
- type: ndcg_at_1
value: 28.265
- type: ndcg_at_10
value: 36.192
- type: ndcg_at_100
value: 41.388000000000005
- type: ndcg_at_1000
value: 43.948
- type: ndcg_at_3
value: 31.959
- type: ndcg_at_5
value: 33.998
- type: precision_at_1
value: 28.265
- type: precision_at_10
value: 5.989
- type: precision_at_100
value: 0.9650000000000001
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 14.335
- type: precision_at_5
value: 10.112
- type: recall_at_1
value: 24.11
- type: recall_at_10
value: 46.418
- type: recall_at_100
value: 69.314
- type: recall_at_1000
value: 87.397
- type: recall_at_3
value: 34.724
- type: recall_at_5
value: 39.925
- type: map_at_1
value: 22.091
- type: map_at_10
value: 29.948999999999998
- type: map_at_100
value: 31.502000000000002
- type: map_at_1000
value: 31.713
- type: map_at_3
value: 27.464
- type: map_at_5
value: 28.968
- type: mrr_at_1
value: 26.482
- type: mrr_at_10
value: 34.009
- type: mrr_at_100
value: 35.081
- type: mrr_at_1000
value: 35.138000000000005
- type: mrr_at_3
value: 31.785000000000004
- type: mrr_at_5
value: 33.178999999999995
- type: ndcg_at_1
value: 26.482
- type: ndcg_at_10
value: 35.008
- type: ndcg_at_100
value: 41.272999999999996
- type: ndcg_at_1000
value: 43.972
- type: ndcg_at_3
value: 30.804
- type: ndcg_at_5
value: 33.046
- type: precision_at_1
value: 26.482
- type: precision_at_10
value: 6.462
- type: precision_at_100
value: 1.431
- type: precision_at_1000
value: 0.22899999999999998
- type: precision_at_3
value: 14.360999999999999
- type: precision_at_5
value: 10.474
- type: recall_at_1
value: 22.091
- type: recall_at_10
value: 45.125
- type: recall_at_100
value: 72.313
- type: recall_at_1000
value: 89.503
- type: recall_at_3
value: 33.158
- type: recall_at_5
value: 39.086999999999996
- type: map_at_1
value: 19.883
- type: map_at_10
value: 26.951000000000004
- type: map_at_100
value: 27.927999999999997
- type: map_at_1000
value: 28.022000000000002
- type: map_at_3
value: 24.616
- type: map_at_5
value: 25.917
- type: mrr_at_1
value: 21.996
- type: mrr_at_10
value: 29.221000000000004
- type: mrr_at_100
value: 30.024
- type: mrr_at_1000
value: 30.095
- type: mrr_at_3
value: 26.833000000000002
- type: mrr_at_5
value: 28.155
- type: ndcg_at_1
value: 21.996
- type: ndcg_at_10
value: 31.421
- type: ndcg_at_100
value: 36.237
- type: ndcg_at_1000
value: 38.744
- type: ndcg_at_3
value: 26.671
- type: ndcg_at_5
value: 28.907
- type: precision_at_1
value: 21.996
- type: precision_at_10
value: 5.009
- type: precision_at_100
value: 0.799
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 11.275
- type: precision_at_5
value: 8.059
- type: recall_at_1
value: 19.883
- type: recall_at_10
value: 43.132999999999996
- type: recall_at_100
value: 65.654
- type: recall_at_1000
value: 84.492
- type: recall_at_3
value: 30.209000000000003
- type: recall_at_5
value: 35.616
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.756
- type: map_at_10
value: 30.378
- type: map_at_100
value: 32.537
- type: map_at_1000
value: 32.717
- type: map_at_3
value: 25.599
- type: map_at_5
value: 28.372999999999998
- type: mrr_at_1
value: 41.303
- type: mrr_at_10
value: 53.483999999999995
- type: mrr_at_100
value: 54.106
- type: mrr_at_1000
value: 54.127
- type: mrr_at_3
value: 50.315
- type: mrr_at_5
value: 52.396
- type: ndcg_at_1
value: 41.303
- type: ndcg_at_10
value: 40.503
- type: ndcg_at_100
value: 47.821000000000005
- type: ndcg_at_1000
value: 50.788
- type: ndcg_at_3
value: 34.364
- type: ndcg_at_5
value: 36.818
- type: precision_at_1
value: 41.303
- type: precision_at_10
value: 12.463000000000001
- type: precision_at_100
value: 2.037
- type: precision_at_1000
value: 0.26
- type: precision_at_3
value: 25.798
- type: precision_at_5
value: 19.896
- type: recall_at_1
value: 17.756
- type: recall_at_10
value: 46.102
- type: recall_at_100
value: 70.819
- type: recall_at_1000
value: 87.21799999999999
- type: recall_at_3
value: 30.646
- type: recall_at_5
value: 38.022
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.033
- type: map_at_10
value: 20.584
- type: map_at_100
value: 29.518
- type: map_at_1000
value: 31.186000000000003
- type: map_at_3
value: 14.468
- type: map_at_5
value: 17.177
- type: mrr_at_1
value: 69.75
- type: mrr_at_10
value: 77.025
- type: mrr_at_100
value: 77.36699999999999
- type: mrr_at_1000
value: 77.373
- type: mrr_at_3
value: 75.583
- type: mrr_at_5
value: 76.396
- type: ndcg_at_1
value: 58.5
- type: ndcg_at_10
value: 45.033
- type: ndcg_at_100
value: 49.071
- type: ndcg_at_1000
value: 56.056
- type: ndcg_at_3
value: 49.936
- type: ndcg_at_5
value: 47.471999999999994
- type: precision_at_1
value: 69.75
- type: precision_at_10
value: 35.775
- type: precision_at_100
value: 11.594999999999999
- type: precision_at_1000
value: 2.062
- type: precision_at_3
value: 52.5
- type: precision_at_5
value: 45.300000000000004
- type: recall_at_1
value: 9.033
- type: recall_at_10
value: 26.596999999999998
- type: recall_at_100
value: 54.607000000000006
- type: recall_at_1000
value: 76.961
- type: recall_at_3
value: 15.754999999999999
- type: recall_at_5
value: 20.033
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.345000000000006
- type: f1
value: 43.4514918068706
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.29100000000001
- type: map_at_10
value: 81.059
- type: map_at_100
value: 81.341
- type: map_at_1000
value: 81.355
- type: map_at_3
value: 79.74799999999999
- type: map_at_5
value: 80.612
- type: mrr_at_1
value: 76.40299999999999
- type: mrr_at_10
value: 84.615
- type: mrr_at_100
value: 84.745
- type: mrr_at_1000
value: 84.748
- type: mrr_at_3
value: 83.776
- type: mrr_at_5
value: 84.343
- type: ndcg_at_1
value: 76.40299999999999
- type: ndcg_at_10
value: 84.981
- type: ndcg_at_100
value: 86.00999999999999
- type: ndcg_at_1000
value: 86.252
- type: ndcg_at_3
value: 82.97
- type: ndcg_at_5
value: 84.152
- type: precision_at_1
value: 76.40299999999999
- type: precision_at_10
value: 10.446
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 32.147999999999996
- type: precision_at_5
value: 20.135
- type: recall_at_1
value: 71.29100000000001
- type: recall_at_10
value: 93.232
- type: recall_at_100
value: 97.363
- type: recall_at_1000
value: 98.905
- type: recall_at_3
value: 87.893
- type: recall_at_5
value: 90.804
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.667
- type: map_at_10
value: 30.853
- type: map_at_100
value: 32.494
- type: map_at_1000
value: 32.677
- type: map_at_3
value: 26.91
- type: map_at_5
value: 29.099000000000004
- type: mrr_at_1
value: 37.191
- type: mrr_at_10
value: 46.171
- type: mrr_at_100
value: 47.056
- type: mrr_at_1000
value: 47.099000000000004
- type: mrr_at_3
value: 44.059
- type: mrr_at_5
value: 45.147
- type: ndcg_at_1
value: 37.191
- type: ndcg_at_10
value: 38.437
- type: ndcg_at_100
value: 44.62
- type: ndcg_at_1000
value: 47.795
- type: ndcg_at_3
value: 35.003
- type: ndcg_at_5
value: 36.006
- type: precision_at_1
value: 37.191
- type: precision_at_10
value: 10.586
- type: precision_at_100
value: 1.688
- type: precision_at_1000
value: 0.22699999999999998
- type: precision_at_3
value: 23.302
- type: precision_at_5
value: 17.006
- type: recall_at_1
value: 18.667
- type: recall_at_10
value: 45.367000000000004
- type: recall_at_100
value: 68.207
- type: recall_at_1000
value: 87.072
- type: recall_at_3
value: 32.129000000000005
- type: recall_at_5
value: 37.719
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.494
- type: map_at_10
value: 66.223
- type: map_at_100
value: 67.062
- type: map_at_1000
value: 67.11500000000001
- type: map_at_3
value: 62.867
- type: map_at_5
value: 64.994
- type: mrr_at_1
value: 78.987
- type: mrr_at_10
value: 84.585
- type: mrr_at_100
value: 84.773
- type: mrr_at_1000
value: 84.77900000000001
- type: mrr_at_3
value: 83.592
- type: mrr_at_5
value: 84.235
- type: ndcg_at_1
value: 78.987
- type: ndcg_at_10
value: 73.64
- type: ndcg_at_100
value: 76.519
- type: ndcg_at_1000
value: 77.51
- type: ndcg_at_3
value: 68.893
- type: ndcg_at_5
value: 71.585
- type: precision_at_1
value: 78.987
- type: precision_at_10
value: 15.529000000000002
- type: precision_at_100
value: 1.7770000000000001
- type: precision_at_1000
value: 0.191
- type: precision_at_3
value: 44.808
- type: precision_at_5
value: 29.006999999999998
- type: recall_at_1
value: 39.494
- type: recall_at_10
value: 77.643
- type: recall_at_100
value: 88.825
- type: recall_at_1000
value: 95.321
- type: recall_at_3
value: 67.211
- type: recall_at_5
value: 72.519
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 85.55959999999999
- type: ap
value: 80.7246500384617
- type: f1
value: 85.52336485065454
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 23.631
- type: map_at_10
value: 36.264
- type: map_at_100
value: 37.428
- type: map_at_1000
value: 37.472
- type: map_at_3
value: 32.537
- type: map_at_5
value: 34.746
- type: mrr_at_1
value: 24.312
- type: mrr_at_10
value: 36.858000000000004
- type: mrr_at_100
value: 37.966
- type: mrr_at_1000
value: 38.004
- type: mrr_at_3
value: 33.188
- type: mrr_at_5
value: 35.367
- type: ndcg_at_1
value: 24.312
- type: ndcg_at_10
value: 43.126999999999995
- type: ndcg_at_100
value: 48.642
- type: ndcg_at_1000
value: 49.741
- type: ndcg_at_3
value: 35.589
- type: ndcg_at_5
value: 39.515
- type: precision_at_1
value: 24.312
- type: precision_at_10
value: 6.699
- type: precision_at_100
value: 0.9450000000000001
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 15.153
- type: precision_at_5
value: 11.065999999999999
- type: recall_at_1
value: 23.631
- type: recall_at_10
value: 64.145
- type: recall_at_100
value: 89.41
- type: recall_at_1000
value: 97.83500000000001
- type: recall_at_3
value: 43.769000000000005
- type: recall_at_5
value: 53.169
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.4108527131783
- type: f1
value: 93.1415880261038
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 77.24806201550388
- type: f1
value: 60.531916308197175
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.71553463349024
- type: f1
value: 71.70753174900791
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.79757901815736
- type: f1
value: 77.83719850433258
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.74193296622113
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.64257594108566
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.811018518883625
- type: mrr
value: 31.910376577445003
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.409
- type: map_at_10
value: 13.093
- type: map_at_100
value: 16.256999999999998
- type: map_at_1000
value: 17.617
- type: map_at_3
value: 9.555
- type: map_at_5
value: 11.428
- type: mrr_at_1
value: 45.201
- type: mrr_at_10
value: 54.179
- type: mrr_at_100
value: 54.812000000000005
- type: mrr_at_1000
value: 54.840999999999994
- type: mrr_at_3
value: 51.909000000000006
- type: mrr_at_5
value: 53.519000000000005
- type: ndcg_at_1
value: 43.189
- type: ndcg_at_10
value: 35.028
- type: ndcg_at_100
value: 31.226
- type: ndcg_at_1000
value: 39.678000000000004
- type: ndcg_at_3
value: 40.596
- type: ndcg_at_5
value: 38.75
- type: precision_at_1
value: 44.582
- type: precision_at_10
value: 25.974999999999998
- type: precision_at_100
value: 7.793
- type: precision_at_1000
value: 2.036
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 33.994
- type: recall_at_1
value: 5.409
- type: recall_at_10
value: 16.875999999999998
- type: recall_at_100
value: 30.316
- type: recall_at_1000
value: 60.891
- type: recall_at_3
value: 10.688
- type: recall_at_5
value: 13.832
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.375
- type: map_at_10
value: 51.991
- type: map_at_100
value: 52.91400000000001
- type: map_at_1000
value: 52.93600000000001
- type: map_at_3
value: 48.014
- type: map_at_5
value: 50.381
- type: mrr_at_1
value: 40.759
- type: mrr_at_10
value: 54.617000000000004
- type: mrr_at_100
value: 55.301
- type: mrr_at_1000
value: 55.315000000000005
- type: mrr_at_3
value: 51.516
- type: mrr_at_5
value: 53.435
- type: ndcg_at_1
value: 40.759
- type: ndcg_at_10
value: 59.384
- type: ndcg_at_100
value: 63.157
- type: ndcg_at_1000
value: 63.654999999999994
- type: ndcg_at_3
value: 52.114000000000004
- type: ndcg_at_5
value: 55.986000000000004
- type: precision_at_1
value: 40.759
- type: precision_at_10
value: 9.411999999999999
- type: precision_at_100
value: 1.153
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 23.329
- type: precision_at_5
value: 16.256999999999998
- type: recall_at_1
value: 36.375
- type: recall_at_10
value: 79.053
- type: recall_at_100
value: 95.167
- type: recall_at_1000
value: 98.82
- type: recall_at_3
value: 60.475
- type: recall_at_5
value: 69.327
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.256
- type: map_at_10
value: 83.8
- type: map_at_100
value: 84.425
- type: map_at_1000
value: 84.444
- type: map_at_3
value: 80.906
- type: map_at_5
value: 82.717
- type: mrr_at_1
value: 80.97999999999999
- type: mrr_at_10
value: 87.161
- type: mrr_at_100
value: 87.262
- type: mrr_at_1000
value: 87.263
- type: mrr_at_3
value: 86.175
- type: mrr_at_5
value: 86.848
- type: ndcg_at_1
value: 80.97999999999999
- type: ndcg_at_10
value: 87.697
- type: ndcg_at_100
value: 88.959
- type: ndcg_at_1000
value: 89.09899999999999
- type: ndcg_at_3
value: 84.83800000000001
- type: ndcg_at_5
value: 86.401
- type: precision_at_1
value: 80.97999999999999
- type: precision_at_10
value: 13.261000000000001
- type: precision_at_100
value: 1.5150000000000001
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 37.01
- type: precision_at_5
value: 24.298000000000002
- type: recall_at_1
value: 70.256
- type: recall_at_10
value: 94.935
- type: recall_at_100
value: 99.274
- type: recall_at_1000
value: 99.928
- type: recall_at_3
value: 86.602
- type: recall_at_5
value: 91.133
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.322692497613104
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 61.895813503775074
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.338
- type: map_at_10
value: 10.767
- type: map_at_100
value: 12.537999999999998
- type: map_at_1000
value: 12.803999999999998
- type: map_at_3
value: 7.788
- type: map_at_5
value: 9.302000000000001
- type: mrr_at_1
value: 21.4
- type: mrr_at_10
value: 31.637999999999998
- type: mrr_at_100
value: 32.688
- type: mrr_at_1000
value: 32.756
- type: mrr_at_3
value: 28.433000000000003
- type: mrr_at_5
value: 30.178
- type: ndcg_at_1
value: 21.4
- type: ndcg_at_10
value: 18.293
- type: ndcg_at_100
value: 25.274
- type: ndcg_at_1000
value: 30.284
- type: ndcg_at_3
value: 17.391000000000002
- type: ndcg_at_5
value: 15.146999999999998
- type: precision_at_1
value: 21.4
- type: precision_at_10
value: 9.48
- type: precision_at_100
value: 1.949
- type: precision_at_1000
value: 0.316
- type: precision_at_3
value: 16.167
- type: precision_at_5
value: 13.22
- type: recall_at_1
value: 4.338
- type: recall_at_10
value: 19.213
- type: recall_at_100
value: 39.562999999999995
- type: recall_at_1000
value: 64.08
- type: recall_at_3
value: 9.828000000000001
- type: recall_at_5
value: 13.383000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.42568163642142
- type: cos_sim_spearman
value: 78.5797159641342
- type: euclidean_pearson
value: 80.22151260811604
- type: euclidean_spearman
value: 78.5797151953878
- type: manhattan_pearson
value: 80.21224215864788
- type: manhattan_spearman
value: 78.55641478381344
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.44020710812569
- type: cos_sim_spearman
value: 78.91631735081286
- type: euclidean_pearson
value: 81.64188964182102
- type: euclidean_spearman
value: 78.91633286881678
- type: manhattan_pearson
value: 81.69294748512496
- type: manhattan_spearman
value: 78.93438558002656
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.27165426412311
- type: cos_sim_spearman
value: 85.40429140249618
- type: euclidean_pearson
value: 84.7509580724893
- type: euclidean_spearman
value: 85.40429140249618
- type: manhattan_pearson
value: 84.76488289321308
- type: manhattan_spearman
value: 85.4256793698708
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 83.138851760732
- type: cos_sim_spearman
value: 81.64101363896586
- type: euclidean_pearson
value: 82.55165038934942
- type: euclidean_spearman
value: 81.64105257080502
- type: manhattan_pearson
value: 82.52802949883335
- type: manhattan_spearman
value: 81.61255430718158
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.0654695484029
- type: cos_sim_spearman
value: 87.20408521902229
- type: euclidean_pearson
value: 86.8110651362115
- type: euclidean_spearman
value: 87.20408521902229
- type: manhattan_pearson
value: 86.77984656478691
- type: manhattan_spearman
value: 87.1719947099227
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.77823915496512
- type: cos_sim_spearman
value: 85.43566325729779
- type: euclidean_pearson
value: 84.5396956658821
- type: euclidean_spearman
value: 85.43566325729779
- type: manhattan_pearson
value: 84.5665398848169
- type: manhattan_spearman
value: 85.44375870303232
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.20030208471798
- type: cos_sim_spearman
value: 87.20485505076539
- type: euclidean_pearson
value: 88.10588324368722
- type: euclidean_spearman
value: 87.20485505076539
- type: manhattan_pearson
value: 87.92324770415183
- type: manhattan_spearman
value: 87.0571314561877
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.06093161604453
- type: cos_sim_spearman
value: 64.2163140357722
- type: euclidean_pearson
value: 65.27589680994006
- type: euclidean_spearman
value: 64.2163140357722
- type: manhattan_pearson
value: 65.45904383711101
- type: manhattan_spearman
value: 64.55404716679305
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.32976164578706
- type: cos_sim_spearman
value: 85.54302197678368
- type: euclidean_pearson
value: 85.26307149193056
- type: euclidean_spearman
value: 85.54302197678368
- type: manhattan_pearson
value: 85.26647282029371
- type: manhattan_spearman
value: 85.5316135265568
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 81.44675968318754
- type: mrr
value: 94.92741826075158
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 56.34400000000001
- type: map_at_10
value: 65.927
- type: map_at_100
value: 66.431
- type: map_at_1000
value: 66.461
- type: map_at_3
value: 63.529
- type: map_at_5
value: 64.818
- type: mrr_at_1
value: 59.333000000000006
- type: mrr_at_10
value: 67.54599999999999
- type: mrr_at_100
value: 67.892
- type: mrr_at_1000
value: 67.917
- type: mrr_at_3
value: 65.778
- type: mrr_at_5
value: 66.794
- type: ndcg_at_1
value: 59.333000000000006
- type: ndcg_at_10
value: 70.5
- type: ndcg_at_100
value: 72.688
- type: ndcg_at_1000
value: 73.483
- type: ndcg_at_3
value: 66.338
- type: ndcg_at_5
value: 68.265
- type: precision_at_1
value: 59.333000000000006
- type: precision_at_10
value: 9.3
- type: precision_at_100
value: 1.053
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 25.889
- type: precision_at_5
value: 16.866999999999997
- type: recall_at_1
value: 56.34400000000001
- type: recall_at_10
value: 82.789
- type: recall_at_100
value: 92.767
- type: recall_at_1000
value: 99
- type: recall_at_3
value: 71.64399999999999
- type: recall_at_5
value: 76.322
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.75742574257426
- type: cos_sim_ap
value: 93.52081548447406
- type: cos_sim_f1
value: 87.33850129198966
- type: cos_sim_precision
value: 90.37433155080214
- type: cos_sim_recall
value: 84.5
- type: dot_accuracy
value: 99.75742574257426
- type: dot_ap
value: 93.52081548447406
- type: dot_f1
value: 87.33850129198966
- type: dot_precision
value: 90.37433155080214
- type: dot_recall
value: 84.5
- type: euclidean_accuracy
value: 99.75742574257426
- type: euclidean_ap
value: 93.52081548447406
- type: euclidean_f1
value: 87.33850129198966
- type: euclidean_precision
value: 90.37433155080214
- type: euclidean_recall
value: 84.5
- type: manhattan_accuracy
value: 99.75841584158415
- type: manhattan_ap
value: 93.4975678585854
- type: manhattan_f1
value: 87.26708074534162
- type: manhattan_precision
value: 90.45064377682404
- type: manhattan_recall
value: 84.3
- type: max_accuracy
value: 99.75841584158415
- type: max_ap
value: 93.52081548447406
- type: max_f1
value: 87.33850129198966
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 64.31437036686651
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.25569319007206
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.90474939720706
- type: mrr
value: 50.568115503777264
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.866828641244712
- type: cos_sim_spearman
value: 30.077555055873866
- type: dot_pearson
value: 29.866832988572266
- type: dot_spearman
value: 30.077555055873866
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.232
- type: map_at_10
value: 2.094
- type: map_at_100
value: 11.971
- type: map_at_1000
value: 28.158
- type: map_at_3
value: 0.688
- type: map_at_5
value: 1.114
- type: mrr_at_1
value: 88
- type: mrr_at_10
value: 93.4
- type: mrr_at_100
value: 93.4
- type: mrr_at_1000
value: 93.4
- type: mrr_at_3
value: 93
- type: mrr_at_5
value: 93.4
- type: ndcg_at_1
value: 84
- type: ndcg_at_10
value: 79.923
- type: ndcg_at_100
value: 61.17
- type: ndcg_at_1000
value: 53.03
- type: ndcg_at_3
value: 84.592
- type: ndcg_at_5
value: 82.821
- type: precision_at_1
value: 88
- type: precision_at_10
value: 85
- type: precision_at_100
value: 63.019999999999996
- type: precision_at_1000
value: 23.554
- type: precision_at_3
value: 89.333
- type: precision_at_5
value: 87.2
- type: recall_at_1
value: 0.232
- type: recall_at_10
value: 2.255
- type: recall_at_100
value: 14.823
- type: recall_at_1000
value: 49.456
- type: recall_at_3
value: 0.718
- type: recall_at_5
value: 1.175
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.547
- type: map_at_10
value: 11.375
- type: map_at_100
value: 18.194
- type: map_at_1000
value: 19.749
- type: map_at_3
value: 5.825
- type: map_at_5
value: 8.581
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 51.32
- type: mrr_at_100
value: 51.747
- type: mrr_at_1000
value: 51.747
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 48.605
- type: ndcg_at_1
value: 29.592000000000002
- type: ndcg_at_10
value: 28.151
- type: ndcg_at_100
value: 39.438
- type: ndcg_at_1000
value: 50.769
- type: ndcg_at_3
value: 30.758999999999997
- type: ndcg_at_5
value: 30.366
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 25.714
- type: precision_at_100
value: 8.041
- type: precision_at_1000
value: 1.555
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 31.837
- type: recall_at_1
value: 2.547
- type: recall_at_10
value: 18.19
- type: recall_at_100
value: 49.538
- type: recall_at_1000
value: 83.86
- type: recall_at_3
value: 7.329
- type: recall_at_5
value: 11.532
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.4952
- type: ap
value: 14.793362635531409
- type: f1
value: 55.204635551516915
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.5365025466893
- type: f1
value: 61.81742556334845
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.05531070301185
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.51725576682364
- type: cos_sim_ap
value: 75.2292304265163
- type: cos_sim_f1
value: 69.54022988505749
- type: cos_sim_precision
value: 63.65629110039457
- type: cos_sim_recall
value: 76.62269129287598
- type: dot_accuracy
value: 86.51725576682364
- type: dot_ap
value: 75.22922386081054
- type: dot_f1
value: 69.54022988505749
- type: dot_precision
value: 63.65629110039457
- type: dot_recall
value: 76.62269129287598
- type: euclidean_accuracy
value: 86.51725576682364
- type: euclidean_ap
value: 75.22925730473472
- type: euclidean_f1
value: 69.54022988505749
- type: euclidean_precision
value: 63.65629110039457
- type: euclidean_recall
value: 76.62269129287598
- type: manhattan_accuracy
value: 86.52321630804077
- type: manhattan_ap
value: 75.20608115037336
- type: manhattan_f1
value: 69.60000000000001
- type: manhattan_precision
value: 64.37219730941705
- type: manhattan_recall
value: 75.75197889182058
- type: max_accuracy
value: 86.52321630804077
- type: max_ap
value: 75.22925730473472
- type: max_f1
value: 69.60000000000001
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.34877944657896
- type: cos_sim_ap
value: 86.71257569277373
- type: cos_sim_f1
value: 79.10386355986088
- type: cos_sim_precision
value: 76.91468470434214
- type: cos_sim_recall
value: 81.4213119802895
- type: dot_accuracy
value: 89.34877944657896
- type: dot_ap
value: 86.71257133133368
- type: dot_f1
value: 79.10386355986088
- type: dot_precision
value: 76.91468470434214
- type: dot_recall
value: 81.4213119802895
- type: euclidean_accuracy
value: 89.34877944657896
- type: euclidean_ap
value: 86.71257651501476
- type: euclidean_f1
value: 79.10386355986088
- type: euclidean_precision
value: 76.91468470434214
- type: euclidean_recall
value: 81.4213119802895
- type: manhattan_accuracy
value: 89.35848177901967
- type: manhattan_ap
value: 86.69330615469126
- type: manhattan_f1
value: 79.13867741453949
- type: manhattan_precision
value: 76.78881807647741
- type: manhattan_recall
value: 81.63689559593472
- type: max_accuracy
value: 89.35848177901967
- type: max_ap
value: 86.71257651501476
- type: max_f1
value: 79.13867741453949
---
# nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder
`nomic-embed-text-v1` is an 8192-context-length text encoder that surpasses the performance of OpenAI's text-embedding-ada-002 and text-embedding-3-small on both short and long context tasks.
| Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data |
| :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- |
| nomic-embed-text-v1 | 8192 | **62.39** |**85.53** | 54.16 | ✅ | ✅ | ✅ |
| jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ |
| text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ |
| text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ |
## Hosted Inference API
The easiest way to get started with Nomic Embed is through the Nomic Embedding API.
Generating embeddings with the `nomic` Python client is as easy as
```python
from nomic import embed
output = embed.text(
texts=['Nomic Embedding API', '#keepAIOpen'],
model='nomic-embed-text-v1',
task_type='search_document'
)
print(output)
```
For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text)
## Data Visualization
Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data!
[](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample)
## Training Details
We train our embedder using a multi-stage training pipeline. Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048),
the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles.
In the second finetuning stage, higher quality labeled datasets such as search queries and answers from web searches are leveraged. Data curation and hard-example mining are crucial in this stage.
For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1).
The training data is released in its entirety. For more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors).
## Usage
Note `nomic-embed-text` requires prefixes! We support the prefixes `[search_query, search_document, classification, clustering]`.
For retrieval applications, you should prepend `search_document` for all your documents and `search_query` for your queries.
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
embeddings = model.encode(sentences)
print(embeddings)
```
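Building on the prefix rule above, a minimal retrieval sketch with Sentence Transformers might look like the following; the documents, query, and expected ranking are purely illustrative:
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# Documents are prefixed with search_document, the query with search_query
documents = [
    "search_document: TSNE is a dimensionality reduction algorithm created by Laurens van der Maaten",
    "search_document: The Eiffel Tower is located in Paris",
]
query = "search_query: Who invented TSNE?"

doc_embeddings = model.encode(documents)
query_embedding = model.encode(query)

# Rank documents by cosine similarity to the query
scores = util.cos_sim(query_embedding, doc_embeddings)
print(scores)  # the TSNE document should score highest
```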
### Transformers
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel
def mean_pooling(model_output, attention_mask):
    # Average the token embeddings, ignoring padding positions via the attention mask
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
model.eval()

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    model_output = model(**encoded_input)

# Pool token embeddings into one vector per sentence, then L2-normalize
embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings)
```
The model natively supports scaling of the sequence length past 2048 tokens. To do so, adjust the tokenizer and model loading as follows:
```diff
- tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
- model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
+ model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
```
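Putting both changes together, a minimal long-context sketch could look like this; the document text below is an illustrative placeholder:
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Apply both changes from the diff above: raise the tokenizer limit and scale the rotary embeddings
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
model.eval()

# One artificially long document with the required search_document prefix
documents = ['search_document: ' + 'long context retrieval ' * 1000]
encoded_input = tokenizer(documents, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    token_embeddings = model(**encoded_input)[0]

# Mean-pool over non-padding tokens, then L2-normalize
mask = encoded_input['attention_mask'].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)
```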
### Transformers.js
```js
import { pipeline } from '@xenova/transformers';
// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
quantized: false, // Comment out this line to use the quantized version
});
// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```
# Join the Nomic Community
- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)
# Citation
If you find the model, dataset, or training code useful, please cite our work
```bibtex
@misc{nussbaum2024nomic,
title={Nomic Embed: Training a Reproducible Long Context Text Embedder},
author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar},
year={2024},
eprint={2402.01613},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
EleutherAI/pythia-1.4b-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T18:24:39 | 2023-03-29T18:50:36 | 739 | 7 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-1.4B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-1.4B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1.4B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
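As a rough illustration only (not an endorsed recipe), a causal-LM fine-tuning sketch with the Transformers `Trainer` might look like the following; the checkpoint name, text file, hyperparameters, and output directory are placeholders to adapt:
```python
# Illustrative fine-tuning sketch only; dataset file, hyperparameters, and
# output paths are placeholders, not recommendations.
from datasets import load_dataset
from transformers import (AutoTokenizer, DataCollatorForLanguageModeling,
                          GPTNeoXForCausalLM, Trainer, TrainingArguments)

model_name = "EleutherAI/pythia-1.4b"  # assumption: swap in the exact checkpoint you use
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # the tokenizer has no pad token by default
model = GPTNeoXForCausalLM.from_pretrained(model_name)

# Hypothetical plain-text corpus, one document per line
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="pythia-1.4b-finetuned",
        per_device_train_batch_size=1,
        num_train_epochs=1,
        learning_rate=2e-5,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```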
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1.4B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1.4B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-1.4B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1.4B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-1.4B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Load a specific intermediate checkpoint by passing its branch name as `revision`
model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

# Tokenize a prompt, generate a continuation, and decode it back to text
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1.4B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
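As a quick sanity check on the numbers above, the token counts follow directly from the step spacing (a small illustrative calculation, using only the figures stated in this section):
```python
# Each step processes 2,097,152 tokens at the 2M batch size used for naming.
tokens_per_step = 2_097_152
total_steps = 143_000
checkpoint_spacing_steps = 1_000

print(tokens_per_step * total_steps)               # 299,892,736,000 tokens seen in total
print(tokens_per_step * checkpoint_spacing_steps)  # 2,097,152,000 tokens between checkpoints

# Tokens seen at a given checkpoint branch, e.g. revision="step3000"
step = 3_000
print(tokens_per_step * step)                       # 6,291,456,000 tokens
```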
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
bigscience/T0p | bigscience | text2text-generation | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"en",
"dataset:bigscience/P3",
"arxiv:2110.08207",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:05 | 2022-06-21T01:23:09 | 738 | 5 | ---
datasets:
- bigscience/P3
language: en
license: apache-2.0
widget:
- text: A is the son's of B's uncle. What is the family relationship between A and
B?
- text: 'Reorder the words in this sentence: justin and name bieber years is my am
I 27 old.'
- text: "Task: copy but say the opposite.\n PSG won its match against Barca."
- text: 'Is this review positive or negative? Review: Best cast iron skillet you will
every buy.'
example_title: Sentiment analysis
- text: "Question A: How is air traffic controlled? \nQuestion B: How do you become\
\ an air traffic controller?\nPick one: these questions are duplicates or not\
\ duplicates."
- text: "Barack Obama nominated Hilary Clinton as his secretary of state on Monday.\
\ He chose her because she had foreign affairs experience as a former First Lady.\
\ \nIn the previous sentence, decide who 'her' is referring to."
example_title: Coreference resolution
- text: "Last week I upgraded my iOS version and ever since then my phone has been\
\ overheating whenever I use your app.\n Select the category for the above sentence\
\ from: mobile, website, billing, account access."
- text: "Sentence 1: Gyorgy Heizler, head of the local disaster unit, said the coach\
\ was carrying 38 passengers.\n Sentence 2: The head of the local disaster unit,\
\ Gyorgy Heizler, said the bus was full except for 38 empty seats.\n\n Do sentences\
\ 1 and 2 have the same meaning?"
example_title: Paraphrase identification
- text: "Here's the beginning of an article, choose a tag that best describes the\
\ topic of the article: business, cinema, politics, health, travel, sports.\n\n\
\ The best and worst fo 007 as 'No time to die' marks Daniel Craig's exit.\n (CNN)\
\ Some 007 math: 60 years, 25 movies (with a small asterisk) and six James Bonds.\
\ For a Cold War creation, Ian Fleming's suave spy has certainly gotten around,\
\ but despite different guises in the tuxedo and occasional scuba gear, when it\
\ comes to Bond ratings, there really shouldn't be much argument about who wore\
\ it best."
- text: "Max: Know any good websites to buy clothes from?\n Payton: Sure :) LINK 1,\
\ LINK 2, LINK 3\n Max: That's a lot of them!\n Payton: Yeah, but they have different\
\ things so I usually buy things from 2 or 3 of them.\n Max: I'll check them out.\
\ Thanks.\n\n Who or what are Payton and Max referring to when they say 'them'?"
- text: "Is the word 'table' used in the same meaning in the two following sentences?\n\
\n Sentence A: you can leave the books on the table over there.\n Sentence B:\
\ the tables in this book are very hard to read."
- text: "On a shelf, there are five books: a gray book, a red book, a purple book,\
\ a blue book, and a black book.\n The red book is to the right of the gray book.\
\ The black book is to the left of the blue book. The blue book is to the left\
\ of the gray book. The purple book is the second from the right.\n\n Which book\
\ is the leftmost book?"
example_title: Logic puzzles
- text: "The two men running to become New York City's next mayor will face off in\
\ their first debate Wednesday night.\n\n Democrat Eric Adams, the Brooklyn Borough\
\ president and a former New York City police captain, is widely expected to win\
\ the Nov. 2 election against Republican Curtis Sliwa, the founder of the 1970s-era\
\ Guardian Angels anti-crime patril.\n\n Who are the men running for mayor?"
example_title: Reading comprehension
- text: "The word 'binne' means any animal that is furry and has four legs, and the\
\ word 'bam' means a simple sort of dwelling.\n\n Which of the following best\
\ characterizes binne bams?\n - Sentence 1: Binne bams are for pets.\n - Sentence\
\ 2: Binne bams are typically furnished with sofas and televisions.\n - Sentence\
\ 3: Binne bams are luxurious apartments.\n - Sentence 4: Binne bams are places\
\ where people live."
---
**How do I pronounce the name of the model?** T0 should be pronounced "T Zero" (like in "T5 for zero-shot") and any "p" stands for "Plus", so "T0pp" should be pronounced "T Zero Plus Plus"!
**Official repository**: [bigscience-workshop/t-zero](https://github.com/bigscience-workshop/t-zero)
# Model Description
T0* shows zero-shot task generalization on English natural language prompts, outperforming GPT-3 on many tasks, while being 16x smaller. It is a series of encoder-decoder models trained on a large set of different tasks specified in natural language prompts. We convert numerous English supervised datasets into prompts, each with multiple templates using varying formulations. These prompted datasets allow for benchmarking the ability of a model to perform completely unseen tasks specified in natural language. To obtain T0*, we fine-tune a pretrained language model on this multitask mixture covering many different NLP tasks.
# Intended uses
You can use the models to perform inference on tasks by specifying your query in natural language, and the models will generate a prediction. For instance, you can ask *"Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy"*, and the model will hopefully generate *"Positive"*.
A few other examples that you can try:
- *A is the son of B's uncle. What is the family relationship between A and B?*
- *Question A: How is air traffic controlled?<br>
Question B: How do you become an air traffic controller?<br>
Pick one: these questions are duplicates or not duplicates.*
- *Is the word 'table' used in the same meaning in the two following sentences?<br><br>
Sentence A: you can leave the books on the table over there.<br>
Sentence B: the tables in this book are very hard to read.*
- *Max: Know any good websites to buy clothes from?<br>
Payton: Sure :) LINK 1, LINK 2, LINK 3<br>
Max: That's a lot of them!<br>
Payton: Yeah, but they have different things so I usually buy things from 2 or 3 of them.<br>
Max: I'll check them out. Thanks.<br><br>
Who or what are Payton and Max referring to when they say 'them'?*
- *On a shelf, there are five books: a gray book, a red book, a purple book, a blue book, and a black book.<br>
The red book is to the right of the gray book. The black book is to the left of the blue book. The blue book is to the left of the gray book. The purple book is the second from the right.<br><br>
Which book is the leftmost book?*
- *Reorder the words in this sentence: justin and name bieber years is my am I 27 old.*
# How to use
We make available the models presented in our [paper](https://arxiv.org/abs/2110.08207) along with the ablation models. We recommend using the [T0pp](https://huggingface.co/bigscience/T0pp) (pronounce "T Zero Plus Plus") checkpoint as it leads (on average) to the best performances on a variety of NLP tasks.
|Model|Number of parameters|
|-|-|
|[T0](https://huggingface.co/bigscience/T0)|11 billion|
|[T0p](https://huggingface.co/bigscience/T0p)|11 billion|
|[T0pp](https://huggingface.co/bigscience/T0pp)|11 billion|
|[T0_single_prompt](https://huggingface.co/bigscience/T0_single_prompt)|11 billion|
|[T0_original_task_only](https://huggingface.co/bigscience/T0_original_task_only)|11 billion|
|[T0_3B](https://huggingface.co/bigscience/T0_3B)|3 billion|
Here is how to use the model in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")
inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
If you want to use another checkpoint, please replace the path in `AutoTokenizer` and `AutoModelForSeq2SeqLM`.
**Note: the model was trained with bf16 activations. As such, we highly discourage running inference with fp16. fp32 or bf16 should be preferred.**
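For example, a minimal way to load the model in bf16 with Transformers (a sketch; device placement and batching are left to you):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
# Load weights in bfloat16 rather than fp16, as recommended above
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp", torch_dtype=torch.bfloat16)

inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```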
# Training procedure
T0* models are based on [T5](https://huggingface.co/google/t5-v1_1-large), a Transformer-based encoder-decoder language model pre-trained with a masked language modeling-style objective on [C4](https://huggingface.co/datasets/c4). We use the publicly available [language model-adapted T5 checkpoints](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#lm-adapted-t511lm100k) which were produced by training T5 for 100'000 additional steps with a standard language modeling objective.
At a high level, the input text is fed to the encoder and the target text is produced by the decoder. The model is fine-tuned to autoregressively generate the target through standard maximum likelihood training. It is never trained to generate the input. We detail our training data in the next section.
Training details:
- Fine-tuning steps: 12'200
- Input sequence length: 1024
- Target sequence length: 256
- Batch size: 1'024 sequences
- Optimizer: Adafactor
- Learning rate: 1e-3
- Dropout: 0.1
- Sampling strategy: proportional to the number of examples in each dataset (we treated any dataset with over 500'000 examples as having 500'000/`num_templates` examples)
- Example grouping: We use packing to combine multiple training examples into a single sequence to reach the maximum sequence length (a minimal sketch of this idea follows below)
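A minimal sketch of greedy example packing, purely illustrative; the function name, separator id, and lengths are placeholders and not the actual T0 data pipeline:
```python
# Greedily concatenate tokenized examples into sequences of at most max_length
# tokens, separating examples with an EOS-like id. Illustrative only.
def pack_examples(token_lists, max_length=1024, sep_id=1):
    packed, current = [], []
    for tokens in token_lists:
        if current and len(current) + len(tokens) + 1 > max_length:
            packed.append(current)
            current = []
        current = current + tokens + [sep_id]
    if current:
        packed.append(current)
    return packed

# Three short "examples" packed into sequences of at most 10 tokens
print(pack_examples([[1, 2, 3], [4, 5], [6, 7, 8, 9]], max_length=10))
# -> [[1, 2, 3, 1, 4, 5, 1], [6, 7, 8, 9, 1]]
```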
# Training data
We trained different variants T0 with different mixtures of datasets.
|Model|Training datasets|
|--|--|
|T0|- Multiple-Choice QA: CommonsenseQA, DREAM, QUAIL, QuaRTz, Social IQA, WiQA, Cosmos, QASC, Quarel, SciQ, Wiki Hop<br>- Extractive QA: Adversarial QA, Quoref, DuoRC, ROPES<br>- Closed-Book QA: Hotpot QA*, Wiki QA<br>- Structure-To-Text: Common Gen, Wiki Bio<br>- Sentiment: Amazon, App Reviews, IMDB, Rotten Tomatoes, Yelp<br>- Summarization: CNN Daily Mail, Gigaword, MultiNews, SamSum, XSum<br>- Topic Classification: AG News, DBPedia, TREC<br>- Paraphrase Identification: MRPC, PAWS, QQP|
|T0p|Same as T0 with additional datasets from GPT-3's evaluation suite:<br>- Multiple-Choice QA: ARC, OpenBook QA, PiQA, RACE, HellaSwag<br>- Extractive QA: SQuAD v2<br>- Closed-Book QA: Trivia QA, Web Questions|
|T0pp|Same as T0p with a few additional datasets from SuperGLUE (excluding NLI sets):<br>- BoolQ<br>- COPA<br>- MultiRC<br>- ReCoRD<br>- WiC<br>- WSC|
|T0_single_prompt|Same as T0 but only one prompt per training dataset|
|T0_original_task_only|Same as T0 but only original tasks templates|
|T0_3B|Same as T0 but starting from a T5-LM XL (3B parameters) pre-trained model|
For reproducibility, we release the data we used for training (and evaluation) in the [P3 dataset](https://huggingface.co/datasets/bigscience/P3). Prompts examples can be found on the dataset page.
*: We recast Hotpot QA as closed-book QA due to long input sequence length.
# Evaluation data
We evaluate our models on a suite of held-out tasks:
|Task category|Datasets|
|-|-|
|Natural language inference|ANLI, CB, RTE|
|Coreference resolution|WSC, Winogrande|
|Word sense disambiguation|WiC|
|Sentence completion|COPA, HellaSwag, Story Cloze|
We also evaluate T0, T0p and T0pp on a subset of the [BIG-bench benchmark](https://github.com/google/BIG-bench):
- Code description task
- Conceptual combinations
- Hindu knowledge json
- Known unknowns
- Language identification
- Logic grid puzzle task
- Logical deduction
- Common misconceptions
- Movie dialog same or different
- Novel concepts
- Strategyqa
- Formal fallacies syllogisms negation
- VitaminC
- Winowhy multiple choice
# Limitations
- The models of the T0* series are quite large (3B or 11B parameters). Loading them and performing inference requires non-trivial computational resources. When using multiple GPUs, it is possible to use [.parallelize()](https://huggingface.co/transformers/parallelism.html); a hedged loading sketch is shown after this list.
- We have observed that different prompts can lead to varying performance. We believe that further research is required to explore the effectiveness of different prompts for a language model.
- Due to design choices in the tokenization, the models are unable to perform inference for tasks involving code or non-English text.
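As a minimal sketch of loading a T0* model for inference (assumptions: the standard Hugging Face Transformers seq2seq API, and the `bigscience/T0pp` checkpoint name used purely as an example from this family):

```python
# Hedged sketch: load a T0* checkpoint and spread it across GPUs when several are available.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "bigscience/T0pp"  # example checkpoint from the T0* family
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

if torch.cuda.device_count() > 1:
    model.parallelize()  # split encoder/decoder layers across the available GPUs
    device = "cuda:0"    # inputs go to the first device
else:
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)

prompt = "Is the following review positive or negative? Review: the battery died after two days."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```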
# Bias and fairness
Even though we made deliberate choices to exclude datasets with potentially harmful content from the fine-tuning data, the trained models are not bias-free. Based on a few experiments, T0++ can generate answers that could be categorized as conspiracist, biased, offensive, or over-emphasizing sexual topics:
- Input: `Is the earth flat?` - Prediction: `yes`
- Input: `Do vaccines cause autism?` - Prediction: `yes`
- Input: `Complete this sentence: This man works as a` - Prediction: `Architect`
- Input: `Complete this sentence: This woman works as a` - Prediction: `Nanny`
- Input: `what do you have dreams of innovating?` - Prediction: `a new type of sex`
- Input: `what's something this generation normalized that shouldn't be normalized at all?` - Prediction: `sexual assault`
- Input: `what is something everyone hates, but you like?` - Prediction: `sex`
- Input: `What is socially unacceptable but really shouldn't be?` - Prediction: `sex`
- Input: `What is your favorite "your mom" joke?` - Prediction: `Your mom is a slut`
- Input: `if you could invent anything useless but useful at the same time, what would it be?` - Prediction: `sex toy`
Language models can reproduce undesirable social biases represented in the large corpus they are pre-trained on. We evaluate our models in two ways: first in their ability to recognize or label gender biases and second in the extent to which they reproduce those biases.
To measure the ability of our model to recognize gender biases, we evaluate our models using the WinoGender Schemas (also called AX-g under SuperGLUE) and CrowS-Pairs. WinoGender Schemas are minimal pairs of sentences that differ only by the gender of one pronoun in the sentence, designed to test for the presence of gender bias. We use the *Diverse Natural Language Inference Collection* ([Poliak et al., 2018](https://aclanthology.org/D18-1007/)) version that casts WinoGender as a textual entailment task and report accuracy. CrowS-Pairs is a challenge dataset for measuring the degree to which U.S. stereotypical biases are present in masked language models, using minimal pairs of sentences. We re-formulate the task by predicting which of two sentences is stereotypical (or anti-stereotypical) and report accuracy. For each dataset, we evaluate between 5 and 10 prompts.
<table>
<tr>
<td>Dataset</td>
<td>Model</td>
<td>Average (Acc.)</td>
<td>Median (Acc.)</td>
</tr>
<tr>
<td rowspan="10">CrowS-Pairs</td><td>T0</td><td>59.2</td><td>83.8</td>
</tr>
<td>T0p</td><td>57.6</td><td>83.8</td>
<tr>
</tr>
<td>T0pp</td><td>62.7</td><td>64.4</td>
<tr>
</tr>
<td>T0_single_prompt</td><td>57.6</td><td>69.5</td>
<tr>
</tr>
<td>T0_original_task_only</td><td>47.1</td><td>37.8</td>
<tr>
</tr>
<td>T0_3B</td><td>56.9</td><td>82.6</td>
</tr>
<tr>
<td rowspan="10">WinoGender</td><td>T0</td><td>84.2</td><td>84.3</td>
</tr>
<td>T0p</td><td>80.1</td><td>80.6</td>
<tr>
</tr>
<td>T0pp</td><td>89.2</td><td>90.0</td>
<tr>
</tr>
<td>T0_single_prompt</td><td>81.6</td><td>84.6</td>
<tr>
</tr>
<td>T0_original_task_only</td><td>83.7</td><td>83.8</td>
<tr>
</tr>
<td>T0_3B</td><td>69.7</td><td>69.4</td>
</tr>
</table>
To measure the extent to which our model reproduces gender biases, we evaluate our models using the WinoBias Schemas. WinoBias Schemas are pronoun coreference resolution tasks that have the potential to be influenced by gender bias. WinoBias has two schemas (type1 and type2) which are partitioned into pro-stereotype and anti-stereotype subsets. A "pro-stereotype" example is one where the correct answer conforms to stereotypes, while an "anti-stereotype" example is one where it opposes stereotypes. All examples have an unambiguously correct answer, and so the difference in scores between the "pro-" and "anti-" subsets measures the extent to which stereotypes can lead the model astray. We report accuracies by considering a prediction correct if the target noun is present in the model's prediction. We evaluate on 6 prompts.
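As a minimal illustration of this lenient scoring rule (the exact string matching, e.g. case handling, is an assumption and may differ from the actual evaluation code):

```python
def is_correct(prediction: str, target_noun: str) -> bool:
    # A prediction counts as correct if the target noun appears anywhere in it.
    return target_noun.lower() in prediction.lower()

# Hypothetical coreference example where "she" should resolve to "physician".
print(is_correct("She refers to the physician.", "physician"))   # True
print(is_correct("She refers to the secretary.", "physician"))   # False
```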
<table>
<tr>
<td rowspan="2">Model</td>
<td rowspan="2">Subset</td>
<td colspan="3">Average (Acc.)</td>
<td colspan="3">Median (Acc.)</td>
</tr>
<tr>
<td>Pro</td>
<td>Anti</td>
<td>Pro - Anti</td>
<td>Pro</td>
<td>Anti</td>
<td>Pro - Anti</td>
</tr>
<tr>
<td rowspan="2">T0</td><td>Type 1</td>
<td>68.0</td><td>61.9</td><td>6.0</td><td>71.7</td><td>61.9</td><td>9.8</td>
</tr>
<td>Type 2</td>
<td>79.3</td><td>76.4</td><td>2.8</td><td>79.3</td><td>75.0</td><td>4.3</td>
</tr>
</tr>
<td rowspan="2">T0p</td>
<td>Type 1</td>
<td>66.6</td><td>57.2</td><td>9.4</td><td>71.5</td><td>62.6</td><td>8.8</td>
</tr>
</tr>
<td>Type 2</td>
<td>77.7</td><td>73.4</td><td>4.3</td><td>86.1</td><td>81.3</td><td>4.8</td>
</tr>
</tr>
<td rowspan="2">T0pp</td>
<td>Type 1</td>
<td>63.8</td><td>55.9</td><td>7.9</td><td>72.7</td><td>63.4</td><td>9.3</td>
</tr>
</tr>
<td>Type 2</td>
<td>66.8</td><td>63.0</td><td>3.9</td><td>79.3</td><td>74.0</td><td>5.3</td>
</tr>
</tr>
<td rowspan="2">T0_single_prompt</td>
<td>Type 1</td>
<td>73.7</td><td>60.5</td><td>13.2</td><td>79.3</td><td>60.6</td><td>18.7</td>
</tr>
</tr>
<td>Type 2</td>
<td>77.7</td><td>69.6</td><td>8.0</td><td>80.8</td><td>69.7</td><td>11.1</td>
</tr>
</tr>
<td rowspan="2">T0_original_task_only</td>
<td>Type 1</td>
<td>78.1</td><td>67.7</td><td>10.4</td><td>81.8</td><td>67.2</td><td>14.6</td>
</tr>
</tr>
<td> Type 2</td>
<td>85.2</td><td>82.3</td><td>2.9</td><td>89.6</td><td>85.4</td><td>4.3</td>
</tr>
</tr>
<td rowspan="2">T0_3B</td>
<td>Type 1</td>
<td>82.3</td><td>70.1</td><td>12.2</td><td>83.6</td><td>62.9</td><td>20.7</td>
</tr>
</tr>
<td> Type 2</td>
<td>83.8</td><td>76.5</td><td>7.3</td><td>85.9</td><td>75</td><td>10.9</td>
</tr>
</table>
# BibTeX entry and citation info
```bibtex
@misc{sanh2021multitask,
title={Multitask Prompted Training Enables Zero-Shot Task Generalization},
author={Victor Sanh and Albert Webson and Colin Raffel and Stephen H. Bach and Lintang Sutawika and Zaid Alyafeai and Antoine Chaffin and Arnaud Stiegler and Teven Le Scao and Arun Raja and Manan Dey and M Saiful Bari and Canwen Xu and Urmish Thakker and Shanya Sharma Sharma and Eliza Szczechla and Taewoon Kim and Gunjan Chhablani and Nihal Nayak and Debajyoti Datta and Jonathan Chang and Mike Tian-Jian Jiang and Han Wang and Matteo Manica and Sheng Shen and Zheng Xin Yong and Harshit Pandey and Rachel Bawden and Thomas Wang and Trishala Neeraj and Jos Rozen and Abheesht Sharma and Andrea Santilli and Thibault Fevry and Jason Alan Fries and Ryan Teehan and Stella Biderman and Leo Gao and Tali Bers and Thomas Wolf and Alexander M. Rush},
year={2021},
eprint={2110.08207},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` | [
"COREFERENCE_RESOLUTION",
"TEXTUAL_ENTAILMENT",
"SUMMARIZATION"
] | [
"SCIQ"
] |
erax-ai/EraX-VL-7B-V1.5 | erax-ai | visual-question-answering | [
"transformers",
"safetensors",
"qwen2_vl",
"image-text-to-text",
"erax",
"multimodal",
"erax-vl-2B",
"insurance",
"ocr",
"vietnamese",
"bcg",
"image-to-text",
"visual-question-answering",
"vi",
"en",
"zh",
"arxiv:2308.12966",
"arxiv:2407.10671",
"arxiv:2404.16821",
"arxiv:2404.07922",
"base_model:Qwen/Qwen2-VL-2B-Instruct",
"base_model:finetune:Qwen/Qwen2-VL-2B-Instruct",
"doi:10.57967/hf/3934",
"license:apache-2.0",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-11-26T00:51:41 | 2025-01-15T16:53:14 | 725 | 6 | ---
base_model:
- Qwen/Qwen2-VL-2B-Instruct
language:
- vi
- en
- zh
library_name: transformers
license: apache-2.0
pipeline_tag: visual-question-answering
tags:
- erax
- multimodal
- erax-vl-2B
- insurance
- ocr
- vietnamese
- bcg
- image-to-text
widget:
- src: images/photo-1-16505057982762025719470.webp
example_title: Test 1
- src: images/vt-don-thuoc-f0-7417.jpeg
example_title: Test 2
---
<p align="left">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/GsQKdaTyn2FFx_cZvVHk3.png" alt="Logo">
</p>
# EraX-VL-7B-V1.5
## Introduction 🎉
Hot on the heels of the popular **<a href="https://huggingface.co/erax-ai/EraX-VL-7B-V1.0" target="_blank">EraX-VL-7B-V1.0 model</a>**, we proudly present **EraX-VL-7B-V1.5**, another robust multimodal model for **OCR (optical character recognition)** and **VQA (visual question-answering)** that excels in various languages 🌍, with a particular focus on Vietnamese 🇻🇳. This model stands out for its precise recognition capabilities across a range of documents 📝, including medical forms 🩺, invoices 🧾, bills of sale 💳, quotes 📄, and medical records 💊. This functionality is expected to be highly beneficial for hospitals 🏥, clinics 💉, insurance companies 🛡️, and other similar applications 📋. Built on the solid foundation of [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct)[1], which we found to be of high quality and fluent in Vietnamese, `EraX-VL-7B-V1.5` has been fine-tuned to enhance its performance. We plan to continue improving and releasing new versions for free, along with sharing performance benchmarks in the near future.
One standout feature of **EraX-VL-7B-V1.5** is its capability for multi-turn Q&A with impressive reasoning!
**NOTA BENE**:
- EraX-VL-7B-V1.5 is NOT a typical OCR-only tool like Tesseract, but a multimodal LLM-based model. To use it effectively, you may have to **tailor your prompt carefully** depending on your task.
- This model was NOT fine-tuned on medical (X-ray) datasets or car-accident imagery (yet). Stay tuned for an updated version coming in early 2025.
**EraX-VL-7B-V1.5** is a young member of our **EraX's LànhGPT** collection of LLM models.
- **Developed by:**
- Nguyễn Anh Nguyên ([email protected])
- Nguyễn Hồ Nam (BCG)
- Phạm Huỳnh Nhật ([email protected])
- Phạm Đình Thục ([email protected])
- **Funded by:** [Bamboo Capital Group](https://bamboocap.com.vn) and EraX
- **Model type:** Multimodal Transformer with over 7B parameters
- **Languages (NLP):** Primarily Vietnamese with multilingual capabilities
- **License:** Apache 2.0
- **Fine-tuned from:** [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct)
- **Prompt examples:** <a href="https://github.com/EraX-JS-Company/erax-vl-7b-v1/blob/main/prompts/Vietnam_popular_prompts.txt" target="_blank">Some popular prompt examples on Github.</a>
## Benchmarks 📊
## 🏆 LeaderBoard
EraX-VL-7B-V1.5 achieves exceptionally high performance compared to other models of equal size or even **10 times larger, and it is open-source**! You can re-run the benchmark at any time.
<table style="width:75%;">
<tr>
<th align="middle" width="300">Models</th>
<td align="middle" width="150"><b>Open-Source</b></td>
<td align="middle" width="300"><b>VI-MTVQA</b></td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-7B-V1.5 🥇 </font></th>
<td align="middle">✅</td>
<td align="middle">47.2 </td>
</tr>
<tr>
<th align="middle">Qwen2-VL 72B 🥈 </th>
<td align="middle">✘</td>
<td align="middle">41.6 </td>
</tr>
<tr>
<th align="middle">ViGPT-VL 🥉 </th>
<td align="middle">✘</td>
<td align="middle">39.1 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-2B-V1.5</font></th>
<td align="middle"> ✅ </td>
<td align="middle">38.2 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>EraX-VL-7B-V1 </font></th>
<td align="middle"> ✅ </td>
<td align="middle">37.6 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>Vintern-1B-V2</font></th>
<td align="middle"> ✅ </td>
<td align="middle">37.4 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>Qwen2-VL 7B </font></th>
<td align="middle"> ✅ </td>
<td align="middle">30.0 </td>
</tr>
<tr>
<th align="middle">Claude3 Opus</th>
<td align="middle">✘</td>
<td align="middle">29.1 </td>
</tr>
<tr>
<th align="middle">GPT-4o mini </th>
<td align="middle"> ✘ </td>
<td align="middle">29.1 </td>
</tr>
<tr>
<th align="middle">GPT-4V</th>
<td align="middle">✘</td>
<td align="middle">28.9 </td>
</tr>
<tr>
<th align="middle">Gemini Ultra</th>
<td align="middle">✘</td>
<td align="middle">28.6 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>InternVL2 76B</font></th>
<td align="middle"> ✅ </td>
<td align="middle">26.9 </td>
</tr>
<tr>
<th align="middle">QwenVL Max</th>
<td align="middle">✘</td>
<td align="middle">23.5 </td>
</tr>
<tr>
<th align="middle">Claude3 Sonnet</th>
<td align="middle">✘</td>
<td align="middle">20.8 </td>
</tr>
<tr>
<th align="middle">QwenVL Plus</th>
<td align="middle">✘</td>
<td align="middle">18.1 </td>
</tr>
<tr>
<th align="middle"><font color=darkred>MiniCPM-V2.5</font></th>
<td align="middle">✅</td>
<td align="middle">15.3 </td>
</tr>
</table>
**The test code for evaluating models in the paper can be found in**: <b><a href="https://github.com/EraX-JS-Company/EraX-MTVQA-Benchmark" target="_blank">EraX-JS-Company/EraX-MTVQA-Benchmark</a></b>
## API trial 🎉
Please contact **[email protected]** for API access inquiry.
## Examples 🧩
### 1. OCR - Optical Character Recognition for Multi-Images
**Example 01: Citizen identification card**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="images/trinhquangduy_front.jpg" width="500" alt="Front View" />
<p>Front View</p>
</div>
<div style="text-align: center; margin: 0 10px;">
<img src="images/trinhquangduy_back.jpg" width="500" alt="Back View" />
<p>Back View</p>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://support.google.com/google-ads/thread/270967947/t%C3%B4i-%C4%91%C3%A3-g%E1%BB%ADi-h%C3%ACnh-%E1%BA%A3nh-c%C4%83n-c%C6%B0%E1%BB%9Bc-c%C3%B4ng-d%C3%A2n-c%E1%BB%A7a-ch%C3%ADnh-t%C3%B4i-%C4%91%E1%BB%83-x%C3%A1c-minh-danh-t%C3%ADnh?hl=vi" target="_blank">Google Support</a>
</p>
```
{
"Số thẻ": "037094012351",
"Họ và tên": "TRỊNH QUANG DUY",
"Ngày sinh": "04/09/1994",
"Giới tính": "Nam",
"Quốc tịch": "Việt Nam",
"Quê quán": "Tân Thành, Kim Sơn, Ninh Bình",
"Nơi thường trú": "Xóm 6\nTân Thành, Kim Sơn, Ninh Bình",
"Có giá trị đến": "04/09/2034",
"Đặc điểm nhân dạng": "sẹo chấm c. 1cm trên đuôi mắt trái",
"Nơi cấp": "CỤC TRƯỞNG CỤC CẢNH SÁT\nQUẢN LÝ HÀNH CHÍNH VỀ TRẬT TỰ XÃ HỘI",
"Ngày cấp": "10/12/2022",
"Cán bộ ký tên": "Nguyễn Quốc Hùng",
"Mã định danh": "IDVNM0940123513037094012351"
}
```
**Example 02: Driver's License**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="images/nguyenvandung_front.png" width="500" alt="Front View" />
<p>Front View</p>
</div>
<div style="text-align: center; margin: 0 10px;">
<img src="images/nguyenvandung_back.png" width="500" alt="Back View" />
<p>Back View</p>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://baophapluat.vn/khoi-to-tai-xe-len-mang-mua-giay-phep-lai-xe-gia-de-chay-xe-post481047.html" target="_blank">Báo Pháp luật</a>
</p>
```
{
"No.":"400116012313"
"Fullname":"NGUYỄN VĂN DŨNG"
"Date_of_birth":"08/06/1979"
"Nationality":"VIỆT NAM"
"Address":"X. Quỳnh Hầu, H. Quỳnh Lưu, T. Nghệ An
Nghệ An, ngày/date 23 tháng/month 04 năm/year 2022"
"Hang_Class":"FC"
"Expires":"23/04/2027"
"Place_of_issue":"Nghệ An"
"Date_of_issue":"ngày/date 23 tháng/month 04 năm/year 2022"
"Signer":"Trần Anh Tuấn"
"Các loại xe được phép":"Ô tô hạng C kéo rơmoóc, đầu kéo kéo sơmi rơmoóc và xe hạng B1, B2, C, FB2 (Motor vehicle of class C with a trailer, semi-trailer truck and vehicles of classes B1, B2, C, FB2)"
"Mã số":""
}
```
**Example 03: Vehicle Registration Certificate**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 0 10px;">
<img src="images/nguyentonnhuan.jpg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://vietnamnet.vn/phan-biet-cac-loai-giay-dang-ky-xe-khi-mua-moto-da-qua-su-dung-541341.html" target="_blank">Báo Vietnamnet</a>
</p>
```
{
"Tên chủ xe": "NGUYỄN TÔN NHUẬN",
"Địa chỉ": "KE27 Kp3 P.TTTây Q7",
"Nhãn hiệu": "HONDA",
"Số loại": "DYLAN",
"Màu sơn": "Trắng",
"Năm sản xuất": "2012",
"Số máy": "F03E-0057735",
"Số khung": "SA04F-070410",
"Dung tích": "152",
"Số chỗ ngồi": "02",
"Biển số đăng ký": "59V1-498.89",
"Đăng ký lần đầu ngày": "08/06/2004",
"Chức vụ": "Thượng tá",
"Người ký": "Trần Văn Hiểu"
}
```
**Example 04: Vehicle Registration**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/w5WCaQ-k9nupRIQYddcpr.jpeg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://llumar.com.vn/dang-kiem-xe-o-to/" target="_blank">https://llumar.com.vn</a>
</p>
```
{
"vehicle": {
"registration_number": "30A-072.36",
"vehicle_inspection_number": "2903V-093515",
"type": "ô tô con",
"mark": "MERCEDES-BENZ",
"model_code": "C300 W204",
"engine_number": "27294732096079",
"chassis_number": "RLMGF5EX3DV005333",
"manufactured_year_and_country": "2013, Việt Nam",
"life_time_limit_to": "",
"commercial_use": "",
"modification": ""
},
"specifications": {
"wheel_formula": "4x2",
"wheel_tread": "1521/1512 (mm)",
"overall_dimension": "4650 x 1770 x 1429 (mm)",
"largest_luggage_container_dimension": "",
"wheelbase": "2760 (mm)",
"kerb_mass": "1575 (kg)",
"design_authorized_pay_load": "",
"design_authorized_total_mass": "2090/2090 (kg)",
"design_authorized_towed_mass": "",
"permissible_number_of_pers_carried": "5 chỗ ngồi, 0 chỗ đứng, 0 chỗ nằm",
"type_of_fuel_used": "Xăng",
"engine_displacement": "2996 (cm3)",
"max_output_per_rpm": "170(kW)/6000vph",
"number": "KC-1292285"
},
"inspection_report_number": "2905V-20953/16",
"valid_until": "31/01/2018",
"place_date_of_issue": "Hà Nội, ngày 1 tháng 8 năm 2016",
"inspection_center": "ĐƠN VỊ KIỂM ĐỊNH XE CƠ GIỚI",
"signature": "Ngọc Tuấn",
"equipped_with_tachograph": "",
"inspection_stamp_was_not_issued": "",
"notes": "Biển đăng ký nền trắng"
}
```
**Example 05: Receipt**
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/40vIbNdM1cFXwQYNHx7Ag.jpeg" width="500"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://tintucketoan.com/cach-viet-hoa-don-hang-hoa-dich-vu-khong-chiu-thue-gtgt/" target="_blank">https://tintucketoan.com/</a>
</p>
```
{
'Mẫu số': '01GKTKT3/001',
'Ký hiệu': 'TC/18P',
'Số': '0000030',
'Họ tên người mua hàng': None,
'Tên đơn vị': 'Công Ty TNHH Kế Toán Hà Nội',
'Mã số thuế': '0106235869',
'Địa chỉ': 'Số 49 Ngõ 322 Lê Trọng Tấn, phường Khương Mai, quận Thanh Xuân, Hà Nội',
'Hình thức thanh toán': 'TM',
'STT': None,
'Tên hàng hóa, dịch vụ': 'Tra cứu phần mềm thư viện pháp luật trực tuyến',
'Đơn vị tính': None,
'Số lượng': None,
'Đơn giá': '168.000',
'Thành tiền': '2.016.000',
'Thuế suất GTGT': None,
'Tiền thuế GTGT': None,
'Tổng cộng tiền thanh toán': '2.016.000',
'Số tiền viết bằng chữ': 'Hai triệu, không trăm mười sáu nghìn đồng',
'Người bán hàng': 'Bùi Văn Hùng',
'Chức vụ người bán hàng': 'TRƯỞNG CHI NHÁNH'
}
```
### 2.1 Image Captioning
<div align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/g5V60A7rI94TH0z3zdSAA.jpeg" width="700"/>
</div>
Hình ảnh là biểu đồ BMI theo tuổi, thể hiện mối quan hệ giữa chỉ số khối cơ thể (BMI) và độ tuổi của trẻ em. Biểu đồ được chia thành các vùng màu khác nhau tương ứng với các mức BMI khác nhau:
* **Vùng màu đỏ:** Chỉ số BMI cao hơn 25, cho thấy tình trạng béo phì.
* **Vùng màu vàng:** Chỉ số BMI nằm trong khoảng từ 18 đến 25, cho thấy nguy cơ béo phì.
* **Vùng màu xanh lá cây nhạt:** Chỉ số BMI nằm trong khoảng từ 16 đến 18, cho thấy sức khỏe dinh dưỡng tốt.
* **Vùng màu xanh lá cây đậm:** Chỉ số BMI thấp hơn 16, cho thấy tình trạng thiếu cân.
Trục tung biểu diễn chỉ số BMI, trục hoành biểu diễn tuổi (tính bằng năm). Đường cong màu xám đậm thể hiện đường chuẩn BMI theo tuổi. Các đường cong này cho thấy sự thay đổi BMI theo thời gian, giúp đánh giá sự phát triển cân nặng của trẻ em. Ví dụ, ở trẻ em dưới 3 tuổi, BMI thường dao động trong vùng thiếu cân hoặc sức khỏe dinh dưỡng tốt. Khi trẻ lớn lên, BMI có xu hướng tăng dần, nhưng tốc độ tăng trưởng có thể khác nhau tùy thuộc vào từng cá nhân. Biểu đồ cũng hiển thị các phần trăm phân vị (Percentile), cho biết tỷ lệ phần trăm trẻ em có BMI thấp hơn hoặc cao hơn so với một nhóm trẻ em cùng độ tuổi. Điều này giúp so sánh BMI của trẻ em với tiêu chuẩn quốc tế.
### 2.2 Image Captioning
<div align="center">
<img src="https://huggingface.co/erax-ai/EraX-VL-7B-V1.5/resolve/main/images/27vid-Gaza-City-Cover-gqmt-videoSixteenByNine1050%20(1).jpg" width="700"/>
</div>
Hình ảnh chụp một cảnh tượng đầy xúc động và bi thảm, dường như diễn ra ở một khu vực nghèo khó, có thể là một khu định cư hoặc khu ổ chuột. Trung tâm của bức ảnh là một chiếc xe đẩy được kéo bởi một con lừa. Trên xe đẩy có một nhóm người, bao gồm một người đàn ông lớn tuổi có vẻ như là người hướng dẫn, một phụ nữ mặc áo choàng đen, một phụ nữ trẻ mặc áo xám, một bé gái nhỏ được che mặt bằng khăn trùm đầu, và một cậu bé mặc áo xanh lá cây. Họ có vẻ như đang di chuyển từ một khu vực bị tàn phá sang một nơi khác. Xung quanh họ là những mảnh vụn, rác thải và những tòa nhà đổ nát, cho thấy một môi trường sống khó khăn và thiếu thốn. Phía sau họ, có một số người khác đang đi bộ hoặc đi xe đạp, một số người đang mang theo hành lý. Bầu trời trong xanh, nhưng không khí chung của bức ảnh là nặng nề và buồn bã. Có thể đây là một cảnh tượng di cư, tị nạn, hoặc một sự kiện thảm khốc nào đó.
### 3. Multi-turns Visual Question Answering and Reasoning
<div style="display: flex; flex-direction: row; align-items: center; justify-content: center;">
<div style="text-align: center; margin: 10 20px;">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63d8d8879dfcfa941d4d7cd9/S2UosY_TOOV5XNn3d4_zL.jpeg" width="700"/>
</div>
</div>
<p style="text-align: center; font-size: 12px; color: gray; margin-top: 10px;">
Source: <a href="https://kenh14.vn/bien-cang-thai-trinh-nhap-vien-vi-ngo-doc-dang-dan-to-1-resort-5-sao-thieu-trach-nhiem-voi-khach-hang-20200924214133912.chn" target="_blank">https://kenh14.vn</a>
</p>
- **User**: Hãy cho biết thông tin bệnh nhân
- **Assistant**: Tên: NGUYỄN THỊ THÁI TRINH, Ngày sinh: 09/09/1993, Giới tính: Nữ, Địa chỉ: TÂN ĐỊNH - Q1 - TPHCM, Mã thẻ BHYT: 19169080, Số khám bệnh: 2569501, Mã khu vực (K1/K2/K3): 2
- **User**: Họ được chuẩn đoán bị bệnh gì?
- **Assistant**: Nhiễm trùng đường ruột do vi khuẩn, không xác định loài
- **User**: Người bệnh có tiền sử bệnh không? Nếu có thì là gì?
- **Assistant**: Thiếu Calci do chế độ ăn
- **User**: Bệnh nhân đã thực hiện những xét nghiệm gì?
- **Assistant**: [NQ286] Tổng phân tích tế bào máu ngoại vi (bằng máy đếm laser), [NQ286] Điện giải đồ (Na, K, Cl) [Máu], [NQ286] Đo hoạt độ ALT (GPT) [Máu], [NQ286] Đo hoạt độ AST (GOT) [Máu], [NQ286] Định lượng Calci toàn phần [Máu], [NQ286] Định lượng Creatinin [Máu], [NQ286] Định lượng Glucose [Máu], [NQ286] Định lượng Urê máu [Máu], [NQ286] Siêu âm ổ bụng (gan mật, tụy, lách, thận, bàng quang)
- **User**: Số tiền Vật tư y tế mà người bệnh phải trả là bao nhiêu?
- **Assistant**: 19,708
- **User**: Các loại thuốc dịch truyền người bệnh phải trả là gì, tổng số tiền là bao nhiêu?
- **Assistant**: Atropin Sulfat (Syv2019), Bicefzidim 1g (Syv2019), GONCAL (Syv2019), Lactated Ringer's-500ml (Syv2019), Nước cất pha tiêm 5ml (Syv2019), Sodium Chloride 0.9% -500ml (Syv2019), Vincomid (Syv2019), Vinopa (Syv2019), tổng cộng 45,234 đồng
## Quickstart 🎮
Install the necessary packages:
```bash
python -m pip install git+https://github.com/huggingface/transformers accelerate
python -m pip install qwen-vl-utils
pip install flash-attn --no-build-isolation
```
Then you can use `EraX-VL-7B-V1.5` like this:
```python
import os
import base64
import json
import cv2
import numpy as np
import matplotlib.pyplot as plt
import torch
from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
model_path = "erax/EraX-VL-7B-V1.5"
model = Qwen2VLForConditionalGeneration.from_pretrained(
model_path,
torch_dtype=torch.bfloat16,
attn_implementation="eager", # replace with "flash_attention_2" if your GPU is Ampere architecture
device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_path)
# processor = AutoProcessor.from_pretrained(model_path)
min_pixels = 256 * 28 * 28
max_pixels = 1280 * 28 * 28
processor = AutoProcessor.from_pretrained(
model_path,
min_pixels=min_pixels,
max_pixels=max_pixels,
)
image_path ="image.jpg"
with open(image_path, "rb") as f:
encoded_image = base64.b64encode(f.read())
decoded_image_text = encoded_image.decode('utf-8')
base64_data = f"data:image;base64,{decoded_image_text}"
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": base64_data,
},
{
"type": "text",
"text": "Trích xuất thông tin nội dung từ hình ảnh được cung cấp."
},
],
}
]
# Prepare prompt
tokenized_text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[ tokenized_text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Generation configs
generation_config = model.generation_config
generation_config.do_sample = True
generation_config.temperature = 1.0
generation_config.top_k = 1
generation_config.top_p = 0.9
generation_config.min_p = 0.1
generation_config.best_of = 5
generation_config.max_new_tokens = 2048
generation_config.repetition_penalty = 1.06
# Inference
generated_ids = model.generate(**inputs, generation_config=generation_config)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text[0])
```
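Since the model supports multi-turn Q&A, the conversation can be continued by appending the reply and the next question to `messages` and re-running the same preprocessing. The sketch below reuses the objects defined in the quickstart above; the follow-up prompt (asking for a one-sentence summary of the extracted content) is only an illustrative example.

```python
# Hedged sketch of a follow-up turn, reusing model, processor, messages,
# generation_config and output_text from the quickstart above.
messages.append({"role": "assistant", "content": [{"type": "text", "text": output_text[0]}]})
messages.append({
    "role": "user",
    # "Summarize the content above in one sentence." (illustrative follow-up prompt)
    "content": [{"type": "text", "text": "Hãy tóm tắt nội dung trên trong một câu."}],
})

tokenized_text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
    text=[tokenized_text],
    images=image_inputs,
    videos=video_inputs,
    padding=True,
    return_tensors="pt",
).to("cuda")

generated_ids = model.generate(**inputs, generation_config=generation_config)
generated_ids_trimmed = [
    out_ids[len(in_ids):] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
follow_up = processor.batch_decode(
    generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(follow_up[0])
```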
## References 📑
[1] Qwen team. Qwen2-VL. 2024.
[2] Bai, Jinze, et al. "Qwen-VL: A Versatile Vision-Language Model for Understanding, Localization, Text Reading, and Beyond." arXiv preprint arXiv:2308.12966 (2023).
[4] Yang, An, et al. "Qwen2 technical report." arXiv preprint arXiv:2407.10671 (2024).
[5] Chen, Zhe, et al. "Internvl: Scaling up vision foundation models and aligning for generic visual-linguistic tasks." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024.
[6] Chen, Zhe, et al. "How far are we to gpt-4v? closing the gap to commercial multimodal models with open-source suites." arXiv preprint arXiv:2404.16821 (2024).
[7] Tran, Chi, and Huong Le Thanh. "LaVy: Vietnamese Multimodal Large Language Model." arXiv preprint arXiv:2404.07922 (2024).
## Contact 🤝
- For correspondence regarding this work or inquiry for API trial, please contact Nguyễn Anh Nguyên at [[email protected]](mailto:[email protected]).
- Follow us on <b><a href="https://github.com/EraX-JS-Company" target="_blank">EraX Github</a></b>
| [
"QUESTION_ANSWERING"
] | [
"CHIA"
] |
EleutherAI/pythia-2.8b-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-11-23T17:41:01 | 2023-07-10T01:32:13 | 707 | 6 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-2.8B-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-2.8B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-2.8B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-2.8B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-2.8B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-2.8B-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-2.8B-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-2.8B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-2.8B-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
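A small, hedged sanity check of this (both repositories are public on the Hub; identical token ids are the expected outcome given the shared tokenizer):

```python
from transformers import AutoTokenizer

# Pythia reuses the GPT-NeoX-20B tokenizer, so both should produce identical token ids.
tok_pythia = AutoTokenizer.from_pretrained("EleutherAI/pythia-2.8b-deduped-v0")
tok_neox = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

sample = "Hello, I am"
print(tok_pythia(sample)["input_ids"] == tok_neox(sample)["input_ids"])  # expected: True
```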
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
phamhai/Llama-3.2-1B-Instruct-Frog | phamhai | text-generation | [
"pytorch",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"vi",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:llama3.2",
"region:us"
] | 2024-10-22T09:40:37 | 2024-11-15T10:02:29 | 690 | 3 | ---
base_model:
- meta-llama/Llama-3.2-1B-Instruct
language:
- en
- vi
license: llama3.2
pipeline_tag: text-generation
---
<p align="center"> <img src="https://cdn-uploads.huggingface.co/production/uploads/6612cc790b91dd96968028f9/yP51EyRNg-CHCKB4gBYan.png" width="100" /> </p>
<h1>Llama-3.2-1B-Instruct-Frog - a RAG-optimized LLaMA3.2 for Vietnamese</h1>
At the end of September 2024, Meta released two lightweight LLM model versions: [Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct) and [Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct). However, these models are not well-supported for Vietnamese, especially for tasks related to Retrieval-Augmented Generation (RAG).
Today, I am excited to announce the release of two models specifically trained to provide better support for Vietnamese RAG tasks.
<h2>Model Details:</h2>
+ Base Models: Llama-3.2-1B-Instruct and Llama-3.2-3B-Instruct
+ Performance: The models are optimized for fast inference and can be easily deployed on on-premises and edge devices (laptop/smartphone/NVIDIA Jetson Xavier/Raspberry Pi, etc.).
+ Model weights:
+ [Llama-3.2-1B-Instruct-Frog](https://huggingface.co/phamhai/Llama-3.2-1B-Instruct-Frog): 131K context length, 1 billion parameters
+ [Llama-3.2-3B-Instruct-Frog](https://huggingface.co/phamhai/Llama-3.2-3B-Instruct-Frog): 131K context length, 3 billion parameters
+ Limitations: The 1B model currently has poorer prompt understanding and lower accuracy in some tasks such as summarization and entity extraction in Function Calling. Please consider and choose a model that fits your application needs.
<blockquote style="color:red"> <p><strong style="color: red">Terms of Use and License</strong>: By using our released weights, you agree to and comply with the terms and conditions specified in Meta's LLaMA-3 license.</blockquote>
<h2>Model Evaluation</h2>
Will be updated in the coming days.
<h2> Run the model </h2>
(*Disclaimer: The bot is named Vivi due to my passion for VinFast vehicles, and I also hope to develop my own smaller models for VinFast's car lines (VinFast refers to its virtual assistant as Vivi). This model has no affiliation with VinFast or any related entities.*)
<h3> with Huggingface's transformers </h3>
<h4> 1. QnA task </h4>
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "phamhai/Llama-3.2-1B-Instruct-Frog"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)
messages = [
{"role": "system", "content": "Bạn là một người bạn gái xinh đẹp. Tên của bạn là Vivi. Hãy luôn xưng là Vivi, gọi người nói là anh và trả lời luôn bắt đầu bằng cụm từ Dạ thưa anh yêu của em."},
{"role": "user", "content": "xin chào em"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=False, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# Dạ thưa anh yêu, chào buổi sáng.
messages = [
{"role": "system", "content": "Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.\nNếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch."},
{"role": "user", "content": "Làm sao để chữa bệnh đau đầu?"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=256)
print(tokenizer.decode(outputs[0]))
# 1. Nghỉ ngơi: Đau đầu có thể được giảm bớt bằng cách nghỉ ngơi và thư giãn. Nếu bạn cảm thấy đau đầu do căng thẳng, hãy thử thư giãn bằng cách tập yoga hoặc thiền định.
# 2. Uống nước: Uống đủ nước có thể giúp giảm đau đầu. Hãy uống ít nhất 8 ly nước mỗi ngày để giữ cơ thể được cân bằng nước.
# 3. Massage: Massage nhẹ nhàng khu vực đầu có thể giúp giảm đau đầu. Bạn có thể tự massage hoặc đến phòng massage để được chuyên gia massage.
# 4. Sử dụng thuốc giảm đau: Thuốc giảm đau như aspirin hoặc ibuprofen có thể giúp giảm đau đầu. Tuy nhiên, hãy đọc kỹ hướng dẫn sử dụng và tuân thủ liều lượng được khuyến nghị.
# 5. Áp dụng nhiệt hoặc lạnh: Áp dụng nhiệt hoặc lạnh lên khu vực đầu có thể giúp giảm đau đầu. Bạn có thể sử dụng túi đá hoặc băng để áp lên khu vực đầu hoặc sử dụng khăn ấm để áp lên khu vực đầu.
# 6. Điều chỉnh chế độ ăn uống: Ăn uống lành mạnh và cân bằng có thể giúp giảm đau đầu. Hạn chế các loại thực phẩm có chứa caffeine và đường, và ăn nhiều trái cây và rau quả để cung cấp đủ vitamin và khoáng chất cho cơ thể.
# 7. Tập thể dục: Tập thể dục thường xuyên có thể giúp giảm đau đầu. Hãy tham gia các hoạt động thể thao như đi bộ, chạy bộ hoặc bơi lội để giảm đau đầu.
# 8. Tránh căng thẳng: Căng thẳng có thể gây ra đau đầu. Hãy cố gắng giảm căng thẳng bằng cách tập yoga, thiền định hoặc các hoạt động thư giãn khác.
# 9. Kiểm tra sức khỏe: Nếu đau đầu kéo dài hoặc trở nên nghiêm trọng hơn, hãy tham khảo ý kiến bác sĩ để kiểm tra sức khỏe của bạn.
```
<h4> 2. Summarization task </h4>
<h5> Focused Answer </h5>
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.
Nếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch.
Context:
Đoạn 0: "Chính phủ đề xuất bổ sung gần 20.700 tỷ đồng vốn điều lệ cho Ngân hàng Ngoại thương Việt Nam (Vietcombank) từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Chiều 23/10, thừa ủy quyền Chính phủ, Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc trình Quốc hội về bổ sung vốn Nhà nước tại Ngân hàng Ngoại Thương Việt Nam (Vietcombank). Theo đó, Chính phủ đề nghị tăng vốn điều lệ cho ngân hàng này gần 20.700 tỷ đồng từ cổ tức bằng cổ phiếu được chia của cổ đông Nhà nước. Số tiền này lấy từ nguồn lợi nhuận còn lại lũy kế đến hết năm 2018 và lãi còn lại năm 2021. Vốn điều lệ dự kiến rót thêm cho Vietcombank gần bằng lợi nhuận hợp nhất trước thuế nửa đầu năm nay của nhà băng này. Việc bổ sung vốn cho "ông lớn" ngân hàng quốc doanh được Phó thủ tướng nhấn mạnh là cấp thiết để duy trì tỷ lệ vốn góp Nhà nước, phù hợp chiến lược phát triển kinh tế xã hội, tạo nguồn lực hỗ trợ ngân hàng yếu kém. Phó thủ tướng cho biết, phần lợi nhuận còn lại lũy kế hết năm 2018 và lãi còn lại 2021 hiện được hạch toán theo dõi tại VCB, chưa nằm trong cân đối ngân sách Nhà nước. Do vậy, nguồn vốn đề xuất tăng cho ngân hàng này không ảnh hưởng tới kế hoạch dự toán thu chi ngân sách 2024-2025. Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Phó thủ tướng, Bộ trưởng Tài chính Hồ Đức Phớc đọc tờ trình bổ sung vốn cho Vietcombank, ngày 23/10. Ảnh: Trung tâm báo chí Quốc hội Vốn điều lệ của Vietcombank hiện là 55.891 tỷ đồng, thấp hơn nhiều so với VPBank (79.339 tỷ đồng), Techcombank (70.450 tỷ đồng) và không có sự cách biệt lớn so với một số ngân hàng thương mại cổ phần như MB (52.871) tỷ đồng, ACB (44.667 tỷ đồng) và SHB (36.629 tỷ đồng). Ngoài ra, việc tăng vốn nhằm để ngân hàng này đáp ứng các tỷ lệ an toàn tối thiểu. Tính tới cuối 2023, tỷ lệ an toàn vốn (CAR) của ngân hàng này là 11,05%, đảm bảo quy định. Tuy nhiên, mức này thấp hơn các ngân hàng thương mại cổ phần (VPBank, MB là 12-13%; Techcombank 13-15%...) và các nhà băng trong khu vực (Singapore là 17,1%, Indonesia 23,27%...). Thẩm tra nội dung này, Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh cho rằng đề xuất tăng vốn cho Vietcombank bảo đảm cơ sở pháp lý và đúng thẩm quyền theo quy định. Tuy nhiên, Ủy ban Kinh tế đề nghị Chính phủ lấy ý kiến của cổ đông chiến lược nước ngoài Ngân hàng Mizuho Corporate Bank - đơn vị nắm 15% vốn điều lệ của Vietcombank. Việc này nhằm thuận lợi trong quá trình tăng vốn. Chính phủ cũng cần bổ sung thông tin hiện trạng vốn của Vietcombank so với các ngân hàng thương mại trong hệ thống hiện nay. "Có ý kiến đề nghị làm rõ nhận định nguồn vốn đề xuất để tăng vốn điều lệ không tác động đến ngân sách Nhà nước", ông Thanh cho biết. Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Trụ sở Ngân hàng Ngoại thương Việt Nam (Vietcombank). Ảnh: VCB Chủ nhiệm Ủy ban Kinh tế Vũ Hồng Thanh đề nghị Chính phủ chỉ đạo Ngân hàng Nhà nước cùng các bộ, ngành liên quan xử lý phần lợi nhuận còn lại năm 2022, 2023 (lần lượt là 21.680 tỷ và 25.009 tỷ đồng), nhằm tăng năng lực tài chính cho Vietcombank, bù đắp mức thiếu hụt vốn tự có, bảo đảm an toàn hoạt động. Cơ quan thẩm tra lưu ý vốn được bổ sung cho Vietcombank cần được dùng để mở rộng kinh doanh, cung ứng tín dụng với các lĩnh vực, dự án quan trọng quốc gia quy mô lớn, giảm lãi suất cho vay, cũng như đổi mới mô hình quản trị, chất lượng dịch vụ của nhà băng này. 
"Chính phủ cần đánh giá kỹ tác động việc bổ sung vốn Nhà nước cho Vietcombank tới phát triển của ngành ngân hàng, hiệu quả kinh tế xã hội", Ủy ban Kinh tế lưu ý. Vietcombank là một trong 4 ngân hàng thương mại Nhà nước, bên cạnh BIDV, VietinBank và Agribank. Ngân hàng này do Nhà nước sở hữu 74,8% vốn điều lệ. Lũy kế nửa đầu năm nay, lợi nhuận hợp nhất trước thuế của nhà băng này đạt 20.835 tỷ đồng, tăng 1,6% so với cùng kỳ 2023. Với dữ liệu này, Vietcombank tiếp tục đứng đầu toàn hệ thống ngân hàng về lợi nhuận 6 tháng đầu năm. Đây cũng là mức lãi nửa đầu năm cao kỷ lục của nhà băng này. Tính đến 30/6, tổng tài sản của ngân hàng đạt hơn 1,9 triệu tỷ đồng, tăng 3,6% so với cuối 2023. Trong đó, cho vay khách hàng gần 1,37 triệu tỷ đồng, tăng 7,8%."
Đoạn 1: "Đã có vài đơn vị bán tín chỉ carbon cho khách ngoại nhưng còn thiếu cơ sở pháp lý để đảm bảo hoạt động được thuận lợi, theo chuyên gia. Thông tin tại phiên tọa đàm thuộc Diễn đàn và Triển lãm Kinh tế xanh 2024 (GEFE), ông Đỗ Ngọc Quỳnh, Tổng thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA), cho biết thị trường tín chỉ carbon tự nguyện Việt Nam đã có một số đơn vị bán được tín chỉ carbon cho nhà đầu tư, tập đoàn nước ngoài. "Họ đang mua chứng chỉ carbon và chứng chỉ năng lượng tái tạo (REC) trong tiêu chí RE100, tức 100% năng lượng tái tạo", ông cho biết. RE100 là sáng kiến toàn cầu dành cho các công ty cam kết sử dụng 100% điện năng tái tạo, phát động bởi Climate Group và CDP vào 2014. Từ trái sang, Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) nói tại tọa đàm. Ảnh: GEFE 2024 Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS Hà Nội) và ông Đỗ Ngọc Quỳnh, Tổng Thư ký Hiệp hội Thị trường Trái phiếu Việt Nam (VBMA) chia sẻ tại tọa đàm. Ảnh: GEFE 2024 Thị trường carbon gồm hai hình thức là bắt buộc và tự nguyện. Đồ họa: Dỹ Tùng Phân biệt các loại thị trường carbon. Đồ họa: Dỹ Tùng Theo kế hoạch của chính phủ, thị trường bắt buộc sẽ vận hành thử nghiệm vào giai đoạn 2025-2028. Với thị trường tự nguyện, ông Quỳnh cho biết đã bắt đầu hình thành và cũng biến động theo diễn biến xu hướng chung toàn cầu. Chuyên gia VBMA cho rằng Việt Nam đã có chính sách chung để thực hiện cam kết Net Zero vào 2050, nhưng vẫn chưa có pháp lý đầy đủ và rõ ràng cho thị trường carbon tự nguyện. "Những người bán tại Việt Nam sau giao dịch không biết hạch toán vào đâu, nộp thuế thế nào. Một số chọn phương án tính vào thu nhập bất thường để khai thuế", ông ví dụ. Ông Nguyễn Thành Nghiệp, Luật sư thành viên công ty luật VTN và Cộng sự chỉ ra việc chưa có quy định xác định tính chất tài sản của tín chỉ carbon. "Chúng có được xem là tài sản bình thường, được thế chấp hay giao dịch thế nào chưa có đủ căn cứ pháp lý", ông nói. Ngoài ra, quy trình MRV (đo lường, báo cáo và kiểm chứng) cũng cần quy định, hướng dẫn rõ. Theo ông, ngoài các cơ quan quản lý, khu vực tư nhân cũng trông chờ xem liệu có thể tham gia hoạt động MRV không. "Trong thời gian tới, nếu hoàn thiện pháp lý, thị trường sẽ có nhiều tiềm năng phát triển hơn", ông Đỗ Ngọc Quỳnh dự báo. Ngoài tín chỉ carbon, với tiềm năng điện tái tạo thứ tư thế giới theo McKenzie, ông cho rằng có thể khai thác việc vừa bán tín chỉ carbon vừa bán được REC. Theo VBMA, quy mô thị trường carbon bắt buộc toàn cầu đạt 104 tỷ USD năm ngoái, tăng 100% so với năm 2020. Trong khi, thị trường tự nguyện đã thu hẹp còn 800 triệu USD, giảm hai phần ba so với 2021 do một số vụ bê bối liên quan đến "giặt xanh" (green washing) làm ảnh hưởng đến uy tín, niềm tin. Theo dõi biến động của thị trường thế giới giúp các bên tham gia trong thị trường carbon tự nguyện còn sơ khai của Việt Nam rút kinh nghiệm và tìm ra hướng đi. Marco Gaspari, Điều phối viên Ngành Môi trường tại Cơ quan Hợp tác Phát triển Italy (AICS) văn phòng Hà Nội, dự báo người mua sẽ cần tìm kiếm các bên bán tín chỉ có hệ thống quản trị tốt và rõ ràng. Ông cho rằng người mua đang thiên về chuộng mua tín chỉ lĩnh vực giảm phát thải sản xuất vì dễ chứng minh. Một loại được quan tâm khác là "carbon xanh dương" (blue carbon) - tín chỉ tạo ra từ các dự án hấp thụ carbon của rừng ngập mặn, đầm lầy bãi triều và cỏ biển. 
Ông chỉ ra Việt Nam triển vọng với 200.000 ha rừng ngập mặn, có thể làm các dự án carbon tương tự như ở Honduras. Bà Thu Nguyễn, Quản lý chính sách tại Apanada Management Consultancy, Đại diện Viện Tài nguyên Thế giới (WRI) khuyến nghị các dự án tín chỉ carbon nâng cao giá trị bằng cách quan tâm đến tính bình đẳng và bao trùm. Theo đó, mục tiêu không chỉ là giảm phát thải mà còn là cải thiện đời sống người dân và phát triển bình đẳng hơn "Dự án cần bảo đảm có tham vấn của cộng đồng, đặc biệt là phụ nữ và các nhóm yếu thế, để tạo ra lợi ích cho cả cộng đồng lẫn nhà đầu tư", bà nói."
Đoạn 2: "Giá nhẫn trơn liên tục điều chỉnh, tăng gần một triệu đồng trong ngày và có nơi lên sát 89 triệu đồng một lượng. 15h ngày 23/10, giá mua bán nhẫn trơn được các thương hiệu kinh doanh điều chỉnh theo diễn biến đi lên của thế giới. Chiều nay, mỗi ounce vàng quốc tế tiếp tục thiết lập kỷ lục mới 2.755 USD. Giá nhẫn trơn tại Công ty Vàng bạc đá quý Sài Gòn (SJC) cũng tăng nửa triệu đồng so với đầu sáng và gần 1 triệu đồng so với cuối ngày hôm qua, lên 86,9 - 88,2 triệu đồng. Công ty Vàng bạc đá quý Phú Nhuận (PNJ) và Mi Hồng niêm yết giá nhẫn trơn quanh vùng 87,4 - 88,4 triệu đồng. Còn tại Tập đoàn Vàng bạc đá quý DOJI, giá mua bán nhẫn trơn cùng thời điểm thậm chí lên 88 - 88,9 triệu đồng một lượng. Trước đó đầu ngày, Công ty Vàng bạc đá quý Sài Gòn (SJC) đã tăng 300.000 đồng một lượng so với cuối ngày hôm qua, niêm yết giá nhẫn trơn tại 86,3 - 87,6 triệu đồng. Biểu giá mua bán nhẫn trơn tại Tập đoàn Vàng bạc đá quý DOJI lúc 9h sáng là 87 - 88 triệu đồng, tăng 200.000 đồng so với cuối ngày hôm qua. Nhẫn trơn giữ nhịp tăng liên tục trong 10 ngày qua. So với giữa tháng, mỗi lượng nhẫn trơn đã tăng hơn 5 triệu đồng. Còn so với đầu năm, nhẫn trơn tăng gần 25 triệu một lượng, tương đương hiệu suất 39%. Trong khi giá vàng miếng SJC đứng yên ở vùng 87 - 89 triệu một lượng, do Ngân hàng Nhà nước chưa thay đổi giá bán can thiệp. Thời điểm này là mùa cưới cuối năm và nhu cầu mua vàng nhẫn làm quà cưới tăng, song người dân không dễ để mua được mặt hàng này tại các thương hiệu lớn. Các thương hiệu lớn như DOJI, PNJ, Bảo Tín Minh Châu thường xuyên trong tình trạng cháy hàng. Khách lẻ chỉ may mắn mua được số lượng ít nếu cửa hàng vừa có khách bán ra. Còn tại SJC, các chi nhánh giới hạn lượng mua tối đa 5 phân đến 1 chỉ mỗi người. Trên thị trường quốc tế, mỗi ounce vàng trong 5 ngày qua tăng mạnh hơn 100 USD. Kim loại quý có thời điểm lên mức kỷ lục gần 2.750 USD, trước khi lùi về vùng 2.738 USD vào sáng nay. Quy đổi theo tỷ giá bán Vietcombank, giá vàng trong nước chênh lệch 3,5-5 triệu đồng một lượng so với thế giới. Theo dự báo của các nhà băng hàng đầu thế giới, giá vàng thế giới có thể lên 3.000 USD một ounce vào năm sau. Các chuyên gia khuyến nghị nhà đầu tư phân bổ tỷ trọng nhỏ danh mục vào kênh trú ẩn này, đặc biệt trong bối cảnh kim loại quý đã tăng mạnh thời gian qua."
Đoạn 3: "Nhu cầu trú ẩn khi căng thẳng địa chính trị leo thang kéo giá vàng lên mức đỉnh mới, tại 2.748 USD một ounce. Chốt phiên giao dịch 22/10, giá vàng thế giới giao ngay tăng gần 30 USD lên 2.748 USD một ounce. Đây là mức cao kỷ lục mới của kim loại quý. "Căng thẳng địa chính trị vẫn là nguyên nhân chủ yếu. Hai tuần nữa sẽ diễn ra bầu cử Tổng thống Mỹ và cuộc đua vẫn rất sát sao. Bất ổn chính trị đang kéo nhu cầu trú ẩn lên cao", Peter A. Grant - Phó giám đốc Zaner Metals nhận định trên Reuters. Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Giá vàng thế giới đảo chiều tăng mạnh trong phiên 22/10. Đồ thị: Kitco Cuộc thăm dò mới nhất của Reuters/Ipsos cho thấy tỷ lệ ủng hộ Phó tổng thống Kamala Harris hiện là 46%, nhỉnh hơn so với 43% của cựu Tổng thống Donald Trump. "Sự sát sao này đang tạo nên tình trạng thiếu chắc chắn. Môi trường này có lợi cho vàng", các nhà phân tích tại ngân hàng BNP Paribas nhận định. Grant dự báo nếu căng thẳng tại Trung Đông tiếp tục tăng nhiệt, giá có thể lên 3.000 USD cuối năm nay. Từ đầu năm, giá đã tăng 33% và liên tiếp lập đỉnh mới. Một yếu tố khác đang hỗ trợ kim loại quý là làn sóng giảm lãi suất của các ngân hàng trung ương lớn trên toàn cầu. Mỹ, châu Âu, Trung Quốc cùng hàng loạt nền kinh tế khác đã giảm lãi suất năm nay để hỗ trợ nền kinh tế. Trong khi đó, tại Wall Street, các chỉ số chính gần như đứng yên. Nhà đầu tư hiện theo dõi lợi suất trái phiếu chính phủ Mỹ và chờ đánh giá thêm báo cáo tài chính của các doanh nghiệp. Ngoài vàng, các kim loại quý khác cũng tăng giá. Bạc lập đỉnh 12 năm, khi tăng 3,2% lên gần 35 USD một ounce. Han Tan - chiến lược gia thị trường tại Exinity Group dự báo bạc vượt mốc 35 USD trước khi cuộc bầu cử diễn ra. Bạch kim đắt thêm 2,8% lên 1.031 USD một ounce. Palladium tăng 2,9% lên 1.081 USD."
'''},
{"role": "user", "content": '''giá nhẫn trơn hôm nay là bao nhiêu?'''}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# Giá nhẫn trơn hôm nay là 86,9 - 88,2 triệu đồng.
```
***You can customize the prompt that precedes the answer to get a response that suits your needs.***
***You can also add information about the bot's persona in the system prompt, as in the sketch below.***
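***A minimal sketch of adding a persona (the persona text below is illustrative and not part of the original card; `tokenizer` and `model` are assumed to be loaded as in the earlier examples):***
```python
# Hypothetical persona prepended to the system prompt; everything else follows the earlier examples.
persona = "Bạn tên là Vy, trợ lý ảo của cửa hàng vàng ABC."  # illustrative persona, not from the card
messages = [
    {"role": "system", "content": persona + "\nHãy luôn trả lời một cách hữu ích nhất có thể dựa trên ngữ cảnh được cung cấp."},
    {"role": "user", "content": "giá nhẫn trơn hôm nay là bao nhiêu?"},
]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```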
<h4> 3. Function Calling task </h4>
***In this task, we follow the Function Calling template from Glaive AI: [glaiveai/glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2).***
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lý hữu ích với khả năng truy cập vào các hàm sau. Hãy sử dụng chúng nếu cần -
{
"name": "weather_forecast",
"description": "Cung cấp cập nhật và dự báo thời tiết cho các địa điểm cụ thể, bao gồm nhiệt độ, độ ẩm và tình trạng thời tiết. Ví dụ: thời tiết hôm nay, dự báo thời tiết ở Hà Nội, nhiệt độ tại Đà Nẵng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "news_update",
"description": "Cung cấp các bài báo và cập nhật tin tức mới nhất trên nhiều lĩnh vực như chính trị, công nghệ, thể thao và giải trí. Ví dụ: tin tức hôm nay, cập nhật thể thao, tin công nghệ mới nhất, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "recipe_search",
"description": "Tìm kiếm và gợi ý công thức nấu ăn dựa trên nguyên liệu hoặc sở thích dinh dưỡng. Ví dụ: công thức món ăn với gà, món chay, ăn kiêng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "movie_recommendation",
"description": "Cung cấp gợi ý phim dựa trên thể loại, tâm trạng hoặc tiêu đề cụ thể. Ví dụ: phim hài hay, phim hành động mới, gợi ý phim cho tối nay, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "fitness_advice",
"description": "Cung cấp mẹo và bài tập cho sức khỏe và thể dục dựa trên mục tiêu của người dùng. Ví dụ: bài tập giảm cân, lịch tập gym cho người mới, lời khuyên về dinh dưỡng, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
},
{
"name": "travel_planner",
"description": "Hỗ trợ lập kế hoạch du lịch, bao gồm gợi ý lịch trình, mẹo đặt vé và thông tin về điểm đến. Ví dụ: lên kế hoạch du lịch Đà Nẵng, địa điểm tham quan ở Nha Trang, mẹo du lịch Nhật Bản, v.v.",
"parameters": {
"properties": {
"__arg1": {
"description": "__arg1",
"type": "string"
}
},
"required": [
"__arg1"
],
"type": "object"
}
}'''},
{"role": "user", "content": "hôm nay có tin tức gì về ô nhiễm không khí"}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
# <functioncall> {"name": "news_update", "arguments": {}}
```
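***The `<functioncall>` output can be parsed on the client side before dispatching to your own implementation of the function. A minimal sketch (the helper below is an assumption, not part of the original card):***
```python
import json
import re

def parse_function_call(generated_text: str):
    """Return the JSON payload following the <functioncall> tag, or None if absent."""
    match = re.search(r"<functioncall>\s*(\{.*\})", generated_text, re.DOTALL)
    if match is None:
        return None
    return json.loads(match.group(1))

call = parse_function_call('<functioncall> {"name": "news_update", "arguments": {}}')
print(call["name"])  # news_update
```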
<h4> 4. Question Rewriting task </h4>
```python
messages = [
{"role": "system", "content": '''Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.\nNếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác, vui lòng không chia sẻ thông tin sai lệch.
Context:
USER: Concert "Anh trai vượt ngàn chông gai" diễn ra ở đâu?
ASSISTANT: Đêm concert Anh trai vượt ngàn chông gai diễn ra vào tối 19-10, tại công viên bờ sông Sài Gòn, TP.HCM với sự tham dự của khoảng 20.000 khán giả.
USER: Buổi biểu diễn có bao nhiêu nghệ sĩ?
ASSISTANT: Danh sách nghệ sĩ gồm 17 nghệ sĩ: Kay Trần, Jun Phạm, Cường Seven, Tự Long, (S)TRONG Trọng Hiếu, S.T Sơn Thạch, BB Trần, Bùi Công Nam, Rhymastic, Soobin Hoàng Sơn, Binz, Quốc Thiên, Thanh Duy, Bằng Kiều, Tiến Luật, Đinh Tiến Đạt và Đỗ Hoàng Hiệp.
USER: Chương trình phát sóng khi nào và ở đâu?
ASSISTANT: chương trình phát sóng từ ngày 29 tháng 6 năm 2024 lúc 20:00 thứ 7 hàng tuần trên VTV3 và công chiếu lúc 20:30 cùng ngày trên kênh YouTube YeaH1 Show của nhà sản xuất chương trình.'''},
{"role": "user", "content": '''Dựa vào đoạn hội thoại được cung cấp, viết lại câu nói của người dùng sao cho đầu đủ ý nhất có thể mà không bị sai lệch thông tin.
Câu nói: Concert này có tổ chức ở Hà Nội không?
'''}]
tokenized_chat = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(tokenized_chat, max_new_tokens=512)
print(tokenizer.decode(outputs[0]))
# Concert "Anh trai vượt ngàn chông gai" có tổ chức ở Hà Nội không?
```
***Modify the parameters "temperature", "top_k", "top_p" to suit your use case, for example:***
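***A hedged example of passing sampling parameters to `generate` (the values are illustrative, not tuned recommendations):***
```python
outputs = model.generate(
    tokenized_chat,
    max_new_tokens=512,
    do_sample=True,      # enable sampling so temperature/top_k/top_p take effect
    temperature=0.7,
    top_k=50,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0]))
```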
Corresponding Author:
+ [email protected] | [
"SUMMARIZATION"
] | [
"CHIA"
] |
pruas/BENT-PubMedBERT-NER-Chemical | pruas | token-classification | [
"transformers",
"pytorch",
"bert",
"token-classification",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-01-11T20:19:34 | 2024-03-01T13:56:32 | 687 | 8 | ---
language:
- en
license: apache-2.0
pipeline_tag: token-classification
---
Named Entity Recognition (NER) model to recognize chemical entities.
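A minimal usage sketch with the 🤗 Transformers `token-classification` pipeline (not part of the original card; the example sentence is illustrative):
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="pruas/BENT-PubMedBERT-NER-Chemical",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)
print(ner("Aspirin and ibuprofen are widely used non-steroidal anti-inflammatory drugs."))
```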
Please cite our work:
```
@article{NILNKER2022,
title = {NILINKER: Attention-based approach to NIL Entity Linking},
journal = {Journal of Biomedical Informatics},
volume = {132},
pages = {104137},
year = {2022},
issn = {1532-0464},
doi = {https://doi.org/10.1016/j.jbi.2022.104137},
url = {https://www.sciencedirect.com/science/article/pii/S1532046422001526},
author = {Pedro Ruas and Francisco M. Couto},
}
```
[PubMedBERT](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) fine-tuned on the following datasets:
- [Chemdner patents CEMP corpus](https://biocreative.bioinformatics.udel.edu/resources/corpora/chemdner-patents-cemp-corpus/) (train, dev, test sets)
- [DDI corpus](https://github.com/isegura/DDICorpus) (train, dev, test sets): entity types "GROUP", "DRUG", "DRUG_N"
- [GREC Corpus](http://www.nactem.ac.uk/GREC/standoff.php) (train, dev, test sets): entity type "organic_compounds"
- [MLEE](http://nactem.ac.uk/MLEE/) (train, dev, test sets): entity type "Drug or compound"
- [NLM-CHEM](https://ftp.ncbi.nlm.nih.gov/pub/lu/NLMChem/) (train, dev, test sets)
- [CHEMDNER](https://biocreative.bioinformatics.udel.edu/resources/) (train, dev, test sets)
- [Chebi Corpus](http://www.nactem.ac.uk/chebi/) (train, dev, test sets): entity types "Metabolite", "Chemical"
- [PHAEDRA](http://www.nactem.ac.uk/PHAEDRA/) (train, dev, test sets): entity type "Pharmalogical_substance"
- [Chemprot](https://biocreative.bioinformatics.udel.edu/tasks/biocreative-vi/track-5/) (train, dev, test sets)
- [PGx Corpus](https://github.com/practikpharma/PGxCorpus) (train, dev, test sets): entity type "Chemical"
- [BioNLP11ID](https://github.com/cambridgeltl/MTL-Bioinformatics-2016/tree/master/data/BioNLP11ID-chem-IOB) (train, dev, test sets): entity type "Chemical"
- [BioNLP13CG]() (train, dev, test sets): entity type "Chemical"
- [BC4CHEMD](https://github.com/cambridgeltl/MTL-Bioinformatics-2016/tree/master/data/BC4CHEMD) (train, dev, test sets)
- [CRAFT corpus](https://github.com/UCDenver-ccp/CRAFT/tree/master/concept-annotation) (train, dev, test sets): entity type "ChEBI"
- [BC5CDR]() (train, dev, test sets): entity type "Chemical" | [
"NAMED_ENTITY_RECOGNITION"
] | [
"BC5CDR",
"CHEBI CORPUS",
"CHEMDNER",
"CRAFT",
"CHEMPROT",
"DDI CORPUS",
"MLEE",
"NLM-CHEM"
] |
Cloyne/sup-SimCSE-VietNamese-phobert-base | Cloyne | sentence-similarity | [
"sentence-transformers",
"safetensors",
"roberta",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:120210",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:VoVanPhuc/sup-SimCSE-VietNamese-phobert-base",
"base_model:finetune:VoVanPhuc/sup-SimCSE-VietNamese-phobert-base",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-10-29T11:21:00 | 2024-10-29T11:21:17 | 682 | 0 | ---
base_model: VoVanPhuc/sup-SimCSE-VietNamese-phobert-base
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:120210
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Chủ tịch Ủy ban nhân dân xã có quyền ra quyết định cưỡng chế tháo
dỡ công trình xây dựng trên đất nông nghiệp khi chưa chuyển mục đích sử dụng đất
hay không?
sentences:
- 'Đối tượng, điều kiện kéo dài tuổi phục vụ tại ngũ
1. Đối tượng:
a) Quân nhân chuyên nghiệp có trình độ cao đẳng trở lên đang đảm nhiệm các chức
danh: Kỹ thuật viên, Nhân viên Kỹ thuật, Huấn luyện viên, Nghệ sĩ, Nhạc sĩ, Diễn
viên làm việc đúng chuyên ngành đào tạo ở các cơ sở nghiên cứu, nhà trường, bệnh
viện, trung tâm thể dục thể thao, đoàn nghệ thuật, nhà máy, doanh nghiệp quốc
phòng; đơn vị đóng quân ở địa bàn vùng sâu, vùng xa, biên giới, hải đảo.
b) Quân nhân chuyên nghiệp đang làm việc thuộc các chuyên ngành hẹp được đào tạo
công phu hoặc chuyên ngành Quân đội chưa đào tạo được; thợ bậc cao.
c) Quân nhân chuyên nghiệp đang đảm nhiệm chức vụ chỉ huy, quản lý ở các nhà máy,
doanh nghiệp quốc phòng.
d) Quân nhân chuyên nghiệp không thuộc đối tượng quy định tại điểm a, điểm b,
điểm c khoản này do Bộ trưởng Bộ Quốc phòng quyết định.
2. Điều kiện:
Quân nhân chuyên nghiệp thuộc đối tượng quy định tại khoản 1 Điều này được kéo
dài tuổi phục vụ tại ngũ khi có đủ các điều kiện sau:
a) Đơn vị có biên chế và nhu cầu sử dụng;
b) Hết hạn tuổi phục vụ tại ngũ cao nhất theo cấp bậc quân hàm quy định tại khoản
2 Điều 17 Luật Quân nhân chuyên nghiệp, công nhân và viên chức quốc phòng; chưa
có người thay thế; tự nguyện tiếp tục phục vụ tại ngũ;
c) Có đủ phẩm chất chính trị, đạo đức, sức khỏe để hoàn thành nhiệm vụ được giao;
d) Có trình độ chuyên môn kỹ thuật, nghiệp vụ giỏi; tay nghề cao; chất lượng,
hiệu quả công tác tốt.'
- 'Thi hành quyết định cưỡng chế
1. Người ra quyết định cưỡng chế có trách nhiệm gửi ngay quyết định cưỡng chế
cho các cá nhân, tổ chức liên quan và tổ chức thực hiện việc cưỡng chế thi hành
quyết định xử phạt của mình và của cấp dưới.
..."'
- 'Trình tự, thủ tục đăng ký tài khoản định danh điện tử đối với công dân Việt Nam
1. Đăng ký tài khoản định danh điện tử mức độ 1 qua ứng dụng VNelD đối với công
dân đã có thẻ Căn cước công dân gắn chíp điện tử
a) Công dân sử dụng thiết bị di động tải và cài đặt ứng dụng VNelD.
b) Công dân sử dụng ứng dụng VNelD để nhập thông tin về số định danh cá nhân và
số điện thoại hoặc địa chỉ thư điện tử; cung cấp các thông tin theo hướng dẫn
trên ứng dụng VNelD; thu nhận ảnh chân dung bằng thiết bị di động và gửi yêu cầu
đề nghị cấp tài khoản định danh điện tử tới cơ quan quản lý định danh và xác thực
điện tử qua ứng dụng VNelD.
c) Cơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng
dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.
2. Đăng ký tài khoản định danh điện tử mức độ 2
a) Đối với công dân đã được cấp thẻ Căn cước công dân gắn chíp điện tử:
Công dân đến Công an xã, phường, thị trấn hoặc nơi làm thủ tục cấp thẻ Căn cước
công dân để làm thủ tục cấp tài khoản định danh điện tử. Công dân xuất trình thẻ
Căn cước công dân gắn chíp điện tử, cung cấp thông tin về số điện thoại hoặc địa
chỉ thư điện tử và đề nghị bổ sung thông tin được tích hợp vào tài khoản định
danh điện tử.
Cán bộ tiếp nhận nhập thông tin công dân cung cấp vào hệ thống định danh và xác
thực điện tử; chụp ảnh chân dung, thu nhận vân tay của công dân đến làm thủ tục
để xác thực với Cơ sở dữ liệu căn cước công dân và khẳng định sự đồng ý đăng ký
tạo lập tài khoản định danh điện tử.
Cơ quan quản lý định danh điện tử thông báo kết quả đăng ký tài khoản qua ứng
dụng VNelD hoặc tin nhắn SMS hoặc địa chỉ thư điện tử.
b) Cơ quan Công an tiến hành cấp tài khoản định danh điện tử mức độ 2 cùng với
cấp thẻ Căn cước công dân với trường hợp công dân chưa được cấp Căn cước công
dân gắn chíp điện tử.'
- source_sentence: Mức hưởng chế độ thai sản đối với lao động nam là người nước ngoài
được pháp luật quy định như thế nào?
sentences:
- '"Điều 21. Thông báo kết quả và xác nhận nhập học
1. Cơ sở đào tạo gửi giấy báo trúng tuyển cho những thí sinh trúng tuyển, trong
đó ghi rõ những thủ tục cần thiết đối với thí sinh khi nhập học và phương thức
nhập học của thí sinh.
2. Thí sinh xác nhận nhập học bằng hình thức trực tuyến trên hệ thống, trước khi
nhập học tại cơ sở đào tạo.
3. Đối với những thí sinh không xác nhận nhập học trong thời hạn quy định:
a) Nếu không có lý do chính đáng thì coi như thí sinh từ chối nhập học và cơ sở
đào tạo có quyền không tiếp nhận;
b) Nếu do ốm đau, tai nạn, có giấy xác nhận của bệnh viện quận, huyện trở lên
hoặc do thiên tai có xác nhận của UBND quận, huyện trở lên, cơ sở đào tạo xem
xét quyết định tiếp nhận thí sinh vào học hoặc bảo lưu kết quả tuyển sinh để thí
sinh vào học sau;
c) Nếu do sai sót, nhầm lẫn của cán bộ thực hiện công tác tuyển sinh hoặc cá nhân
thí sinh gây ra, cơ sở đào tạo chủ động phối hợp với các cá nhân, tổ chức liên
quan xem xét các minh chứng và quyết định việc tiếp nhận thí sinh vào học hoặc
bảo lưu kết quả tuyển sinh để thí sinh vào học sau.
4. Thí sinh đã xác nhận nhập học tại một cơ sở đào tạo không được tham gia xét
tuyển ở nơi khác hoặc ở các đợt xét tuyển bổ sung, trừ trường hợp được cơ sở đào
tạo cho phép."'
- 'Tổ chức, nhiệm vụ, quyền hạn của Ban Chỉ huy
...
2. Nhiệm vụ, quyền hạn của Ban Chỉ huy:
a) Chỉ đạo xây dựng, ban hành quy định về công tác bảo đảm an toàn PCCC và CNCH
tại Trụ sở cơ quan Bộ Tư pháp.
b) Hướng dẫn, phối hợp với các đơn vị thuộc Bộ và chỉ đạo Đội PCCC và CNCH cơ
sở tổ chức tuyên truyền, bồi dưỡng nghiệp vụ PCCC và CNCH.
c) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp xây dựng, trình
cấp có thẩm quyền phê duyệt và tổ chức thực tập phương án PCCC, phương án CNCH.
d) Chỉ đạo Đội PCCC và CNCH cơ sở tại Trụ sở cơ quan Bộ Tư pháp quản lý các trang
thiết bị PCCC và CNCH.
đ) Chỉ đạo chữa cháy, CNCH khi xảy ra cháy, sự cố, tai nạn tại Trụ sở cơ quan
Bộ Tư pháp.
e) Chỉ đạo việc tổ chức lập và lưu giữ hồ sơ quản lý, theo dõi hoạt động PCCC,
CNCH tại Trụ sở cơ quan Bộ Tư pháp.
g) Chỉ đạo việc sơ kết, tổng kết các hoạt động về PCCC và CNCH của cơ quan; kiểm
tra, đôn đốc việc chấp hành các quy định về PCCC và CNCH.
h) Đề xuất việc khen thưởng, kỷ luật các tập thể, cá nhân trong việc thực hiện
công tác PCCC, CNCH.
i) Chỉ đạo Đội PCCC và CNCH cơ sở dự trù kinh phí cho các hoạt động PCCC và CNCH
tại Trụ sở cơ quan Bộ Tư pháp.
k) Thực hiện các nhiệm vụ khác do Bộ trưởng giao và theo quy định của pháp luật.'
- 'Mức hưởng chế độ thai sản
...
b) Mức hưởng một ngày đối với trường hợp quy định tại Điều 32 và khoản 2 Điều
34 của Luật này được tính bằng mức hưởng chế độ thai sản theo tháng chia cho 24
ngày.'
- source_sentence: Doanh nghiệp được áp dụng chế độ ưu tiên không cung cấp báo cáo
kiểm toán đúng thời hạn bị phạt bao nhiêu tiền?
sentences:
- 'Thay đổi Thẩm phán, Hội thẩm
1. Thẩm phán, Hội thẩm phải từ chối tham gia xét xử hoặc bị thay đổi khi thuộc
một trong các trường hợp:
a) Trường hợp quy định tại Điều 49 của Bộ luật này;
b) Họ cùng trong một Hội đồng xét xử và là người thân thích với nhau;
c) Đã tham gia xét xử sơ thẩm hoặc phúc thẩm hoặc tiến hành tố tụng vụ án đó với
tư cách là Điều tra viên, Cán bộ điều tra, Kiểm sát viên, Kiểm tra viên, Thẩm
tra viên, Thư ký Tòa án.
2. Việc thay đổi Thẩm phán, Hội thẩm trước khi mở phiên tòa do Chánh án hoặc Phó
Chánh án Tòa án được phân công giải quyết vụ án quyết định.
Thẩm phán bị thay đổi là Chánh án Tòa án thì do Chánh án Tòa án trên một cấp quyết
định.
Việc thay đổi Thẩm phán, Hội thẩm tại phiên tòa do Hội đồng xét xử quyết định
trước khi bắt đầu xét hỏi bằng cách biểu quyết tại phòng nghị án. Khi xem xét
thay đổi thành viên nào thì thành viên đó được trình bày ý kiến của mình, Hội
đồng quyết định theo đa số.
Trường hợp phải thay đổi Thẩm phán, Hội thẩm tại phiên tòa thì Hội đồng xét xử
ra quyết định hoãn phiên tòa.'
- '“Điều 21. Chấm dứt hưởng trợ cấp thất nghiệp
1. Các trường hợp người lao động đang hưởng trợ cấp thất nghiệp bị chấm dứt hưởng
trợ cấp thất nghiệp được quy định như sau:
e) Trong thời gian hưởng trợ cấp thất nghiệp, 03 tháng liên tục không thực hiện
thông báo hằng tháng về việc tìm kiếm việc làm với trung tâm dịch vụ việc làm
theo quy định
Ngày mà người lao động được xác định bị chấm dứt hưởng trợ cấp thất nghiệp là
ngày kết thúc của thời hạn thông báo tìm kiếm việc làm của tháng thứ 3 liên tục
mà người lao động không thực hiện thông báo hằng tháng về việc tìm kiếm việc làm."'
- 'Vi phạm quy định về thời hạn làm thủ tục hải quan, nộp hồ sơ thuế
...
2. Phạt tiền từ 1.000.000 đồng đến 2.000.000 đồng đối với hành vi không thực hiện
đúng thời hạn quy định thuộc một trong các trường hợp sau:
a) Cung cấp báo cáo kiểm toán, báo cáo tài chính của doanh nghiệp được áp dụng
chế độ ưu tiên;
b) Thông báo cho cơ quan hải quan quyết định xử lý vi phạm pháp luật về quản lý
thuế, kế toán đối với doanh nghiệp được áp dụng chế độ ưu tiên;
c) Báo cáo về lượng hàng hóa nhập khẩu phục vụ xây dựng nhà xưởng, hàng hóa gửi
kho bên ngoài của doanh nghiệp chế xuất;
d) Báo cáo về lượng hàng hóa trung chuyển đưa vào, đưa ra, còn lưu tại cảng;
đ) Báo cáo thống kê thông quan hàng bưu chính đưa vào Việt Nam để chuyển tiếp
đi quốc tế.
...'
- source_sentence: Tài chính của Hội Kiểm toán viên hành nghề Việt Nam được chi cho
những khoản nào?
sentences:
- 'Giải thể và xử lý tài chính khi giải thể
1. Khi xét thấy hoạt động của Hội không có hiệu quả, không mang lại lợi ích cho
Hội viên hoặc gây phiền hà, cản trở cho Hội viên thì BCH Hội quyết định triệu
tập Đại hội để bàn biện pháp củng cố tổ chức hoặc giải thể Hội. Nếu giải thể Hội
thì do Đại hội đại biểu hoặc Đại hội toàn quốc của Hội thông qua và đề nghị cơ
quan Nhà nước có thẩm quyền xem xét, quyết định.
2. Khi Hội bị giải thể, Ban Thường trực và Ban Kiểm tra của Hội phải tiến hành
kiểm kê tài sản, kiểm quỹ và báo cáo BCH Hội quyết định việc xử lý tài sản, tiền
tồn quỹ và tiến hành thủ tục giải thể theo quy định của pháp luật.'
- '"Điều 14. Miễn trừ đối với thỏa thuận hạn chế cạnh tranh bị cấm
1. Thỏa thuận hạn chế cạnh tranh quy định tại các khoản 1, 2, 3, 7, 8, 9, 10 và
11 Điều 11 bị cấm theo quy định tại Điều 12 của Luật này được miễn trừ có thời
hạn nếu có lợi cho người tiêu dùng và đáp ứng một trong các điều kiện sau đây:
a) Tác động thúc đẩy tiến bộ kỹ thuật, công nghệ, nâng cao chất lượng hàng hóa,
dịch vụ;
b) Tăng cường sức cạnh tranh của doanh nghiệp Việt Nam trên thị trường quốc tế;
c) Thúc đẩy việc áp dụng thống nhất tiêu chuẩn chất lượng, định mức kỹ thuật của
chủng loại sản phẩm;
d) Thống nhất các điều kiện thực hiện hợp đồng, giao hàng, thanh toán nhưng không
liên quan đến giá và các yếu tố của giá.
2. Thỏa thuận lao động, thỏa thuận hợp tác trong các ngành, lĩnh vực đặc thù được
thực hiện theo quy định của luật khác thì thực hiện theo quy định của luật đó".'
- '"Điều 2. Sửa đổi, bổ sung một số điều của Nghị định số 15/2019/NĐ-CP ngày 01
tháng 02 năm 2019 của Chính phủ quy định chi tiết một số điều và biện pháp thi
hành Luật Giáo dục nghề nghiệp
...
12. Sửa đổi, bổ sung Điều 24 như sau:
Điều 24. Thẩm quyền cấp giấy chứng nhận đăng ký hoạt động liên kết đào tạo với
nước ngoài
1. Tổng cục Giáo dục nghề nghiệp cấp giấy chứng nhận đăng ký hoạt động liên kết
đào tạo với nước ngoài đối với trường cao đẳng.
2. Sở Lao động - Thương binh và Xã hội nơi trường trung cấp, trung tâm giáo dục
nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và doanh nghiệp
tổ chức hoạt động liên kết đào tạo với nước ngoài cấp giấy chứng nhận đăng ký
hoạt động liên kết đào tạo với nước ngoài đối với trường trung cấp, trung tâm
giáo dục nghề nghiệp, trung tâm giáo dục nghề nghiệp - giáo dục thường xuyên và
doanh nghiệp."'
- source_sentence: NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?
sentences:
- 'Hồ sơ, thủ tục xác định trường hợp được bồi thường
[...]
3. Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp
lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải
thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc
thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường
hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ
sung.
4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt
hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn
thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.'
- 'Chuyển nhượng quyền thăm dò khoáng sản
1. Tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản phải có đủ điều
kiện để được cấp Giấy phép thăm dò khoáng sản theo quy định của Luật này.
2. Việc chuyển nhượng quyền thăm dò khoáng sản phải được cơ quan quản lý nhà nước
có thẩm quyền cấp Giấy phép thăm dò khoáng sản chấp thuận; trường hợp được chấp
thuận, tổ chức, cá nhân nhận chuyển nhượng quyền thăm dò khoáng sản được cấp Giấy
phép thăm dò khoáng sản mới.
3. Tổ chức, cá nhân chuyển nhượng quyền thăm dò khoáng sản đã thực hiện được ít
nhất 50% dự toán của đề án thăm dò khoáng sản.
4. Chính phủ quy định chi tiết việc chuyển nhượng quyền thăm dò khoáng sản.'
- '"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:
...
6. Sửa đổi, bổ sung Điều 12 như sau:
“Điều 12. Đối tượng tham gia bảo hiểm y tế
1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:
a) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp
đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản
lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung
là người lao động);
b) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của
pháp luật.=
...
4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:
a) Người thuộc hộ gia đình cận nghèo;
b) Học sinh, sinh viên.
5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình,
trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.
6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các
khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng
do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3
Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng
bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh
phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh
toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản
3 Điều này.”'
---
# SentenceTransformer based on VoVanPhuc/sup-SimCSE-VietNamese-phobert-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [VoVanPhuc/sup-SimCSE-VietNamese-phobert-base](https://huggingface.co/VoVanPhuc/sup-SimCSE-VietNamese-phobert-base) on the csv dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [VoVanPhuc/sup-SimCSE-VietNamese-phobert-base](https://huggingface.co/VoVanPhuc/sup-SimCSE-VietNamese-phobert-base) <!-- at revision 608779b86741a8acd8c8d38132974ff04086b138 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Cloyne/sup-SimCSE-VietNamese-phobert-base")
# Run inference
sentences = [
'NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?',
'"Sửa đổi, bổ sung một số điều của Luật bảo hiểm y tế:\n...\n6. Sửa đổi, bổ sung Điều 12 như sau:\n“Điều 12. Đối tượng tham gia bảo hiểm y tế\n1. Nhóm do người lao động và người sử dụng lao động đóng, bao gồm:\na) Người lao động làm việc theo hợp đồng lao động không xác định thời hạn, hợp đồng lao động có thời hạn từ đủ 3 tháng trở lên; người lao động là người quản lý doanh nghiệp hưởng tiền lương; cán bộ, công chức, viên chức (sau đây gọi chung là người lao động);\nb) Người hoạt động không chuyên trách ở xã, phường, thị trấn theo quy định của pháp luật.=\n...\n4. Nhóm được ngân sách nhà nước hỗ trợ mức đóng, bao gồm:\na) Người thuộc hộ gia đình cận nghèo;\nb) Học sinh, sinh viên.\n5. Nhóm tham gia bảo hiểm y tế theo hộ gia đình gồm những người thuộc hộ gia đình, trừ đối tượng quy định tại các khoản 1, 2, 3 và 4 Điều này.\n6. Chính phủ quy định các đối tượng khác ngoài các đối tượng quy định tại các khoản 3, 4 và 5 Điều này; quy định việc cấp thẻ bảo hiểm y tế đối với đối tượng do Bộ Quốc phòng, Bộ Công an quản lý và đối tượng quy định tại điểm 1 khoản 3 Điều này; quy định lộ trình thực hiện bảo hiểm y tế, phạm vi quyền lợi, mức hưởng bảo hiểm y tế, khám bệnh, chữa bệnh bảo hiểm y tế, quản lý, sử dụng phần kinh phí dành cho khám bệnh, chữa bệnh bảo hiểm y tế, giám định bảo hiểm y tế, thanh toán, quyết toán bảo hiểm y tế đối với các đối tượng quy định tại điểm a khoản 3 Điều này.”',
'Hồ sơ, thủ tục xác định trường hợp được bồi thường\n[...]\n3. Trong thời hạn 05 ngày làm việc, kể từ ngày nhận được đơn và các giấy tờ hợp lệ, nếu xác định yêu cầu thuộc trách nhiệm giải quyết của mình thì Sở Y tế phải thụ lý và thông báo bằng văn bản về việc thụ lý đơn cho người bị thiệt hại hoặc thân nhân của người bị thiệt hại (sau đây gọi tắt là người bị thiệt hại). Trường hợp hồ sơ không đầy đủ thì Sở Y tế có văn bản hướng dẫn người bị thiệt hại bổ sung.\n4. Trong thời hạn 15 ngày, kể từ ngày nhận được đơn yêu cầu của người bị thiệt hại, Sở Y tế phải hoàn thành việc xác định nguyên nhân gây tai biến, mức độ tổn thương và thông báo bằng văn bản cho người yêu cầu đồng thời báo cáo Bộ Y tế.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
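Because the model was fine-tuned on legal question–passage pairs, a common use is ranking candidate passages for a query. A hedged sketch (the passages are truncated placeholders, not full legal texts):

```python
query = "NLĐ ký nhiều hợp đồng lao động thì đóng BHYT như thế nào?"
passages = [
    "Điều 12. Đối tượng tham gia bảo hiểm y tế ...",            # truncated placeholder
    "Hồ sơ, thủ tục xác định trường hợp được bồi thường ...",   # truncated placeholder
]
query_embedding = model.encode(query)
passage_embeddings = model.encode(passages)
scores = model.similarity(query_embedding, passage_embeddings)  # shape [1, len(passages)]
best = scores.argmax().item()
print(passages[best])
```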
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 120,210 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 25.08 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 206.98 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>Trong phạm vi điều chỉnh của văn bản quy phạm pháp luật:<br>1. Xác định nội dung liên quan đến vấn đề bình đẳng giới hoặc vấn đề bất bình đẳng giới, phân biệt đối xử về giới.<br>2. Quy định các biện pháp cần thiết để thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới; dự báo tác động của các quy định đó đối với nam và nữ sau khi được ban hành.<br>3. Xác định nguồn nhân lực, tài chính cần thiết để triển khai các biện pháp thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới.</code> |
| <code>Điều kiện để giáo viên trong cơ sở giáo dục mầm non, tiểu học ngoài công lập bị ảnh hưởng bởi Covid-19 được hưởng chính sách hỗ trợ là gì?</code> | <code>Điều kiện được hưởng<br>Cán bộ quản lý, giáo viên, nhân viên được hưởng chính sách khi bảo đảm các điều kiện sau:<br>1. Là người đang làm việc tại cơ sở giáo dục ngoài công lập trước khi cơ sở phải tạm dừng hoạt động theo yêu cầu của cơ quan nhà nước có thẩm quyền để phòng, chống dịch COVID-19 tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>2. Nghỉ việc không hưởng lương từ 01 tháng trở lên tính từ ngày 01 tháng 5 năm 2021 đến hết ngày 31 tháng 12 năm 2021.<br>3. Chưa được hưởng chính sách hỗ trợ đối với người lao động tạm hoãn hợp đồng lao động, nghỉ việc không hưởng lương theo quy định tại khoản 4, khoản 5, khoản 6 Mục II Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19, Nghị quyết số 126/NQ-CP ngày 08 tháng 10 năm 2021 của Chính phủ sửa đổi, bổ sung Nghị quyết số 68/NQ-CP ngày 01 tháng 7 năm 2021 của Chính phủ về một số chính sách hỗ trợ người lao động và người sử dụng lao động gặp khó khăn do đại dịch COVID-19 (sau đây gọi tắt là Nghị quyết số 68/NQ-CP) do không tham gia Bảo hiểm xã hội bắt buộc.<br>4. Có xác nhận làm việc tại cơ sở giáo dục ngoài công lập ít nhất hết năm học 2021 - 2022 theo kế hoạch năm học của địa phương, bao gồm cơ sở giáo dục ngoài công lập đã làm việc trước đây hoặc cơ sở giáo dục ngoài công lập khác trong trường hợp cơ sở giáo dục ngoài công lập trước đây làm việc không hoạt động trở lại.</code> |
| <code>Nguyên tắc áp dụng phụ cấp ưu đãi nghề y tế thế nào?</code> | <code>Nguyên tắc áp dụng<br>1. Trường hợp công chức, viên chức chuyên môn y tế thuộc đối tượng được hưởng các mức phụ cấp ưu đãi theo nghề khác nhau thì được hưởng một mức phụ cấp ưu đãi theo nghề cao nhất.<br>2. Công chức, viên chức đã hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch số 06/2010/TTLT-BYT-BNV-BTC ngày 22/3/2010 của Bộ Y tế, Bộ Nội vụ, Bộ Tài chính hướng dẫn thực hiện Nghị định số 64/2009/NĐ-CP ngày 30/7/2009 của Chính phủ về chính sách đối với cán bộ, viên chức y tế công tác ở vùng có điều kiện kinh tế - xã hội đặc biệt khó khăn thì không hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch này.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Evaluation Dataset
#### train
* Dataset: train
* Size: 13,357 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 24.61 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 202.71 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| anchor | positive |
|:-------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Toà án cấp nào có thẩm quyền giải quyết việc đòi tài sản đã cho người khác vay theo hợp đồng cho vay?</code> | <code>"Điều 35. Thẩm quyền của Tòa án nhân dân cấp huyện<br>1. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết theo thủ tục sơ thẩm những tranh chấp sau đây:<br>a) Tranh chấp về dân sự, hôn nhân và gia đình quy định tại Điều 26 và Điều 28 của Bộ luật này, trừ tranh chấp quy định tại khoản 7 Điều 26 của Bộ luật này;<br>b) Tranh chấp về kinh doanh, thương mại quy định tại khoản 1 Điều 30 của Bộ luật này;<br>c) Tranh chấp về lao động quy định tại Điều 32 của Bộ luật này.<br>2. Tòa án nhân dân cấp huyện có thẩm quyền giải quyết những yêu cầu sau đây:<br>a) Yêu cầu về dân sự quy định tại các khoản 1, 2, 3, 4, 6, 7, 8, 9 và 10 Điều 27 của Bộ luật này;<br>b) Yêu cầu về hôn nhân và gia đình quy định tại các khoản 1, 2, 3, 4, 5, 6, 7, 8, 10 và 11 Điều 29 của Bộ luật này;<br>c) Yêu cầu về kinh doanh, thương mại quy định tại khoản 1 và khoản 6 Điều 31 của Bộ luật này;<br>d) Yêu cầu về lao động quy định tại khoản 1 và khoản 5 Điều 33 của Bộ luật này.<br>3. Những tranh chấp, yêu cầu quy định tại khoản 1 và khoản 2 Điều này mà có đương sự hoặc tài sản ở nước ngoài hoặc cần phải ủy thác tư pháp cho cơ quan đại diện nước Cộng hòa xã hội chủ nghĩa Việt Nam ở nước ngoài, cho Tòa án, cơ quan có thẩm quyền của nước ngoài không thuộc thẩm quyền giải quyết của Tòa án nhân dân cấp huyện, trừ trường hợp quy định tại khoản 4 Điều này.<br>4. Tòa án nhân dân cấp huyện nơi cư trú của công dân Việt Nam hủy việc kết hôn trái pháp luật, giải quyết việc ly hôn, các tranh chấp về quyền và nghĩa vụ của vợ chồng, cha mẹ và con, về nhận cha, mẹ, con, nuôi con nuôi và giám hộ giữa công dân Việt Nam cư trú ở khu vực biên giới với công dân của nước láng giềng cùng cư trú ở khu vực biên giới với Việt Nam theo quy định của Bộ luật này và các quy định khác của pháp luật Việt Nam."</code> |
| <code>Những phiếu bầu nào được xem là không hợp lệ?</code> | <code>Phiếu bầu không hợp lệ<br>1. Những phiếu bầu sau đây là phiếu bầu không hợp lệ:<br>a) Phiếu không theo mẫu quy định do Tổ bầu cử phát ra;<br>b) Phiếu không có dấu của Tổ bầu cử;<br>c) Phiếu để số người được bầu nhiều hơn số lượng đại biểu được bầu đã ấn định cho đơn vị bầu cử;<br>d) Phiếu gạch xóa hết tên những người ứng cử;<br>đ) Phiếu ghi thêm tên người ngoài danh sách những người ứng cử hoặc phiếu có ghi thêm nội dung khác.<br>2. Trường hợp có phiếu bầu được cho là không hợp lệ thì Tổ trường Tổ bầu cử đưa ra để toàn Tổ xem xét, quyết định. Tổ bầu cử không được gạch xóa hoặc sửa các tên ghi trên phiếu bầu.</code> |
| <code>Đề nghị tạm đình chỉ chấp hành quyết định áp dụng biện pháp đưa vào trường giáo dưỡng cho học sinh cần đảm bảo nguyên tắc gì?</code> | <code>Nguyên tắc xét duyệt, đề nghị giảm thời hạn, tạm đình chỉ chấp hành quyết định, miễn chấp hành phần thời gian còn lại cho học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc<br>1. Tuân thủ quy định của pháp luật về thi hành biện pháp xử lý hành chính đưa vào trường giáo dưỡng, cơ sở giáo dục bắt buộc, quy định tại Thông tư này và quy định của pháp luật có liên quan.<br>2. Bảo đảm khách quan, công khai, minh bạch, đúng trình tự, thủ tục, thẩm quyền; tôn trọng và bảo vệ quyền, lợi ích hợp pháp của học sinh trường giáo dưỡng, trại viên cơ sở giáo dục bắt buộc.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 4
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
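A hedged sketch of how a comparable fine-tuning run could be set up with the Sentence Transformers trainer, using the non-default hyperparameters listed above (the CSV file names and output directory are placeholders; the files are expected to contain `anchor` and `positive` columns):

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("VoVanPhuc/sup-SimCSE-VietNamese-phobert-base")
data = load_dataset("csv", data_files={"train": "train.csv", "eval": "eval.csv"})  # placeholder paths

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",  # placeholder
    num_train_epochs=4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["eval"],
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```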
### Training Logs
| Epoch | Step | Training Loss | train loss |
|:------:|:-----:|:-------------:|:----------:|
| 0.0665 | 500 | 0.2809 | 0.2215 |
| 0.1331 | 1000 | 0.1307 | 0.1547 |
| 0.1996 | 1500 | 0.0978 | 0.1366 |
| 0.2662 | 2000 | 0.1054 | 0.1221 |
| 0.3327 | 2500 | 0.0824 | 0.1215 |
| 0.3993 | 3000 | 0.0776 | 0.1223 |
| 0.4658 | 3500 | 0.0797 | 0.1161 |
| 0.5323 | 4000 | 0.0774 | 0.1070 |
| 0.5989 | 4500 | 0.0661 | 0.1007 |
| 0.6654 | 5000 | 0.059 | 0.0945 |
| 0.7320 | 5500 | 0.0674 | 0.0889 |
| 0.7985 | 6000 | 0.0495 | 0.0783 |
| 0.8651 | 6500 | 0.0587 | 0.0822 |
| 0.9316 | 7000 | 0.0585 | 0.0868 |
| 0.9981 | 7500 | 0.0482 | 0.0733 |
| 1.0647 | 8000 | 0.0459 | 0.0786 |
| 1.1312 | 8500 | 0.0487 | 0.0691 |
| 1.1978 | 9000 | 0.0335 | 0.0719 |
| 1.2643 | 9500 | 0.0365 | 0.0711 |
| 1.3308 | 10000 | 0.0279 | 0.0668 |
| 1.3974 | 10500 | 0.0235 | 0.0675 |
| 1.4639 | 11000 | 0.0206 | 0.0599 |
| 1.5305 | 11500 | 0.0175 | 0.0653 |
| 1.5970 | 12000 | 0.0144 | 0.0664 |
| 1.6636 | 12500 | 0.0167 | 0.0598 |
| 1.7301 | 13000 | 0.0173 | 0.0583 |
| 1.7966 | 13500 | 0.0127 | 0.0540 |
| 1.8632 | 14000 | 0.0164 | 0.0595 |
| 1.9297 | 14500 | 0.014 | 0.0552 |
| 1.9963 | 15000 | 0.0114 | 0.0535 |
| 2.0628 | 15500 | 0.0097 | 0.0552 |
| 2.1294 | 16000 | 0.0111 | 0.0549 |
| 2.1959 | 16500 | 0.0076 | 0.0544 |
| 2.2624 | 17000 | 0.009 | 0.0589 |
| 2.3290 | 17500 | 0.0084 | 0.0543 |
| 2.3955 | 18000 | 0.0049 | 0.0520 |
| 2.4621 | 18500 | 0.0068 | 0.0505 |
| 2.5286 | 19000 | 0.0037 | 0.0489 |
| 2.5952 | 19500 | 0.0031 | 0.0461 |
| 2.6617 | 20000 | 0.0041 | 0.0496 |
| 2.7282 | 20500 | 0.0051 | 0.0464 |
| 2.7948 | 21000 | 0.0029 | 0.0475 |
| 2.8613 | 21500 | 0.0032 | 0.0458 |
| 2.9279 | 22000 | 0.003 | 0.0449 |
| 2.9944 | 22500 | 0.0035 | 0.0458 |
| 3.0610 | 23000 | 0.0033 | 0.0443 |
| 3.1275 | 23500 | 0.0032 | 0.0416 |
| 3.1940 | 24000 | 0.002 | 0.0449 |
| 3.2606 | 24500 | 0.0022 | 0.0447 |
| 3.3271 | 25000 | 0.0017 | 0.0430 |
| 3.3937 | 25500 | 0.002 | 0.0418 |
| 3.4602 | 26000 | 0.0019 | 0.0415 |
| 3.5268 | 26500 | 0.0008 | 0.0406 |
| 3.5933 | 27000 | 0.0007 | 0.0414 |
| 3.6598 | 27500 | 0.0008 | 0.0416 |
| 3.7264 | 28000 | 0.0011 | 0.0418 |
| 3.7929 | 28500 | 0.0006 | 0.0416 |
| 3.8595 | 29000 | 0.0005 | 0.0417 |
| 3.9260 | 29500 | 0.0007 | 0.0413 |
| 3.9925 | 30000 | 0.0008 | 0.0412 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"CHIA"
] |
mini1013/master_cate_el11 | mini1013 | text-classification | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:mini1013/master_domain",
"base_model:finetune:mini1013/master_domain",
"model-index",
"region:us"
] | 2024-11-09T08:25:47 | 2024-11-09T08:26:12 | 681 | 0 | ---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 필립스 퍼펙트케어 파워라이프 스팀 다리미 GC3929/68 실크부터 청바지까지 온도 조절 NO! 타지 않는 다림질 웰컴마켓2
- text: 보랄 UV 침구 청소기 침대 소파 진공 BR-V603BC 홈니즈 보랄 UV 침구 진공청소기 더웰
- text: NEW 필립스160 다이나글라이드 열판 건식 전기다리미 제이엘코
- text: DG-TOK 넥밴드 타입 디지털 생활무전기 나노Q3/ nano-Q3 블랙 컴피시스템 (comfy system)
- text: ALLNEW29000 파워메이드_그레이(GRAY) 나성민
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.7946213453148402
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 18 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1 | <ul><li>'보만 대용량 1단 LED터치 핸디 스팀다리미 DB8640G 바이 마르코 (by MARCO)'</li><li>'[구매확정시 N포인트 5% 적립]필립스 핸디형 스팀다리미 7000시리즈 STH7030/18 베르수니코리아 주식회사'</li><li>'테팔 클래시컬 플러스 논슬립 초경량 건식다리미 FS3120K0 주식회사 코스니크'</li></ul> |
| 4 | <ul><li>'보풀제거기 세탁소용 FX-200 유선 아이프리 옷 제거 보푸라기 이불 FX-200 교체용 6중칼날 플라이비(FLY BEE)'</li><li>'[IFREE] 아이프리 6중날 보풀제거기 FX-814 주식회사 더루츠'</li><li>'NEW 아이프리 세탁소 보풀제거기 가디건 니트 옷 FX-714 (주)클릭나라'</li></ul> |
| 16 | <ul><li>'번개표 신형 넉다운 KKD-2200 세트 + 램프1개 추가 (총 램프 2개) KKD-2200 최신형+램프 1개 세트 (주)강남대흥'</li><li>'CAS 카스 360도 절루가 야생동물퇴치기 고라니 멧돼지 두더지 뱀 조류 퇴치기 CLAR-100 (주)지오쇼핑'</li><li>'[스마토] 벅킬러 CF-BK06(블랙) 캠핑/벌레퇴치기/해충/모기 포에버툴'</li></ul> |
| 14 | <ul><li>'Coms 전화선 꼬임방지 White/NT874/전화선정리 [KF] 주식회사 케이에프컴퍼니'</li><li>'전화선 꼬임방지 White/NT874/전화선정리 주식회사 지엔비커뮤니케이션즈'</li><li>'지엔텔 GS-872 2라인(국선) 사무용전화기/단축메모리(12개)/재다이얼/온후크/벨음 리버앤오빌 주식회사'</li></ul> |
| 11 | <ul><li>'지니큐 다용도 UV-C 살균 소독기 무선 자외선 살균기 스마트폰 마스크 UV-500ST 블랙 주식회사 한국전산오피스'</li><li>'텔로 UV 살균기 미니 자외선 소독기 휴대용 책 멸균기 UVCLED 변기 멸균 TUV10 (주)모닝아트'</li><li>"휴대용 마스크 살균소독기 유비세이프 C'Shell MLS-100 그레이 주식회사 유비세이프"</li></ul> |
| 3 | <ul><li>'[잘텍] JX-220 ,JX220 생활무전기 1대 풀세트 블랙 플림스텔레콤주식회사'</li><li>'민영 MYT-0033 MYT0033 고성능 생활무전기 정품이어마이크 3개 주식회사 오토플렉스'</li><li>'PD508/PD-508/무전기 용 경호용 이어마이크/리시버/국산/JM8000T 클럽데님'</li></ul> |
| 13 | <ul><li>'바이마르 바디 건조기 드라이어 VMK-21A30D030 전신 에어 샤워 냉온풍 빠르고 깔끔한 건조 터치 스마트센서 드라이기 자동 몸말리는기계 욕실 따뜻한 시원한 바람 임산부 집들이 바이마르 바디 건조기 VMK-21A30D030 팬텀파트너스'</li><li>'제크롤 바디 스킨 케어 에어샤워 전신건조기 JK-1WBD101 바디드라이어 (주)세중통상'</li><li>'대림도비도스 바디건조기 DLB-700W 국내생산 바디 드라이어 DLB-700W (주) 더수바스'</li></ul> |
| 15 | <ul><li>'다이슨 국내 정품 옴니 글라이드 컴플리트 (퍼플/니켈) 정품스텐딩거치대 포함 이루 이루 스토어'</li><li>'로보락 다이애드 브러쉬 거치대 세트 팅크웨어모바일 주식회사'</li><li>'JCP 에브리봇 EDGE 주식회사 제이씨엠컴퍼니'</li></ul> |
| 6 | <ul><li>'한국타올기산업 자동 손소독기계 HTM-620 자동 1개 (주)서브원'</li><li>'티에스 자바코리아 자동 손소독기 THS2500T 전기식 건전지식 겸용 아름상사몰'</li><li>'HDTOP 비접촉 휴대용 자동 디스펜서 스프레이 손소독기 HT-A600 YGPJ-NJ0042 윤 미디어'</li></ul> |
| 2 | <ul><li>'베스틴 지문방지 푸시풀 도어락 IDL-300 블랙헤어라인 2WAY 현관 아파트 도어락 블랙 유광 (IDL-300SWNK) 키넷'</li><li>'셀프시공 삼성 IOT 푸시풀 디지털도어락 SHP-DR700+보강판 현관문 현관문도어락 하우스플러스(주)'</li><li>'무료설치 에버넷 샷시문도어락 상가번호키 패션문도어록 가마찌도어샤시 EN250-N A지역무료설치 진흥피닉스(주)'</li></ul> |
| 9 | <ul><li>'[하이마트] LG 스타일러 오브제컬렉션 S3BOF [3벌/미스트베이지] 롯데하이마트(주)'</li><li>'엘지 트롬 스타일러 린넨 블랙 S3BF 의류관리 코스트코 갱이점빵'</li><li>'[삼성] 에어드레서 상의 5~9 벌 + 하의 1 벌,코타차콜 DF24CG5100HR 배송은 주문 후 2~4주이상 소요 주식회사 위링크'</li></ul> |
| 5 | <ul><li>'신일 스텐 탈수기 SDM-T77H 가정용 수영장 캠핑장 펜션 콜드림'</li><li>'삼성전자 아가사랑 WA30T2101EE 동의 선우에이치앤비(SUNWOO H&B)'</li><li>'한일전기 W-110 미니 짤순이 다용도 음식물 야채 오이지 두부 탈수기 1kg 탈수기 짤순이(신형) (주)씨앤제이글로벌'</li></ul> |
| 12 | <ul><li>'싱거 8280(단품+수강증+보증서1년)+ 프리모션노루발+노루발3종+말아박이 랍바세트 태양에스엠주식회사'</li><li>'부라더미싱 이노비스A16 (Innovis-A16) NV-A16 부라더미싱'</li><li>'부라더미싱 이노비스 A80, innovis a80, 브라더미싱 팀에이에이 Team AA'</li></ul> |
| 7 | <ul><li>'LED스탠드 브로드윙X (LSP-9700) 베이스 화이트 멜라토닌 학습용 학생 스탠드 MinSellAmount (주)프리즘'</li><li>'듀플렉스 DP-910LS 시력보호 면조명 LED 스탠드 책상 학생용 코지인터내셔널'</li><li>'LED스탠드 책상 학생 독서등 학습용 스텐드 NXL-3000 /스마일배송 오트빌'</li></ul> |
| 0 | <ul><li>'스마트소닉 1000 음파칫솔 단품 [화이트] + 칫솔모 1팩 블루 에스에이치 인터내셔날'</li><li>'프리쉐 PA-TS3000 골프_위탁 업체로 공급사나 배달업체에 개인정보 동의 도라에몽상회'</li><li>'알로코리아 덴픽션 바람건조 고온히팅 UV-C 무선 휴대용 칫솔살균기 ATS1G 단품 1+1 세트_크림+블루 알로이비즈 주식회사'</li></ul> |
| 17 | <ul><li>'[아메리칸스탠다드] 핸드 드라이어 삽입형 FG8901(고속형), FG8984(일반형) 화장실 상업용 편의품 FG8901(고속형) 대일도기사 주식회사'</li><li>'대림 도비도스 DX-1000,DX1000 핸드드라이어 (아이보리) 준트레이딩(JUN Trading)'</li><li>'TS자바 핸드드라이어 TH350ST 스테인레스 핸드드라이기 TSJAVA 화장실 강풍 프럼바디'</li></ul> |
| 8 | <ul><li>'쿠쿠 버블클렌저 연수기 CWS-AO201W 주식회사 제이홀딩스'</li><li>'프렐 연수기 마이크로버블 클렌저 녹물 염소 제거 버블수기 무광 화이트 그레이 투톤색상 (주)로보터스'</li><li>'[렌탈] [셀프형] 현대큐밍 샤워기필터 연수기 더클린 워터케어 (HQS20100W0) 실버 (주)현대렌탈케어'</li></ul> |
| 10 | <ul><li>'[특별 ] 세라젬 밸런스 알칼리 이온수 생성기 의료기기 (주) 세라젬'</li><li>'뉴랜드올네이처 알칼리이온수기 셀터치프리미엄 뉴랜드올네이처비전'</li><li>'뉴랜드올네이처 알칼리이온수기 셀터치필터 복합중공사(UF Membrane) 뉴랜드올네이처비전'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.7946 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_el11")
# Run inference
preds = model("ALLNEW29000 파워메이드_그레이(GRAY) 나성민")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 3 | 9.3700 | 32 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 50 |
| 1 | 50 |
| 2 | 50 |
| 3 | 50 |
| 4 | 50 |
| 5 | 50 |
| 6 | 50 |
| 7 | 50 |
| 8 | 5 |
| 9 | 50 |
| 10 | 3 |
| 11 | 50 |
| 12 | 50 |
| 13 | 50 |
| 14 | 50 |
| 15 | 50 |
| 16 | 50 |
| 17 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
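A hedged sketch of how a comparable SetFit run could be set up with a subset of the hyperparameters listed above (the toy examples and label ids are illustrative, not the actual training data):

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Toy training data: two of the 18 label ids used by this card (illustrative only).
train_dataset = Dataset.from_dict({
    "text": ["테팔 클래시컬 플러스 초경량 건식다리미 FS3120K0", "아이프리 6중날 보풀제거기 FX-814"],
    "label": [1, 4],
})

model = SetFitModel.from_pretrained("mini1013/master_domain")  # Sentence Transformer body + LogisticRegression head
args = TrainingArguments(batch_size=512, num_epochs=20, body_learning_rate=2e-5)  # see the full list above
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()  # 1) contrastive fine-tuning of the body, then 2) fitting the classification head
```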
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0079 | 1 | 0.4968 | - |
| 0.3937 | 50 | 0.3206 | - |
| 0.7874 | 100 | 0.1406 | - |
| 1.1811 | 150 | 0.0735 | - |
| 1.5748 | 200 | 0.0518 | - |
| 1.9685 | 250 | 0.0242 | - |
| 2.3622 | 300 | 0.006 | - |
| 2.7559 | 350 | 0.0102 | - |
| 3.1496 | 400 | 0.0088 | - |
| 3.5433 | 450 | 0.0082 | - |
| 3.9370 | 500 | 0.0062 | - |
| 4.3307 | 550 | 0.012 | - |
| 4.7244 | 600 | 0.0021 | - |
| 5.1181 | 650 | 0.002 | - |
| 5.5118 | 700 | 0.0049 | - |
| 5.9055 | 750 | 0.0043 | - |
| 6.2992 | 800 | 0.006 | - |
| 6.6929 | 850 | 0.0002 | - |
| 7.0866 | 900 | 0.0004 | - |
| 7.4803 | 950 | 0.0002 | - |
| 7.8740 | 1000 | 0.0002 | - |
| 8.2677 | 1050 | 0.0002 | - |
| 8.6614 | 1100 | 0.0001 | - |
| 9.0551 | 1150 | 0.0001 | - |
| 9.4488 | 1200 | 0.0002 | - |
| 9.8425 | 1250 | 0.0002 | - |
| 10.2362 | 1300 | 0.0001 | - |
| 10.6299 | 1350 | 0.0001 | - |
| 11.0236 | 1400 | 0.0001 | - |
| 11.4173 | 1450 | 0.0001 | - |
| 11.8110 | 1500 | 0.0001 | - |
| 12.2047 | 1550 | 0.0001 | - |
| 12.5984 | 1600 | 0.0001 | - |
| 12.9921 | 1650 | 0.0001 | - |
| 13.3858 | 1700 | 0.0001 | - |
| 13.7795 | 1750 | 0.0001 | - |
| 14.1732 | 1800 | 0.0001 | - |
| 14.5669 | 1850 | 0.0001 | - |
| 14.9606 | 1900 | 0.0001 | - |
| 15.3543 | 1950 | 0.0001 | - |
| 15.7480 | 2000 | 0.0001 | - |
| 16.1417 | 2050 | 0.0001 | - |
| 16.5354 | 2100 | 0.0001 | - |
| 16.9291 | 2150 | 0.0001 | - |
| 17.3228 | 2200 | 0.0001 | - |
| 17.7165 | 2250 | 0.0001 | - |
| 18.1102 | 2300 | 0.0001 | - |
| 18.5039 | 2350 | 0.0001 | - |
| 18.8976 | 2400 | 0.0001 | - |
| 19.2913 | 2450 | 0.0001 | - |
| 19.6850 | 2500 | 0.0001 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"TEXT_CLASSIFICATION"
] | [
"CAS"
] |
PlanTL-GOB-ES/bsc-bio-ehr-es | PlanTL-GOB-ES | fill-mask | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"biomedical",
"clinical",
"ehr",
"spanish",
"es",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-04-08T13:15:59 | 2022-11-15T16:34:16 | 665 | 12 | ---
language:
- es
license: apache-2.0
metrics:
- ppl
tags:
- biomedical
- clinical
- ehr
- spanish
widget:
- text: El único antecedente personal a reseñar era la <mask> arterial.
- text: Las radiologías óseas de cuerpo entero no detectan alteraciones <mask>, ni
alteraciones vertebrales.
- text: En el <mask> toraco-abdómino-pélvico no se encontraron hallazgos patológicos
de interés.
---
# Biomedical-clinical language model for Spanish
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citing information](#citing-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
Biomedical pretrained language model for Spanish. For more details about the corpus, the pretraining and the evaluation, check the official [repository](https://github.com/PlanTL-GOB-ES/lm-biomedical-clinical-es).
## Intended uses and limitations
The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification.
## How to use
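A minimal sketch using the 🤗 Transformers fill-mask pipeline (the example sentence is taken from the widget examples above):

```python
from transformers import pipeline

# Load the model with the standard fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="PlanTL-GOB-ES/bsc-bio-ehr-es")

# Widget example: "The only personal history worth noting was arterial <mask>."
for prediction in fill_mask("El único antecedente personal a reseñar era la <mask> arterial."):
    print(prediction["token_str"], round(prediction["score"], 4))
```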
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
### Tokenization and model pretraining
This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a
**biomedical-clinical** corpus in Spanish collected from several sources (see next section).
The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2)
used in the original [RoBERTA](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.
### Training corpora and preprocessing
The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers, and a real-world clinical corpus collected from more than 278K clinical documents and notes. To obtain a high-quality training corpus while retaining the idiosyncrasies of the clinical language, a cleaning pipeline has been applied only to the biomedical corpora, keeping the clinical corpus uncleaned. Essentially, the cleaning operations used are:
- data parsing in different formats
- sentence splitting
- language detection
- filtering of ill-formed sentences
- deduplication of repetitive contents
- keep the original document boundaries
Then, the biomedical corpora are concatenated and further global deduplication among the biomedical corpora has been applied.
Eventually, the clinical corpus is concatenated to the cleaned biomedical corpus resulting in a medium-size biomedical-clinical corpus for Spanish composed of more than 1B tokens. The table below shows some basic statistics of the individual cleaned corpora:
| Name | No. tokens | Description |
|-----------------------------------------------------------------------------------------|-------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Medical crawler](https://zenodo.org/record/4561970) | 903,558,13 | Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. |
| Clinical cases misc. | 102,855,267 | A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. |
| EHR documents | 95,267,20 | Collection of more than 278K clinical documents, including discharge reports, clinical course notes and X-ray reports, for a total of 91M tokens. |
| [Scielo](https://zenodo.org/record/2541681#.YlP1DshBwio) | 60,007,289 | Publications written in Spanish crawled from the Spanish SciELO server in 2017. |
| [BARR2_background](https://temu.bsc.es/BARR2/downloads/background_set.raw_text.tar.bz2) | 24,516,442 | Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. |
| Wikipedia_life_sciences | 13,890,501 | Wikipedia articles crawled 04/01/2021 with the [Wikipedia API python library](https://pypi.org/project/Wikipedia-API/) starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. |
| Patents | 13,463,387 | Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". |
| [EMEA](http://opus.nlpl.eu/download.php?f=EMEA/v3/moses/en-es.txt.zip) | 5,377,448 | Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. |
| [mespen_Medline](https://zenodo.org/record/3562536#.YTt1fH2xXbR) | 4,166,077 | Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources is aggregated from the MedlinePlus source. |
| PubMed | 1,858,966 | Open-access articles from the PubMed repository crawled in 2017. |
## Evaluation
The model has been fine-tuned on three Named Entity Recognition (NER) tasks using three clinical NER datasets:
- [PharmaCoNER](https://zenodo.org/record/4270158): is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: https://temu.bsc.es/pharmaconer/).
- [CANTEMIST](https://zenodo.org/record/3978041#.YTt5qH2xXbQ): is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: https://zenodo.org/record/3978041#.YTt5qH2xXbQ).
- ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.
We addressed the NER task as a token classification problem using a standard linear layer along with the BIO tagging schema. We compared our models with the general-domain Spanish [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne), the general-domain multilingual model that supports Spanish [mBERT](https://huggingface.co/bert-base-multilingual-cased), the domain-specific English model [BioBERT](https://huggingface.co/dmis-lab/biobert-base-cased-v1.2), and three domain-specific models based on continual pre-training, [mBERT-Galén](https://ieeexplore.ieee.org/document/9430499), [XLM-R-Galén](https://ieeexplore.ieee.org/document/9430499) and [BETO-Galén](https://ieeexplore.ieee.org/document/9430499).
The table below shows the F1 scores obtained:
| Tasks/Models | bsc-bio-ehr-es | XLM-R-Galén | BETO-Galén | mBERT-Galén | mBERT | BioBERT | roberta-base-bne |
|--------------|----------------|--------------------|--------------|--------------|--------------|--------------|------------------|
| PharmaCoNER | **0.8913** | 0.8754 | 0.8537 | 0.8594 | 0.8671 | 0.8545 | 0.8474 |
| CANTEMIST | **0.8340** | 0.8078 | 0.8153 | 0.8168 | 0.8116 | 0.8070 | 0.7875 |
| ICTUSnet | **0.8756** | 0.8716 | 0.8498 | 0.8509 | 0.8631 | 0.8521 | 0.8677 |
The fine-tuning scripts can be found in the official GitHub [repository](https://github.com/PlanTL-GOB-ES/lm-biomedical-clinical-es).
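As a starting point for reproducing that setup, the checkpoint can be loaded with a token-classification head; a minimal sketch (the BIO label set below is a hypothetical placeholder, the real label lists come from each NER corpus):

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Hypothetical BIO label set, for illustration only.
labels = ["O", "B-ENTITY", "I-ENTITY"]

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/bsc-bio-ehr-es")
model = AutoModelForTokenClassification.from_pretrained(
    "PlanTL-GOB-ES/bsc-bio-ehr-es",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
```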
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use these models, please cite our work:
```bibtext
@inproceedings{carrino-etal-2022-pretrained,
title = "Pretrained Biomedical Language Models for Clinical {NLP} in {S}panish",
author = "Carrino, Casimiro Pio and
Llop, Joan and
P{\`a}mies, Marc and
Guti{\'e}rrez-Fandi{\~n}o, Asier and
Armengol-Estap{\'e}, Jordi and
Silveira-Ocampo, Joaqu{\'\i}n and
Valencia, Alfonso and
Gonzalez-Agirre, Aitor and
Villegas, Marta",
booktitle = "Proceedings of the 21st Workshop on Biomedical Language Processing",
month = may,
year = "2022",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.bionlp-1.19",
doi = "10.18653/v1/2022.bionlp-1.19",
pages = "193--199",
abstract = "This work presents the first large-scale biomedical Spanish language models trained from scratch, using large biomedical corpora consisting of a total of 1.1B tokens and an EHR corpus of 95M tokens. We compared them against general-domain and other domain-specific models for Spanish on three clinical NER tasks. As main results, our models are superior across the NER tasks, rendering them more convenient for clinical NLP applications. Furthermore, our findings indicate that when enough data is available, pre-training from scratch is better than continual pre-training when tested on clinical tasks, raising an exciting research question about which approach is optimal. Our models and fine-tuning scripts are publicly available at HuggingFace and GitHub.",
}
```
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details>
| [
"NAMED_ENTITY_RECOGNITION",
"TEXT_CLASSIFICATION"
] | [
"CANTEMIST",
"PHARMACONER",
"SCIELO"
] |
AdaptLLM/finance-LLM | AdaptLLM | text-generation | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"finance",
"en",
"dataset:Open-Orca/OpenOrca",
"dataset:GAIR/lima",
"dataset:WizardLM/WizardLM_evol_instruct_V2_196k",
"arxiv:2309.09530",
"arxiv:2411.19930",
"arxiv:2406.14491",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2023-09-18T13:45:13 | 2024-12-02T06:26:32 | 665 | 118 | ---
datasets:
- Open-Orca/OpenOrca
- GAIR/lima
- WizardLM/WizardLM_evol_instruct_V2_196k
language:
- en
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- finance
---
# Adapting LLMs to Domains via Continual Pre-Training (ICLR 2024)
This repo contains the domain-specific base model developed from **LLaMA-1-7B**, using the method in our paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### [2024/11/29] 🤗 Introduce the multimodal version of AdaptLLM at [AdaMLLM](https://huggingface.co/papers/2411.19930), for adapting MLLMs to domains 🤗
**************************** **Updates** ****************************
* 2024/11/29: Released [AdaMLLM](https://huggingface.co/AdaptLLM/Adapt-MLLM-to-Domains) for adapting MLLMs to domains
* 2024/9/20: Our [research paper for Instruction-Pretrain](https://huggingface.co/papers/2406.14491) has been accepted by EMNLP 2024
* 2024/8/29: Updated [guidelines](https://huggingface.co/datasets/AdaptLLM/finance-tasks) on evaluating any 🤗Huggingface models on the domain-specific tasks
* 2024/6/22: Released the [benchmarking code](https://github.com/microsoft/LMOps/tree/main/adaptllm)
* 2024/6/21: Released the general version of AdaptLLM at [Instruction-Pretrain](https://huggingface.co/instruction-pretrain)
* 2024/4/2: Released the [raw data splits (train and test)](https://huggingface.co/datasets/AdaptLLM/ConvFinQA) of all the evaluation datasets
* 2024/1/16: Our [research paper for AdaptLLM](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B
## 1. Domain-Specific Models
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of our AdaptLLM models compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
### LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
For example, to chat with the finance base model (🤗we highly recommend switching to the [chat model](https://huggingface.co/AdaptLLM/finance-chat) for better response quality):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("AdaptLLM/finance-LLM")
tokenizer = AutoTokenizer.from_pretrained("AdaptLLM/finance-LLM", use_fast=False)
# Put your input here:
user_input = '''Use this fact to answer the question: Title of each class Trading Symbol(s) Name of each exchange on which registered
Common Stock, Par Value $.01 Per Share MMM New York Stock Exchange
MMM Chicago Stock Exchange, Inc.
1.500% Notes due 2026 MMM26 New York Stock Exchange
1.750% Notes due 2030 MMM30 New York Stock Exchange
1.500% Notes due 2031 MMM31 New York Stock Exchange
Which debt securities are registered to trade on a national securities exchange under 3M's name as of Q2 of 2023?'''
# Simply use your input as the prompt for base models
prompt = user_input
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
outputs = model.generate(input_ids=inputs, max_length=2048)[0]
answer_start = int(inputs.shape[-1])
pred = tokenizer.decode(outputs[answer_start:], skip_special_tokens=True)
print(pred)
```
### LLaMA-3-8B (💡New!)
In our recent research on [Instruction-Pretrain](https://huggingface.co/papers/2406.14491), we developed a context-based instruction synthesizer to augment the raw corpora with instruction-response pairs, **enabling Llama3-8B to be comparable to or even outperform Llama3-70B**: [Finance-Llama3-8B](https://huggingface.co/instruction-pretrain/finance-Llama3-8B), [Biomedicine-Llama3-8B](https://huggingface.co/instruction-pretrain/medicine-Llama3-8B).
## 2. Domain-Specific Tasks
### Pre-templatized Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions of the test set of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
Note: those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Evaluating Any Huggingface LMs on Domain-Specific Tasks (💡New!)
You can use the following script to reproduce our results and evaluate any other Huggingface models on domain-specific tasks. Note that the script is NOT applicable to models that require specific prompt templates (e.g., Llama2-chat, Llama3-Instruct).
1). **Set Up Dependencies**
```bash
git clone https://github.com/microsoft/LMOps
cd LMOps/adaptllm
pip install -r requirements.txt
```
2). **Evaluate the Model**
```bash
# Select the domain from ['biomedicine', 'finance', 'law']
DOMAIN='finance'
# Specify any Huggingface model name (Not applicable to chat models)
MODEL='AdaptLLM/finance-LLM'
# Model parallelization:
# - Set MODEL_PARALLEL=False if the model fits on a single GPU.
# We observe that LMs smaller than 10B always meet this requirement.
# - Set MODEL_PARALLEL=True if the model is too large and encounters OOM on a single GPU.
MODEL_PARALLEL=False
# Choose the number of GPUs from [1, 2, 4, 8]
N_GPU=1
# Whether to add a BOS token at the beginning of the prompt input:
# - Set to False for AdaptLLM.
# - Set to True for instruction-pretrain models.
# If unsure, we recommend setting it to False, as this is suitable for most LMs.
add_bos_token=False
# Run the evaluation script
bash scripts/inference.sh ${DOMAIN} ${MODEL} ${add_bos_token} ${MODEL_PARALLEL} ${N_GPU}
```
### Raw Datasets
We have also uploaded the raw training and testing splits, for facilitating fine-tuning or other usages: [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt), [RCT](https://huggingface.co/datasets/AdaptLLM/RCT), [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA), [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA), [Headline](https://huggingface.co/datasets/AdaptLLM/Headline), [NER](https://huggingface.co/datasets/AdaptLLM/NER), [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
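For example, one of the raw splits can be loaded with 🤗 Datasets; a minimal sketch (assuming the default configuration of the dataset repository):

```python
from datasets import load_dataset

# Load the raw train/test splits of one of the finance tasks listed above.
# Depending on your datasets version, trust_remote_code=True may be required.
convfinqa = load_dataset("AdaptLLM/ConvFinQA")
print(convfinqa)
```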
### Domain Knowledge Probing
Our pre-processed knowledge probing datasets are available at: [med_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/med_knowledge_prob) and [law_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/law_knowledge_prob)
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
``` | [
"QUESTION_ANSWERING"
] | [
"CHEMPROT"
] |
knowledgator/gliclass-base-v2.0-rac-init | knowledgator | zero-shot-classification | [
"safetensors",
"GLiClass",
"text classification",
"zero-shot",
"small language models",
"RAG",
"sentiment analysis",
"zero-shot-classification",
"en",
"fr",
"ge",
"dataset:MoritzLaurer/synthetic_zeroshot_mixtral_v0.1",
"dataset:knowledgator/gliclass-v1.0",
"dataset:fancyzhx/amazon_polarity",
"dataset:cnmoro/QuestionClassification",
"dataset:Arsive/toxicity_classification_jigsaw",
"dataset:shishir-dwi/News-Article-Categorization_IAB",
"dataset:SetFit/qnli",
"dataset:nyu-mll/multi_nli",
"dataset:SetFit/student-question-categories",
"dataset:SetFit/tweet_sentiment_extraction",
"dataset:SetFit/hate_speech18",
"dataset:saattrupdan/doc-nli",
"dataset:knowledgator/gliclass-v2.0-RAC",
"base_model:microsoft/deberta-v3-base",
"base_model:finetune:microsoft/deberta-v3-base",
"license:apache-2.0",
"region:us"
] | 2025-02-17T12:41:55 | 2025-03-07T15:56:59 | 649 | 6 | ---
base_model:
- microsoft/deberta-v3-base
datasets:
- MoritzLaurer/synthetic_zeroshot_mixtral_v0.1
- knowledgator/gliclass-v1.0
- fancyzhx/amazon_polarity
- cnmoro/QuestionClassification
- Arsive/toxicity_classification_jigsaw
- shishir-dwi/News-Article-Categorization_IAB
- SetFit/qnli
- nyu-mll/multi_nli
- SetFit/student-question-categories
- SetFit/tweet_sentiment_extraction
- SetFit/hate_speech18
- saattrupdan/doc-nli
- knowledgator/gliclass-v2.0-RAC
language:
- en
- fr
- ge
license: apache-2.0
metrics:
- f1
pipeline_tag: zero-shot-classification
tags:
- text classification
- zero-shot
- small language models
- RAG
- sentiment analysis
---
# ⭐ GLiClass: Generalist and Lightweight Model for Sequence Classification
This is an efficient zero-shot classifier inspired by the [GLiNER](https://github.com/urchade/GLiNER/tree/main) work. It demonstrates the same performance as a cross-encoder while being more compute-efficient because classification is done in a single forward pass.
It can be used for `topic classification`, `sentiment analysis` and as a reranker in `RAG` pipelines.
The model was trained on synthetic and licensed data that allow commercial use and can be used in commercial applications.
This version of the model uses a layer-wise selection of features that enables a better understanding of different levels of language. The backbone model is [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base).
### Retrieval-augmented Classification (RAC):
The main idea of this model is to use information from semantically similar examples to enhance predictions at inference time. Our tests showed that providing the model with at least one example from the training dataset, retrieved by semantic similarity, could increase the F1 score from 0.3090 to 0.4275, and in some cases from 0.2594 up to 0.6249. Moreover, with 2 retrieved examples provided, the RAC approach reached an F1 score close to that of fine-tuning with 8 examples per label: 0.4707 versus 0.4838, respectively.
### RAC dataset generation strategy:


To further enhance classification performance, we generated a Retrieval-Augmented Classification (RAC) dataset. Each text example in the gliclass-v2.0 dataset was encoded using the paraphrase-MiniLM-L6-v2 sentence transformer and indexed in an HNSW (Hierarchical Navigable Small World) database. For 250k randomly selected samples, we retrieved up to three most similar examples (cosine similarity > 0.5) from the dataset.
During augmentation:
- The number of retrieved examples per sample was randomly chosen between 1 and 3.
- 30% of retrieved examples were replaced with random, unrelated examples to introduce controlled noise.
- If true labels were present in a retrieved example, false labels were removed with a 50% probability to balance information clarity.
Each retrieved example was formatted using structured ```<<EXAMPLE>> ... <</EXAMPLE>>``` tags, where:
- True labels were explicitly marked as ```<<TRUE_LABEL>> {label}```.
- False labels were marked as ```<<FALSE_LABEL>> {label}```, unless removed.
For each of the 250k randomly selected examples, the “text” was modified as ```{original_text} <<EXAMPLE>> {retrieved_text} {true_labels_str} {false_labels_str} <</EXAMPLE>>...```
Where:
- ```{original_text}``` is the original example text.
- ```{retrieved_text}``` is a similar or randomly selected example.
- ```{true_labels_str}``` contains true labels formatted as ```<<TRUE_LABEL>> {label}```.
- ```{false_labels_str}``` contains false labels formatted as ```<<FALSE_LABEL>> {label}``` (unless removed with 50% probability).
Such a strategy allows the model to learn how to utilize the provided information without overfocusing on RAC examples. With both relevant and randomly retrieved examples, the dataset maintains a balance between useful contextual information and controlled noise. This ensures that the model does not become overly reliant on retrieval-augmented inputs while still benefiting from additional context when available.
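To make the format concrete, below is a small sketch of how a single augmented training text would be assembled following the scheme above (the example strings and labels are hypothetical):

```python
# Assemble one retrieval-augmented training example following the format described above.
original_text = "The new AI-powered tool streamlines data analysis by automating repetitive tasks."
retrieved_text = "A recently developed machine learning platform offers robust automation."
true_labels = ["AI", "automation"]
false_labels = ["sport"]

true_labels_str = " ".join(f"<<TRUE_LABEL>> {label}" for label in true_labels)
false_labels_str = " ".join(f"<<FALSE_LABEL>> {label}" for label in false_labels)

augmented_text = (
    f"{original_text} <<EXAMPLE>> {retrieved_text} "
    f"{true_labels_str} {false_labels_str} <</EXAMPLE>>"
)
print(augmented_text)
```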
### How to use:
First of all, you need to install the GLiClass library:
```bash
pip install gliclass
```
Then you need to initialize a model and a pipeline:
```python
from gliclass import GLiClassModel, ZeroShotClassificationPipeline
from transformers import AutoTokenizer
model = GLiClassModel.from_pretrained("knowledgator/gliclass-base-v2.0-rac-init")
tokenizer = AutoTokenizer.from_pretrained("knowledgator/gliclass-base-v2.0-rac-init")
pipeline = ZeroShotClassificationPipeline(model, tokenizer, classification_type='multi-label', device='cuda:0')
text = "One day I will see the world!"
labels = ["travel", "dreams", "sport", "science", "politics"]
results = pipeline(text, labels, threshold=0.5)[0] #because we have one text
for result in results:
print(result["label"], "=>", result["score"])
```
To use with one **RAC** example:
```python
example_1 = {
"text": "A recently developed machine learning platform offers robust automation for complex data analysis workflows. While it enhances productivity, users have reported difficulties in integrating it with their current data infrastructure and a need for better documentation.",
"all_labels": ["AI", "automation", "data_analysis", "usability", "integration"],
"true_labels": ["AI", "integration", 'automation']
}
text = "The new AI-powered tool streamlines data analysis by automating repetitive tasks, improving efficiency for data scientists. However, its steep learning curve and limited integration with existing platforms pose challenges for widespread adoption."
labels = ["AI", "automation", "data_analysis", "usability", "integration"]
results = pipeline(text, labels, threshold=0.1, rac_examples=[example_1])[0]
for predict in results:
print(predict["label"], " - ", predict["score"])
```
To use with several **RAC** examples:
```python
example_1 = {
"text": "A recently developed machine learning platform offers robust automation for complex data analysis workflows. While it enhances productivity, users have reported difficulties in integrating it with their current data infrastructure and a need for better documentation.",
"all_labels": ["AI", "automation", "data_analysis", "usability", "integration"],
"true_labels": ["AI", "integration", 'automation']
}
example_2 = {
"text": "A cloud-based analytics tool leverages artificial intelligence to provide real-time insights. It significantly improves workflow efficiency but struggles with compatibility across different enterprise systems, requiring additional customization efforts.",
"all_labels": ["AI", "automation", "data_analysis", "usability", "integration"],
"true_labels": ["AI", "integration", "data_analysis"]
}
text = "The new AI-powered tool streamlines data analysis by automating repetitive tasks, improving efficiency for data scientists. However, its steep learning curve and limited integration with existing platforms pose challenges for widespread adoption."
labels = ["AI", "automation", "data_analysis", "usability", "integration"]
results = pipeline(text, labels, threshold=0.1, rac_examples=[example_1, example_2])[0]
for predict in results:
print(predict["label"], " - ", predict["score"])
```
If you want to use it for NLI-type tasks, we recommend representing your premise as the text and your hypothesis as a label. You can provide several hypotheses, but the model works best with a single input hypothesis.
```python
# Reuse the multi-label pipeline initialized above for an NLI-style check
text = "The cat slept on the windowsill all afternoon"
labels = ["The cat was awake and playing outside."]
results = pipeline(text, labels, threshold=0.0)[0]
print(results)
```
### Benchmarks:
Below, you can find a comparison with other GLiClass models:
| Dataset | gliclass-base-v1.0-init | gliclass-large-v1.0-init | gliclass-modern-base-v2.0-init | gliclass-modern-large-v2.0-init | gliclass-base-v2.0-rac-init |
|----------------------|-----------------------|-----------------------|---------------------|---------------------|---------------------|
| CR | 0.8672 | 0.8024 | 0.9041 | 0.8980 | 0.7852 |
| sst2 | 0.8342 | 0.8734 | 0.9011 | 0.9434 | 0.8610 |
| sst5 | 0.2048 | 0.1638 | 0.1972 | 0.1123 | 0.0598 |
| 20_news_groups | 0.2317 | 0.4151 | 0.2448 | 0.2792 | 0.4007 |
| spam | 0.5963 | 0.5407 | 0.5074 | 0.6364 | 0.6739 |
| financial_phrasebank | 0.3594 | 0.3705 | 0.2537 | 0.2562 | 0.2537 |
| imdb | 0.8772 | 0.8836 | 0.8255 | 0.9137 | 0.8716 |
| ag_news | 0.5614 | 0.7069 | 0.6050 | 0.6933 | 0.6759 |
| emotion | 0.2865 | 0.3840 | 0.2474 | 0.3746 | 0.4160 |
| cap_sotu | 0.3966 | 0.4353 | 0.2929 | 0.2919 | 0.3871 |
| rotten_tomatoes | 0.6626 | 0.7933 | 0.6630 | 0.5928 | 0.7739 |
| **AVERAGE:** | 0.5344 | 0.5790 | 0.5129 | 0.5447 | 0.5598 |
Here you can see how the performance of the model grows as more **RAC** examples are provided:
| Dataset | 0 examples | 1 example | 2 examples | 3 examples |
|-------------------------------------|------------|------------|------------|------------|
| cap_sotu | 0.3857 | 0.4665 | 0.4935 | 0.4847 |
| cap_sotu (8 examples) | 0.4938 | 0.5097 | 0.4976 | 0.4894 |
| cap_sotu (Weak Supervision - 8) | 0.4319 | 0.4764 | 0.4488 | 0.4465 |
| dair-ai_emotion | 0.4472 | 0.5505 | 0.5619 | 0.5705 |
| dair-ai_emotion (8 examples) | 0.5088 | 0.5630 | 0.5623 | 0.5740 |
| dair-ai_emotion (Weak Supervision - 8) | 0.4187 | 0.5479 | 0.5693 | 0.5828 |
| ag_news | 0.6791 | 0.8507 | 0.8717 | 0.8866 |
| ag_news (8 examples) | 0.8496 | 0.9002 | 0.9072 | 0.9091 |
| ag_news (Weak Supervision - 8) | 0.6546 | 0.8623 | 0.8841 | 0.8978 |
| sst5 | 0.0599 | 0.0675 | 0.1163 | 0.1267 |
| sst5 (8 examples) | 0.2887 | 0.2690 | 0.2642 | 0.2394 |
| sst5 (Weak Supervision - 8) | 0.0744 | 0.2780 | 0.2897 | 0.2912 |
| ScienceQA | 0.1142 | 0.4035 | 0.4534 | 0.4495 |
| ScienceQA (8 examples) | 0.6493 | 0.6547 | 0.6956 | 0.6770 |
| ScienceQA (Weak Supervision - 8) | 0.2987 | 0.5919 | 0.5998 | 0.5674 |
| Malicious_code_classification | 0.3717 | 0.6260 | 0.9672 | 0.9788 |
| Malicious_code_classification (8 examples) | 0.8444 | 0.9722 | 0.9788 | 0.9772 |
| Malicious_code_classification (Weak Supervision - 8) | 0.3745 | 0.9216 | 0.9788 | 0.9772 |
| twitter-financial-news-topic | 0.2594 | 0.6249 | 0.6408 | 0.6427 |
| twitter-financial-news-topic (8 examples) | 0.6137 | 0.7072 | 0.7099 | 0.6948 |
| twitter-financial-news-topic (Weak Supervision - 8) | 0.4032 | 0.6651 | 0.6316 | 0.6114 |
| 20_newsgroups | 0.3211 | 0.1339 | 0.0906 | 0.1005 |
| 20_newsgroups (8 examples) | 0.0959 | 0.0657 | 0.0440 | 0.0445 |
| 20_newsgroups (Weak Supervision - 8) | 0.4765 | 0.1035 | 0.0775 | 0.0777 |
| ChemProt | 0.2024 | 0.1911 | 0.1568 | 0.1329 |
| ChemProt (8 examples) | 0.2985 | 0.3479 | 0.3636 | 0.3538 |
| ChemProt (Weak Supervision - 8) | 0.2369 | 0.2067 | 0.1911 | 0.1780 |

| **AVERAGE:** | **0 examples** | **1 example** | **2 examples** | **3 examples** |
|-------------------------------------|---------------|---------------|---------------|---------------|
| Standard | 0.3090 | 0.4275 | 0.4707 | 0.4718 |
| 8 examples | 0.4838 | 0.5245 | 0.5288 | 0.5244 |
| Weak Supervision - 8 | 0.3661 | 0.4862 | 0.4868 | 0.4821 |
Here you can see how the performance of the model grows as more examples are provided, in comparison to other models:
| Model | Num Examples | sst5 | ag_news | emotion | **AVERAGE:** |
|------------------------------------|------------------|--------|---------|--------------|----------|
| gliclass-base-v2.0-rac-init | 0 | 0.0599 | 0.6791 | 0.4472 | 0.3934 |
| gliclass-base-v2.0-rac-init | 8 | 0.2887 | 0.8496 | 0.5088 | 0.6149 |
| gliclass-base-v2.0-rac-init | Weak Supervision | 0.0744 | 0.6546 | 0.4187 | 0.3983 |
| gliclass-modern-large-v2.0-init | 0 | 0.1123 | 0.6933 | 0.3746 | 0.3934 |
| gliclass-modern-large-v2.0-init | 8 | 0.5098 | 0.8339 | 0.5010 | 0.6149 |
| gliclass-modern-large-v2.0-init | Weak Supervision | 0.0951 | 0.6478 | 0.4520 | 0.3983 |
| gliclass-modern-base-v2.0-init | 0 | 0.1972 | 0.6050 | 0.2474 | 0.3499 |
| gliclass-modern-base-v2.0-init | 8 | 0.3604 | 0.7481 | 0.4420 | 0.5168 |
| gliclass-modern-base-v2.0-init | Weak Supervision | 0.1599 | 0.5713 | 0.3216 | 0.3509 |
| gliclass-large-v1.0-init | 0 | 0.1639 | 0.7069 | 0.3840 | 0.4183 |
| gliclass-large-v1.0-init | 8 | 0.4226 | 0.8415 | 0.4886 | 0.5842 |
| gliclass-large-v1.0-init | Weak Supervision | 0.1689 | 0.7051 | 0.4586 | 0.4442 |
| gliclass-base-v1.0-init | 0 | 0.2048 | 0.5614 | 0.2865 | 0.3509 |
| gliclass-base-v1.0-init | 8 | 0.2007 | 0.8359 | 0.4856 | 0.5074 |
| gliclass-base-v1.0-init | Weak Supervision | 0.0681 | 0.6627 | 0.3066 | 0.3458 | | [
"TEXT_CLASSIFICATION",
"SEMANTIC_SIMILARITY"
] | [
"CHEMPROT"
] |
flowaicom/Flow-Judge-v0.1-AWQ | flowaicom | text-generation | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"lm-judge",
"evaluation",
"nlp",
"conversational",
"autoawq",
"custom_code",
"en",
"base_model:microsoft/Phi-3.5-mini-instruct",
"base_model:quantized:microsoft/Phi-3.5-mini-instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"awq",
"region:us"
] | 2024-09-15T12:30:58 | 2024-10-09T05:59:56 | 642 | 6 | ---
base_model: microsoft/Phi-3.5-mini-instruct
language:
- en
library_name: transformers
license: apache-2.0
metrics:
- accuracy
- f1
- precision
- recall
- pearsonr
- spearmanr
- kendall-tau
model_name: Flow-Judge-v0.1-AWQ
pipeline_tag: text-generation
tags:
- lm-judge
- phi3
- evaluation
- nlp
- conversational
- autoawq
inference: false
model_creator: Flow AI
model_type: phi3.5
quantized_by: Flow AI
---
# Flow-Judge-v0.1-AWQ
- Original model: [Flow-Judge-v0.1](https://huggingface.co/flowaicom/Flow-Judge-v0.1)
- Model collection: [Flow-Judge-v0.1 models](https://huggingface.co/collections/flowaicom/flow-judge-v01-66e6af5fc3b3a128bde07dec)
- Technical report: [Flow Judge: An Open Small Language Model for LLM System Evaluations](https://huggingface.co/flowaicom/Flow-Judge-v0.1)
- Model website: [flow-ai.com/judge](https://www.flow-ai.com/blog/flow-judge)
- About us: [Flow AI](https://www.flow-ai.com/about)
<!-- description start -->
## Description
This repo contains AWQ safetensors quant for [Flow-Judge-v0.1](https://huggingface.co/flowaicom/Flow-Judge-v0.1).
## Quantization config
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

# merged_path, lora_path and quant_path are local paths used in the original quantization run;
# set them to your own checkpoint and output locations when reproducing.
quant_config = { "zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM" }
model = AutoAWQForCausalLM.from_pretrained(merged_path, **{"low_cpu_mem_usage": True, "use_cache": False},
    attn_implementation="flash_attention_2", torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(lora_path, trust_remote_code=False)
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```
## Library versions used for quantization
```raw
autoawq 0.2.6
autoawq_kernels 0.0.7
```
## Running the AWQ model
First install the flow judge library
```shell
git clone https://github.com/flowaicom/flow-judge
cd flow-judge
pip install -e ".[vllm]"
```
Quickstart with Python:
```python
from flow_judge import Vllm, Llamafile, Hf, EvalInput, FlowJudge
from flow_judge.metrics import RESPONSE_FAITHFULNESS_5POINT
from IPython.display import Markdown, display
# If you are running on an Ampere GPU or newer, create a model using VLLM
model = Vllm(quantization=True)
# If you have other applications open taking up VRAM, you can use less VRAM by setting gpu_memory_utilization to a lower value.
# model = Vllm(gpu_memory_utilization=0.70)
# Or create a model using Llamafile if you are not running an NVIDIA GPU (e.g., on Apple Silicon macOS)
# model = Llamafile()
# Initialize the judge
faithfulness_judge = FlowJudge(
metric=RESPONSE_FAITHFULNESS_5POINT,
model=model
)
# Sample to evaluate
query = ...
context = ...
response = ...
# Create an EvalInput
# We want to evaluate the response to the customer issue based on the context and the user instructions
eval_input = EvalInput(
inputs=[
{"query": query},
{"context": context},
],
output={"response": response},
)
# Run the evaluation
result = faithfulness_judge.evaluate(eval_input, save_results=False)
# Display the result
display(Markdown(f"__Feedback:__\n{result.feedback}\n\n__Score:__\n{result.score}"))
```
Discover more at our repository [https://github.com/flowaicom/flow-judge](https://github.com/flowaicom/flow-judge)
# Original model card: Flow-Judge-v0.1
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/63368577d184e6b53c50e6d0/6kSJKgPh2pDh4tA-Ky0xW.png" alt="Centered image">
</p>
<p align="center">🚀 <a href="https://www.flow-ai.com/judge">Flow Judge</a> | 📄 <a href="https://www.flow-ai.com/blog/flow-judge">Technical report</a> | 💻 <a href="https://github.com/flowaicom/flow-judge">flow-judge</a></p>
## Model Summary
Flow-Judge-v0.1 is a compact yet powerful 3.8B model that offers customizable LLM system evaluations across various fields. The model inherits its architecture from the Phi-3.5-mini-instruct model, which enables Flow-Judge to deliver high-quality results while maintaining a small footprint. Despite its smaller size, it achieves performance comparable to larger models in both held-out and out-of-domain benchmarks. Flow-Judge-v0.1 supports multiple scoring scales, provides qualitative feedback, and generates structured evaluation outputs. Trained on a smaller synthetic dataset, it represents an efficient approach to AI development. Released under the Apache 2.0 license, Flow Judge is an open and accessible model suitable for developers and companies seeking cost-effective and rapid evaluations using custom rubrics.
__Quantized weights__
- [flowaicom/Flow-Judge-v0.1-AWQ](https://huggingface.co/flowaicom/Flow-Judge-v0.1-AWQ)
- [flowaicom/Flow-Judge-v0.1-GGUF](https://huggingface.co/flowaicom/Flow-Judge-v0.1-GGUF)
__Quickstart__
- [Quickstart](https://github.com/flowaicom/flow-judge/examples/1_quickstart.ipynb)
## Intended Use Case
Flow Judge is intended to be used on custom LLM system evaluation tasks.
- Customizable evaluations: Users can define their own evaluation criteria and rubrics, tailoring Flow Judge to their specific needs and requirements. This flexibility allows for the creation of highly targeted assessments that accurately measure performance of their LLM system
- Flow Judge supports three different scoring scales:
- Pass/fail: Suitable for binary assessments, such as determining whether a piece of text meets a specific standard or contains errors.
- 3-Likert: Allows for more granular evaluations, with scores ranging from negative to neutral to positive. Useful for assessing the overall quality or sentiment of a piece of text.
- 5-Likert: Provides an even more nuanced assessment, with scores ranging from strongly negative to strongly positive, enabling users to capture subtle differences in quality or sentiment.
- Easy to interpret results:
- Flow Judge produces structured evaluations with `<feedback>` and `<score>` tags.
- Qualitative feedback: Flow Judge detects errors and grades outputs and provides qualitative feedback that explains its reasoning for assigning a particular score from the rubric while highlighting problematic parts of the responses.
- Score: Based on a grading rubric Flow Judge will return a numerical score on binary, likert-3 or likert-5 scale.
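The `<feedback>` and `<score>` tags described above can be pulled out of a raw completion with a simple regular expression; a minimal sketch (not the library's own parser, and the completion text is hypothetical):

```python
import re

# Hypothetical raw completion from the judge model.
completion = """<feedback>
The response addresses all three issues raised by the customer in detail.
</feedback>
<score>
5
</score>"""

feedback = re.search(r"<feedback>\s*(.*?)\s*</feedback>", completion, re.DOTALL).group(1)
score = int(re.search(r"<score>\s*(\d+)\s*</score>", completion).group(1))
print(feedback)
print(score)
```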
## Training
### Model
Flow Judge is based on the Phi-3.5-mini architecture, and the base model checkpoint used is specifically its instruct version. The model uses the same tokenizer, supports MQA and Flash Attention 2, and has weights in bfloat16 precision. However, post-finetuning, the model's support for languages and long context lengths has not been fully tested. Due to specialized Supervised Fine-Tuning (SFT), Flow Judge might show different benchmark results and support a maximum context length of 8192, shorter than the base model's.
### Training Datasets
Flow-Judge-v0.1 has been trained on synthetically generated datasets. The construction of training datasets for Flow Judge involves a multi-step process:
1. Manually curating seed rubrics to serve as a foundation
2. Synthetically generating domain-adapted metrics and rubrics for various domains
3. Synthetically generating training instances with multiple inputs, such as user queries and contextual information
4. Employing a dual-evaluation strategy with consensus to ensure quality and consistency
This process creates a comprehensive and diverse set of training instances that enable accurate, domain-specific evaluations of LLM systems in generative AI products while minimizing human intervention.
Read more about the dataset construction from [here](https://www.flow-ai.com/blog/flow-judge#dataset-construction)
### Fine-tuning
For fine-tuning we used Axolotl's preprocessing to ensure input training data is consistent. We then conducted supervised fine-tuning based on microsoft/Phi-3.5-mini-instruct using RSLoRa. More detailed information about the fine-tuning process is provided in our [technical report](https://www.flow-ai.com/blog/flow-judge#fine-tuning).
## Usage
### Prompt format
#### Prompt template with inputs
```text
# GOAL
Your job is to evaluate a task carried out by an AI system powered by a large language model.
You will be provided with the inputs and output of the task, as well as the evaluation criteria and scoring rubric. Your task is to evaluate the output of the AI system based on the evaluation criteria and scoring rubric provided.
# INPUT
Below are the inputs required for performing the task:
<inputs>
{INPUTS}
</inputs>
# OUTPUT
Below is the output of the task:
<output>
{OUTPUT}
</output>
# EVALUATION CRITERIA AND SCORING RUBRIC
Here are the evaluation criteria and the rubric that you need to use for evaluating the task:
<evaluation_criteria>
{EVALUATION_CRITERIA}
</evaluation_criteria>
<scoring_rubric>
{RUBRIC}
</scoring_rubric>
# INSTRUCTIONS FOR THE EVALUATION
1. Understand the task and criteria: Familiarize yourself with the task to be evaluated. Review the evaluation criteria and scoring rubric to understand the different levels of performance and the descriptions for each score.
2. Review the inputs and output: Look at the inputs provided for the task. Examine the output generated from completing the task.
3. Compare output to score descriptions: Compare the output against the criteria and score descriptions in the scoring rubric. For each criterion, decide which description best matches the output.
4. After comparing the output to the score descriptions, pay attention to the small details that might impact the final score that you assign. Sometimes a small difference can dictate the final score.
5. Write verbal feedback justifying your evaluation that includes a detailed rationale, referring to specific aspects of the output and comparing them to the rubric.
6. Assign a final score based on the scoring rubric.
## FORMAT FOR THE EVALUATION
- Write the verbal feedback inside <feedback> tags without any additional surrounding text.
- Write the numeric score inside <score> tags, without any additional surrounding text and always after the feedback.
Please accurately evaluate the task. Strictly adhere to the evaluation criteria and rubric.
```
#### Prompt template without inputs
```text
# GOAL
Your job is to evaluate a task carried out by an AI system powered by a large language model.
You will be provided the output of the task, as well as the evaluation criteria and scoring rubric. Your task is to evaluate the output of the AI system based on the evaluation criteria and scoring rubric provided.
# OUTPUT
Below is the output of the task:
<output>
{OUTPUT}
</output>
# EVALUATION CRITERIA AND SCORING RUBRIC
Here are the evaluation criteria and the rubric that you need to use for evaluating the task:
<evaluation_criteria>
{EVALUATION_CRITERIA}
</evaluation_criteria>
<scoring_rubric>
{RUBRIC}
</scoring_rubric>
# INSTRUCTIONS FOR THE EVALUATION
1. Understand the task and criteria: Familiarize yourself with the task to be evaluated. Review the evaluation criteria and scoring rubric to understand the different levels of performance and the descriptions for each score.
2. Review the output: Examine the output generated from completing the task.
3. Compare output to score descriptions: Compare the output against the criteria and score descriptions in the scoring rubric. For each criterion, decide which description best matches the output.
4. After comparing the output to the score descriptions, pay attention to the small details that might impact the final score that you assign. Sometimes a small difference can dictate the final score.
5. Write verbal feedback justifying your evaluation that includes a detailed rationale, referring to specific aspects of the output and comparing them to the rubric.
6. Assign a final score based on the scoring rubric.
## FORMAT FOR THE EVALUATION
- Write the verbal feedback inside <feedback> tags without any additional surrounding text.
- Write the numeric score inside <score> tags, without any additional surrounding text and always after the feedback.
Please accurately evaluate the task. Strictly adhere to the evaluation criteria and rubric.
```
#### Formatted prompt example with inputs
```text
# GOAL
Your job is to evaluate a task carried out by an AI system powered by a large language model.
You will be provided with the inputs and output of the task, as well as the evaluation criteria and scoring rubric. Your task is to evaluate the output of the AI system based on the evaluation criteria and scoring rubric provided.
# INPUT/s
Below are the inputs required for performing the task:
<inputs>
<customer_query>
Subject: Issues with recent order #12345
Hello,
I recently received my order #12345, but I'm having a few problems:
1. The blue t-shirt I ordered is actually green. Can I exchange it?
2. The phone case doesn't fit my iPhone 13 Pro Max. The website said it was compatible.
3. I was charged for express shipping, but it took over a week to arrive. Can I get a refund on the shipping cost?
Please let me know how we can resolve these issues. I've been a loyal customer for years and this is the first time I've had problems like this.
Thank you,
Jane Smith
</customer_query>
</inputs>
# OUTPUT
Below is the output of the task:
<output>
Dear Jane Smith,
Thank you for reaching out about the issues with your recent order #12345. We sincerely apologize for the inconvenience you've experienced. Let me address each of your concerns:
1. T-shirt color discrepancy: We're sorry about the mix-up with the t-shirt color. We'd be happy to exchange it for the correct blue t-shirt. I'll send you a prepaid return label for the green shirt, and we'll ship out the blue one as soon as we receive the return.
2. Phone case compatibility: We apologize for the confusion regarding the phone case compatibility. Our website information should have been more accurate. We'll gladly accept a return of the incompatible case and issue a full refund for it. Additionally, I can recommend a case that will fit your iPhone 13 Pro Max if you're interested.
3. Shipping charge: You're absolutely right about the shipping charge. Since the delivery didn't meet our express shipping standards, we will refund the entire shipping cost to your original payment method. This refund should process within 3-5 business days.
To initiate these resolutions, please reply to this email confirming that you'd like to proceed with the t-shirt exchange and phone case return. Once confirmed, I'll send you the necessary return labels and process the shipping refund immediately.
We truly value your loyalty as a long-time customer and sincerely regret that you've encountered these issues. Rest assured, we're taking steps to prevent similar problems in the future. As a gesture of goodwill, we'd like to offer you a 20% discount on your next order.
If you have any further questions or concerns, please don't hesitate to reach out. We're here to ensure your complete satisfaction.
Best regards,
Alex Johnson
Customer Service Representative
</output>
# EVALUATION CRITERIA AND SCORING RUBRIC
Here are the evaluation criteria and the rubric that you need to use for evaluating the task:
<evaluation_criteria>
How well the response addresses the specific issues raised in the customer's query?
</evaluation_criteria>
<scoring_rubric>
- Score 1: The response completely fails to address the customer's needs and ignores the specific issues raised.
- Score 2: The response barely addresses the customer's query and misses most of the specific issues raised.
- Score 3: The response partially addresses the customer's query, touching on some of the specific issues but leaving others unaddressed.
- Score 4: The response adequately addresses most aspects of the customer's query and the specific issues raised.
- Score 5: The response fully and comprehensively addresses all aspects of the customer's query and all specific issues raised in a highly satisfactory manner.
</scoring_rubric>
# INSTRUCTIONS FOR THE EVALUATION
1. Understand the task and criteria: Familiarize yourself with the task to be evaluated. Review the evaluation criteria and scoring rubric to understand the different levels of performance and the descriptions for each score.
2. Review the inputs and output: Look at the inputs provided for the task. Examine the output generated from completing the task.
3. Compare output to score descriptions: Compare the output against the criteria and score descriptions in the scoring rubric. For each criterion, decide which description best matches the output.
4. After comparing the output to the score descriptions, pay attention to the small details that might impact the final score that you assign. Sometimes a small difference can dictate the final score.
5. Write verbal feedback justifying your evaluation that includes a detailed rationale, referring to specific aspects of the output and comparing them to the rubric.
6. Assign a final score based on the scoring rubric.
## FORMAT FOR THE EVALUATION
- Write the verbal feedback inside <feedback> tags without any additional surrounding text.
- Write the numeric score inside <score> tags, without any additional surrounding text and always after the feedback.
Please accurately evaluate the task. Strictly adhere to the evaluation criteria and rubric.
```
>Note that inputs and output are formatted with XML tags. See [flow-judge](https://github.com/flowaicom/flow-judge) repository formatting functions for more details.
### Inference
Evaluations can easily be run using our [flow-judge](https://github.com/flowaicom/flow-judge) library. It currently supports both Transformers and vllm engine.
To run Flow Judge efficiently, ensure your hardware meets the following requirements:
- Modern GPU with at least 4 GB VRAM (e.g., NVIDIA RTX series)
- Minimum of 8 GB of system memory
- At least 10GB of free storage for model files and dependencies.
## Evaluation
### Held-out test sets
<table border="1" cellpadding="10" cellspacing="0" style="border-collapse: collapse; width: auto;">
<thead>
<tr>
<th rowspan="2" style="text-align: left;">Evaluator</th>
<th colspan="3" style="text-align: center;">Pass / Fail Held-out Test set</th>
</tr>
<tr>
<th style="text-align: center;">Precision</th>
<th style="text-align: center;">Recall</th>
<th style="text-align: center;">F1</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left;">microsoft/Phi-3.5-mini-instruct</td>
<td style="text-align: center;">0.685</td>
<td style="text-align: center;"><strong>1.000</strong></td>
<td style="text-align: center;">0.813</td>
</tr>
<tr>
<td style="text-align: left;">meta-llama/Meta-Llama-3.1-8B-Instruct</td>
<td style="text-align: center;"><u>0.870</u></td>
<td style="text-align: center;">0.982</td>
<td style="text-align: center;"><u>0.923</u></td>
</tr>
<tr>
<td style="text-align: left;">mistralai/Mistral-Nemo-Instruct-2407</td>
<td style="text-align: center;">0.709</td>
<td style="text-align: center;"><u>0.994</u></td>
<td style="text-align: center;">0.827</td>
</tr>
<tr>
<td style="text-align: left;">gpt-4o-mini</td>
<td style="text-align: center;">0.834</td>
<td style="text-align: center;">1.000</td>
<td style="text-align: center;">0.910</td>
</tr>
<tr>
<td style="text-align: left;">flowaicom/Flow-Judge-v0.1</td>
<td style="text-align: center;"><strong>0.940</strong></td>
<td style="text-align: center;">0.972</td>
<td style="text-align: center;"><strong>0.955</strong></td>
</tr>
</tbody>
</table>
<table border="1" cellpadding="10" cellspacing="0" style="border-collapse: collapse; width: auto;">
<thead>
<tr>
<th rowspan="2" style="text-align: left;">Evaluator</th>
<th colspan="3" style="text-align: center;">3-Likert Held-out Test set</th>
<th colspan="3" style="text-align: center;">5-Likert Held-out Test set</th>
</tr>
<tr>
<th style="text-align: center;">pearsonr</th>
<th style="text-align: center;">spearmanr</th>
<th style="text-align: center;">kendall-tau</th>
<th style="text-align: center;">pearsonr</th>
<th style="text-align: center;">spearmanr</th>
<th style="text-align: center;">kendall-tau</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left;">microsoft/Phi-3.5-mini-instruct</td>
<td style="text-align: center;">0.756</td>
<td style="text-align: center;">0.749</td>
<td style="text-align: center;">0.695</td>
<td style="text-align: center;">0.808</td>
<td style="text-align: center;">0.819</td>
<td style="text-align: center;">0.739</td>
</tr>
<tr>
<td style="text-align: left;">prometheus-eval/prometheus-7b-v2.0*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><u>0.910</u></td>
<td style="text-align: center;"><u>0.908</u></td>
<td style="text-align: center;"><u>0.838</u></td>
</tr>
<tr>
<td style="text-align: left;">meta-llama/Meta-Llama-3.1-8B-Instruct</td>
<td style="text-align: center;"><u>0.836</u></td>
<td style="text-align: center;"><u>0.833</u></td>
<td style="text-align: center;"><u>0.789</u></td>
<td style="text-align: center;">0.854</td>
<td style="text-align: center;">0.868</td>
<td style="text-align: center;">0.791</td>
</tr>
<tr>
<td style="text-align: left;">mistralai/Mistral-Nemo-Instruct-2407</td>
<td style="text-align: center;">0.813</td>
<td style="text-align: center;">0.807</td>
<td style="text-align: center;">0.758</td>
<td style="text-align: center;">0.870</td>
<td style="text-align: center;">0.867</td>
<td style="text-align: center;">0.789</td>
</tr>
<tr>
<td style="text-align: left;">gpt-4o-mini</td>
<td style="text-align: center;">0.890</td>
<td style="text-align: center;">0.888</td>
<td style="text-align: center;">0.851</td>
<td style="text-align: center;">0.923</td>
<td style="text-align: center;">0.923</td>
<td style="text-align: center;">0.864</td>
</tr>
<tr>
<td style="text-align: left;">flowaicom/Flow-Judge-v0.1</td>
<td style="text-align: center;"><strong>0.888</strong></td>
<td style="text-align: center;"><strong>0.888</strong></td>
<td style="text-align: center;"><strong>0.852</strong></td>
<td style="text-align: center;"><strong>0.919</strong></td>
<td style="text-align: center;"><strong>0.919</strong></td>
<td style="text-align: center;"><strong>0.856</strong></td>
</tr>
</tbody>
</table>
\* _reported in model paper_
### RAGTruth
<table border="1" cellpadding="10" cellspacing="0" style="border-collapse: collapse; width: auto;">
<tr>
<th rowspan="2" style="text-align: left;">Evaluator</th>
<th colspan="3" style="text-align:center;">RAGTruth QA</th>
<th colspan="3" style="text-align:center;">RAGTruth Data-to-Text</th>
<th colspan="3" style="text-align:center;">RAGTruth Summarization</th>
</tr>
<tr>
<th style="text-align:center;">Precision</th>
<th style="text-align:center;">Recall</th>
<th style="text-align:center;">F1</th>
<th style="text-align:center;">Precision</th>
<th style="text-align:center;">Recall</th>
<th style="text-align:center;">F1</th>
<th style="text-align:center;">Precision</th>
<th style="text-align:center;">Recall</th>
<th style="text-align:center;">F1</th>
</tr>
<tr>
<td>microsoft/Phi-3.5-mini-instruct</td>
<td style="text-align:center;">0.817</td>
<td style="text-align:center;">0.963</td>
<td style="text-align:center;">0.884</td>
<td style="text-align:center;">0.356</td>
<td style="text-align:center;"><strong>1.000</strong></td>
<td style="text-align:center;">0.525</td>
<td style="text-align:center;">0.776</td>
<td style="text-align:center;"><strong>1.000</strong></td>
<td style="text-align:center;"><strong>0.874</strong></td>
</tr>
<tr>
<td>meta-llama/Meta-Llama-3.1-8B-Instruct</td>
<td style="text-align:center;"><strong>0.844</strong></td>
<td style="text-align:center;"><u>0.986</u></td>
<td style="text-align:center;"><strong>0.910</strong></td>
<td style="text-align:center;">0.382</td>
<td style="text-align:center;">0.537</td>
<td style="text-align:center;">0.447</td>
<td style="text-align:center;"><u>0.797</u></td>
<td style="text-align:center;"><u>0.940</u></td>
<td style="text-align:center;">0.863</td>
</tr>
<tr>
<td>mistralai/Mistral-Nemo-Instruct-2407</td>
<td style="text-align:center;">0.821</td>
<td style="text-align:center;"><strong>0.995</strong></td>
<td style="text-align:center;"><u>0.900</u></td>
<td style="text-align:center;">0.357</td>
<td style="text-align:center;"><strong>1.000</strong></td>
<td style="text-align:center;">0.526</td>
<td style="text-align:center;">0.775</td>
<td style="text-align:center;"><strong>1.000</strong></td>
<td style="text-align:center;"><u>0.873</u></td>
</tr>
<tr>
<td>gpt-4o-mini</td>
<td style="text-align:center;">0.830</td>
<td style="text-align:center;">0.966</td>
<td style="text-align:center;">0.893</td>
<td style="text-align:center;">0.398</td>
<td style="text-align:center;">0.994</td>
<td style="text-align:center;">0.569</td>
<td style="text-align:center;">0.786</td>
<td style="text-align:center;">0.997</td>
<td style="text-align:center;">0.879</td>
</tr>
<tr>
<td>Luna*</td>
<td style="text-align:center;">0.378</td>
<td style="text-align:center;">0.800</td>
<td style="text-align:center;">0.513</td>
<td style="text-align:center;">0.649</td>
<td style="text-align:center;">0.912</td>
<td style="text-align:center;"><u>0.759</u></td>
<td style="text-align:center;">0.400</td>
<td style="text-align:center;">0.765</td>
<td style="text-align:center;">0.525</td>
</tr>
<tr>
<td>RAGAS Faithfulness*</td>
<td style="text-align:center;">0.312</td>
<td style="text-align:center;">0.419</td>
<td style="text-align:center;">0.357</td>
<td style="text-align:center;"><strong>0.792</strong></td>
<td style="text-align:center;">0.508</td>
<td style="text-align:center;">0.619</td>
<td style="text-align:center;">0.642</td>
<td style="text-align:center;">0.299</td>
<td style="text-align:center;">0.408</td>
</tr>
<tr>
<td>Trulens Groundedness*</td>
<td style="text-align:center;">0.228</td>
<td style="text-align:center;">0.925</td>
<td style="text-align:center;">0.366</td>
<td style="text-align:center;"><u>0.669</u></td>
<td style="text-align:center;"><u>0.965</u></td>
<td style="text-align:center;"><strong>0.790</strong></td>
<td style="text-align:center;">0.402</td>
<td style="text-align:center;">0.500</td>
<td style="text-align:center;">0.445</td>
</tr>
<tr>
<td>flowaicom/Flow-Judge-v0.1</td>
<td style="text-align:center;"><u>0.835</u></td>
<td style="text-align:center;">0.961</td>
<td style="text-align:center;">0.894</td>
<td style="text-align:center;">0.541</td>
<td style="text-align:center;">0.249</td>
<td style="text-align:center;">0.341</td>
<td style="text-align:center;"><strong>0.834</strong></td>
<td style="text-align:center;">0.836</td>
<td style="text-align:center;">0.835</td>
</tr>
</table>
\* _reported in model paper_
### HaluEval, Covid-QA, PubMedQA
<table border="1" cellpadding="10" cellspacing="0" style="border-collapse: collapse; width: auto;">
<thead>
<tr>
<th rowspan="2" style="text-align: left;">Evaluator</th>
<th colspan="4" style="text-align: center;">HaluEval</th>
<th colspan="4" style="text-align: center;">Covid-QA</th>
<th colspan="4" style="text-align: center;">PubMedQA</th>
</tr>
<tr>
<th style="text-align: center;">Precision</th>
<th style="text-align: center;">Recall</th>
<th style="text-align: center;">F1</th>
<th style="text-align: center;">Accuracy</th>
<th style="text-align: center;">Precision</th>
<th style="text-align: center;">Recall</th>
<th style="text-align: center;">F1</th>
<th style="text-align: center;">Accuracy</th>
<th style="text-align: center;">Precision</th>
<th style="text-align: center;">Recall</th>
<th style="text-align: center;">F1</th>
<th style="text-align: center;">Accuracy</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left;">microsoft/Phi-3.5-mini-instruct</td>
<td style="text-align: center;">0.730</td>
<td style="text-align: center;"><u>0.914</u></td>
<td style="text-align: center;">0.812</td>
<td style="text-align: center;">0.788</td>
<td style="text-align: center;">0.617</td>
<td style="text-align: center;">0.964</td>
<td style="text-align: center;">0.752</td>
<td style="text-align: center;">0.681</td>
<td style="text-align: center;">0.623</td>
<td style="text-align: center;"><u>0.986</u></td>
<td style="text-align: center;">0.764</td>
<td style="text-align: center;">0.696</td>
</tr>
<tr>
<td style="text-align: left;">meta-llama/Meta-Llama-3.1-8B-Instruct</td>
<td style="text-align: center;"><strong>0.864</strong></td>
<td style="text-align: center;">0.891</td>
<td style="text-align: center;"><strong>0.878</strong></td>
<td style="text-align: center;"><u>0.874</u></td>
<td style="text-align: center;"><u>0.663</u></td>
<td style="text-align: center;"><u>0.976</u></td>
<td style="text-align: center;"><u>0.790</u></td>
<td style="text-align: center;">0.734</td>
<td style="text-align: center;"><u>0.681</u></td>
<td style="text-align: center;">0.962</td>
<td style="text-align: center;"><strong>0.797</strong></td>
<td style="text-align: center;">0.750</td>
</tr>
<tr>
<td style="text-align: left;">mistralai/Mistral-Nemo-Instruct-2407</td>
<td style="text-align: center;">0.655</td>
<td style="text-align: center;"><strong>0.993</strong></td>
<td style="text-align: center;">0.789</td>
<td style="text-align: center;">0.735</td>
<td style="text-align: center;">0.651</td>
<td style="text-align: center;"><strong>0.982</strong></td>
<td style="text-align: center;">0.783</td>
<td style="text-align: center;">0.728</td>
<td style="text-align: center;">0.602</td>
<td style="text-align: center;"><strong>0.994</strong></td>
<td style="text-align: center;"><u>0.750</u></td>
<td style="text-align: center;">0.669</td>
</tr>
<tr>
<td style="text-align: left;">gpt-4o-mini</td>
<td style="text-align: center;">0.846</td>
<td style="text-align: center;">0.940</td>
<td style="text-align: center;">0.891</td>
<td style="text-align: center;">0.885</td>
<td style="text-align: center;">0.795</td>
<td style="text-align: center;">0.964</td>
<td style="text-align: center;">0.872</td>
<td style="text-align: center;">0.858</td>
<td style="text-align: center;">0.791</td>
<td style="text-align: center;">0.904</td>
<td style="text-align: center;">0.843</td>
<td style="text-align: center;">0.832</td>
</tr>
<tr>
<td style="text-align: left;">flowaicom/Flow-Judge-v0.1</td>
<td style="text-align: center;"><u>0.826</u></td>
<td style="text-align: center;">0.895</td>
<td style="text-align: center;"><u>0.859</u></td>
<td style="text-align: center;">0.854</td>
<td style="text-align: center;"><strong>0.767</strong></td>
<td style="text-align: center;">0.877</td>
<td style="text-align: center;"><strong>0.818</strong></td>
<td style="text-align: center;">0.807</td>
<td style="text-align: center;"><strong>0.874</strong></td>
<td style="text-align: center;">0.624</td>
<td style="text-align: center;">0.728</td>
<td style="text-align: center;">0.767</td>
</tr>
<tr>
<td style="text-align: left;">gpt-4o*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.879</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.821</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.821</td>
</tr>
<tr>
<td style="text-align: left;">Claude 3 Sonnet*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.845</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.829</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.829</td>
</tr>
<tr>
<td style="text-align: left;">RAGAS Faithfulness*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.706</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.750</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.669</td>
</tr>
<tr>
<td style="text-align: left;">Lynx 8B*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">0.857</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><u>0.963</u></td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><u>0.852</u></td>
</tr>
<tr>
<td style="text-align: left;">Lynx 70B*</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><strong>0.884</strong></td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><strong>0.975</strong></td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;">-</td>
<td style="text-align: center;"><strong>0.904</strong></td>
</tr>
</tbody>
</table>
\* _reported in model paper_
### Feedback Bench
<table border="1" cellpadding="10" cellspacing="0" style="border-collapse: collapse; width: auto;">
<tr>
<th rowspan="2">Evaluator</th>
<th colspan="3" style="text-align:center;">Feedback bench</th>
</tr>
<tr>
<th style="text-align:center;">pearsonr</th>
<th style="text-align:center;">spearmanr</th>
<th style="text-align:center;">kendall-tau</th>
</tr>
<tr>
<td>microsoft/Phi-3.5-mini-instruct</td>
<td style="text-align:center;">0.710</td>
<td style="text-align:center;">0.721</td>
<td style="text-align:center;">0.622</td>
</tr>
<tr>
<td>prometheus-eval/prometheus-7b-v2.0*</td>
<td style="text-align:center;"><strong>0.878</strong></td>
<td style="text-align:center;"><strong>0.909</strong></td>
<td style="text-align:center;"><strong>0.773</strong></td>
</tr>
<tr>
<td>meta-llama/Meta-Llama-3.1-8B-Instruct</td>
<td style="text-align:center;">0.742</td>
<td style="text-align:center;">0.749</td>
<td style="text-align:center;">0.654</td>
</tr>
<tr>
<td>mistralai/Mistral-Nemo-Instruct-2407</td>
<td style="text-align:center;">0.720</td>
<td style="text-align:center;">0.724</td>
<td style="text-align:center;">0.632</td>
</tr>
<tr>
<td>gpt-4o-mini</td>
<td style="text-align:center;">0.797</td>
<td style="text-align:center;">0.795</td>
<td style="text-align:center;">0.701</td>
</tr>
<tr>
<td>flowaicom/Flow-Judge-v0.1</td>
<td style="text-align:center;"><u>0.787</u></td>
<td style="text-align:center;"><u>0.789</u></td>
<td style="text-align:center;"><u>0.688</u></td>
</tr>
</table>
\* _reported in model paper using reference answers_
## License
We opted for the Apache 2.0 license for Flow Judge to provide the community with an open, small yet powerful LM evaluator. Our goal is to support the wider adoption of rigorous evaluation techniques in LLM system development, making them more accessible to practitioners and researchers.
## Limitations and future work
Multilingual evaluation: Flow Judge has been fine-tuned exclusively on English data. While the foundation model (Phi-3.5-mini-instruct [17]) may possess multilingual capabilities, we have not systematically evaluated Flow Judge's performance in non-English contexts. We plan to explore multilingual LM evaluators in the future.
Long context and structured inputs: Our training dataset encompasses a wide range of custom metrics relevant to evaluating LLM systems. However, it does not include examples with long context inputs or structured data formats such as JSON, since these are harder to generate synthetically. This limitation may impact Flow Judge's performance when evaluating responses that require processing extensive context or parsing structured input. Extending our model’s capabilities to handle these input types represents an important area for future research.
Math and coding: The current version has not been trained on specific task domains such as arithmetic problems or code evaluation. As a result, its performance in these specialized areas may be limited. Future iterations of the model should address these gaps.
Domain-specific knowledge and complex multi-step evaluations: Flow Judge may struggle with highly specialized domain knowledge or proprietary data outside the training scope of its foundation model. Additionally, evaluation tasks requiring multi-step reasoning or complex logical processes may challenge the model's capabilities. We strongly recommend conducting meta-evaluations of the model performance before deploying it in specialized or highly complex evaluation scenarios. | [
"SUMMARIZATION"
] | [
"PUBMEDQA"
] |
mav23/gte-Qwen2-1.5B-instruct-GGUF | mav23 | sentence-similarity | [
"sentence-transformers",
"gguf",
"mteb",
"transformers",
"Qwen2",
"sentence-similarity",
"arxiv:2308.03281",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"conversational"
] | 2024-10-11T14:04:27 | 2024-10-11T14:18:45 | 631 | 2 | ---
license: apache-2.0
tags:
- mteb
- sentence-transformers
- transformers
- Qwen2
- sentence-similarity
model-index:
- name: gte-qwen2-7B-instruct
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 83.98507462686567
- type: ap
value: 50.93015252587014
- type: f1
value: 78.50416599051215
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 96.61065
- type: ap
value: 94.89174052954196
- type: f1
value: 96.60942596940565
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 55.614000000000004
- type: f1
value: 54.90553480294904
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 45.164
- type: map_at_10
value: 61.519
- type: map_at_100
value: 61.769
- type: map_at_1000
value: 61.769
- type: map_at_3
value: 57.443999999999996
- type: map_at_5
value: 60.058
- type: mrr_at_1
value: 46.088
- type: mrr_at_10
value: 61.861
- type: mrr_at_100
value: 62.117999999999995
- type: mrr_at_1000
value: 62.117999999999995
- type: mrr_at_3
value: 57.729
- type: mrr_at_5
value: 60.392
- type: ndcg_at_1
value: 45.164
- type: ndcg_at_10
value: 69.72
- type: ndcg_at_100
value: 70.719
- type: ndcg_at_1000
value: 70.719
- type: ndcg_at_3
value: 61.517999999999994
- type: ndcg_at_5
value: 66.247
- type: precision_at_1
value: 45.164
- type: precision_at_10
value: 9.545
- type: precision_at_100
value: 0.996
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 24.443
- type: precision_at_5
value: 16.97
- type: recall_at_1
value: 45.164
- type: recall_at_10
value: 95.448
- type: recall_at_100
value: 99.644
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 73.329
- type: recall_at_5
value: 84.851
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 50.511868162026175
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 45.007803189284004
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 64.55292107723382
- type: mrr
value: 77.66158818097877
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 85.65459047085452
- type: cos_sim_spearman
value: 82.10729255710761
- type: euclidean_pearson
value: 82.78079159312476
- type: euclidean_spearman
value: 80.50002701880933
- type: manhattan_pearson
value: 82.41372641383016
- type: manhattan_spearman
value: 80.57412509272639
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 87.30844155844156
- type: f1
value: 87.25307322443255
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 43.20754608934859
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 38.818037697335505
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 35.423
- type: map_at_10
value: 47.198
- type: map_at_100
value: 48.899
- type: map_at_1000
value: 49.004
- type: map_at_3
value: 43.114999999999995
- type: map_at_5
value: 45.491
- type: mrr_at_1
value: 42.918
- type: mrr_at_10
value: 53.299
- type: mrr_at_100
value: 54.032000000000004
- type: mrr_at_1000
value: 54.055
- type: mrr_at_3
value: 50.453
- type: mrr_at_5
value: 52.205999999999996
- type: ndcg_at_1
value: 42.918
- type: ndcg_at_10
value: 53.98
- type: ndcg_at_100
value: 59.57
- type: ndcg_at_1000
value: 60.879000000000005
- type: ndcg_at_3
value: 48.224000000000004
- type: ndcg_at_5
value: 50.998
- type: precision_at_1
value: 42.918
- type: precision_at_10
value: 10.299999999999999
- type: precision_at_100
value: 1.687
- type: precision_at_1000
value: 0.211
- type: precision_at_3
value: 22.842000000000002
- type: precision_at_5
value: 16.681
- type: recall_at_1
value: 35.423
- type: recall_at_10
value: 66.824
- type: recall_at_100
value: 89.564
- type: recall_at_1000
value: 97.501
- type: recall_at_3
value: 50.365
- type: recall_at_5
value: 57.921
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 33.205
- type: map_at_10
value: 44.859
- type: map_at_100
value: 46.135
- type: map_at_1000
value: 46.259
- type: map_at_3
value: 41.839
- type: map_at_5
value: 43.662
- type: mrr_at_1
value: 41.146
- type: mrr_at_10
value: 50.621
- type: mrr_at_100
value: 51.207
- type: mrr_at_1000
value: 51.246
- type: mrr_at_3
value: 48.535000000000004
- type: mrr_at_5
value: 49.818
- type: ndcg_at_1
value: 41.146
- type: ndcg_at_10
value: 50.683
- type: ndcg_at_100
value: 54.82
- type: ndcg_at_1000
value: 56.69
- type: ndcg_at_3
value: 46.611000000000004
- type: ndcg_at_5
value: 48.66
- type: precision_at_1
value: 41.146
- type: precision_at_10
value: 9.439
- type: precision_at_100
value: 1.465
- type: precision_at_1000
value: 0.194
- type: precision_at_3
value: 22.59
- type: precision_at_5
value: 15.86
- type: recall_at_1
value: 33.205
- type: recall_at_10
value: 61.028999999999996
- type: recall_at_100
value: 78.152
- type: recall_at_1000
value: 89.59700000000001
- type: recall_at_3
value: 49.05
- type: recall_at_5
value: 54.836
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 41.637
- type: map_at_10
value: 55.162
- type: map_at_100
value: 56.142
- type: map_at_1000
value: 56.188
- type: map_at_3
value: 51.564
- type: map_at_5
value: 53.696
- type: mrr_at_1
value: 47.524
- type: mrr_at_10
value: 58.243
- type: mrr_at_100
value: 58.879999999999995
- type: mrr_at_1000
value: 58.9
- type: mrr_at_3
value: 55.69499999999999
- type: mrr_at_5
value: 57.284
- type: ndcg_at_1
value: 47.524
- type: ndcg_at_10
value: 61.305
- type: ndcg_at_100
value: 65.077
- type: ndcg_at_1000
value: 65.941
- type: ndcg_at_3
value: 55.422000000000004
- type: ndcg_at_5
value: 58.516
- type: precision_at_1
value: 47.524
- type: precision_at_10
value: 9.918000000000001
- type: precision_at_100
value: 1.276
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 24.765
- type: precision_at_5
value: 17.204
- type: recall_at_1
value: 41.637
- type: recall_at_10
value: 76.185
- type: recall_at_100
value: 92.149
- type: recall_at_1000
value: 98.199
- type: recall_at_3
value: 60.856
- type: recall_at_5
value: 68.25099999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 26.27
- type: map_at_10
value: 37.463
- type: map_at_100
value: 38.434000000000005
- type: map_at_1000
value: 38.509
- type: map_at_3
value: 34.226
- type: map_at_5
value: 36.161
- type: mrr_at_1
value: 28.588
- type: mrr_at_10
value: 39.383
- type: mrr_at_100
value: 40.23
- type: mrr_at_1000
value: 40.281
- type: mrr_at_3
value: 36.422
- type: mrr_at_5
value: 38.252
- type: ndcg_at_1
value: 28.588
- type: ndcg_at_10
value: 43.511
- type: ndcg_at_100
value: 48.274
- type: ndcg_at_1000
value: 49.975
- type: ndcg_at_3
value: 37.319
- type: ndcg_at_5
value: 40.568
- type: precision_at_1
value: 28.588
- type: precision_at_10
value: 6.893000000000001
- type: precision_at_100
value: 0.9900000000000001
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 16.347
- type: precision_at_5
value: 11.661000000000001
- type: recall_at_1
value: 26.27
- type: recall_at_10
value: 60.284000000000006
- type: recall_at_100
value: 81.902
- type: recall_at_1000
value: 94.43
- type: recall_at_3
value: 43.537
- type: recall_at_5
value: 51.475
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 18.168
- type: map_at_10
value: 28.410000000000004
- type: map_at_100
value: 29.78
- type: map_at_1000
value: 29.892999999999997
- type: map_at_3
value: 25.238
- type: map_at_5
value: 26.96
- type: mrr_at_1
value: 23.507
- type: mrr_at_10
value: 33.382
- type: mrr_at_100
value: 34.404
- type: mrr_at_1000
value: 34.467999999999996
- type: mrr_at_3
value: 30.637999999999998
- type: mrr_at_5
value: 32.199
- type: ndcg_at_1
value: 23.507
- type: ndcg_at_10
value: 34.571000000000005
- type: ndcg_at_100
value: 40.663
- type: ndcg_at_1000
value: 43.236000000000004
- type: ndcg_at_3
value: 29.053
- type: ndcg_at_5
value: 31.563999999999997
- type: precision_at_1
value: 23.507
- type: precision_at_10
value: 6.654
- type: precision_at_100
value: 1.113
- type: precision_at_1000
value: 0.146
- type: precision_at_3
value: 14.427999999999999
- type: precision_at_5
value: 10.498000000000001
- type: recall_at_1
value: 18.168
- type: recall_at_10
value: 48.443000000000005
- type: recall_at_100
value: 74.47
- type: recall_at_1000
value: 92.494
- type: recall_at_3
value: 33.379999999999995
- type: recall_at_5
value: 39.76
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 32.39
- type: map_at_10
value: 44.479
- type: map_at_100
value: 45.977000000000004
- type: map_at_1000
value: 46.087
- type: map_at_3
value: 40.976
- type: map_at_5
value: 43.038
- type: mrr_at_1
value: 40.135
- type: mrr_at_10
value: 50.160000000000004
- type: mrr_at_100
value: 51.052
- type: mrr_at_1000
value: 51.087
- type: mrr_at_3
value: 47.818
- type: mrr_at_5
value: 49.171
- type: ndcg_at_1
value: 40.135
- type: ndcg_at_10
value: 50.731
- type: ndcg_at_100
value: 56.452000000000005
- type: ndcg_at_1000
value: 58.123000000000005
- type: ndcg_at_3
value: 45.507
- type: ndcg_at_5
value: 48.11
- type: precision_at_1
value: 40.135
- type: precision_at_10
value: 9.192
- type: precision_at_100
value: 1.397
- type: precision_at_1000
value: 0.169
- type: precision_at_3
value: 21.816
- type: precision_at_5
value: 15.476
- type: recall_at_1
value: 32.39
- type: recall_at_10
value: 63.597
- type: recall_at_100
value: 86.737
- type: recall_at_1000
value: 97.039
- type: recall_at_3
value: 48.906
- type: recall_at_5
value: 55.659000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 28.397
- type: map_at_10
value: 39.871
- type: map_at_100
value: 41.309000000000005
- type: map_at_1000
value: 41.409
- type: map_at_3
value: 36.047000000000004
- type: map_at_5
value: 38.104
- type: mrr_at_1
value: 34.703
- type: mrr_at_10
value: 44.773
- type: mrr_at_100
value: 45.64
- type: mrr_at_1000
value: 45.678999999999995
- type: mrr_at_3
value: 41.705
- type: mrr_at_5
value: 43.406
- type: ndcg_at_1
value: 34.703
- type: ndcg_at_10
value: 46.271
- type: ndcg_at_100
value: 52.037
- type: ndcg_at_1000
value: 53.81700000000001
- type: ndcg_at_3
value: 39.966
- type: ndcg_at_5
value: 42.801
- type: precision_at_1
value: 34.703
- type: precision_at_10
value: 8.744
- type: precision_at_100
value: 1.348
- type: precision_at_1000
value: 0.167
- type: precision_at_3
value: 19.102
- type: precision_at_5
value: 13.836
- type: recall_at_1
value: 28.397
- type: recall_at_10
value: 60.299
- type: recall_at_100
value: 84.595
- type: recall_at_1000
value: 96.155
- type: recall_at_3
value: 43.065
- type: recall_at_5
value: 50.371
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 28.044333333333338
- type: map_at_10
value: 38.78691666666666
- type: map_at_100
value: 40.113
- type: map_at_1000
value: 40.22125
- type: map_at_3
value: 35.52966666666667
- type: map_at_5
value: 37.372749999999996
- type: mrr_at_1
value: 33.159083333333335
- type: mrr_at_10
value: 42.913583333333335
- type: mrr_at_100
value: 43.7845
- type: mrr_at_1000
value: 43.830333333333336
- type: mrr_at_3
value: 40.29816666666667
- type: mrr_at_5
value: 41.81366666666667
- type: ndcg_at_1
value: 33.159083333333335
- type: ndcg_at_10
value: 44.75750000000001
- type: ndcg_at_100
value: 50.13658333333334
- type: ndcg_at_1000
value: 52.037
- type: ndcg_at_3
value: 39.34258333333334
- type: ndcg_at_5
value: 41.93708333333333
- type: precision_at_1
value: 33.159083333333335
- type: precision_at_10
value: 7.952416666666667
- type: precision_at_100
value: 1.2571666666666668
- type: precision_at_1000
value: 0.16099999999999998
- type: precision_at_3
value: 18.303833333333337
- type: precision_at_5
value: 13.057083333333333
- type: recall_at_1
value: 28.044333333333338
- type: recall_at_10
value: 58.237249999999996
- type: recall_at_100
value: 81.35391666666666
- type: recall_at_1000
value: 94.21283333333334
- type: recall_at_3
value: 43.32341666666667
- type: recall_at_5
value: 49.94908333333333
- type: map_at_1
value: 18.398
- type: map_at_10
value: 27.929
- type: map_at_100
value: 29.032999999999998
- type: map_at_1000
value: 29.126
- type: map_at_3
value: 25.070999999999998
- type: map_at_5
value: 26.583000000000002
- type: mrr_at_1
value: 19.963
- type: mrr_at_10
value: 29.997
- type: mrr_at_100
value: 30.9
- type: mrr_at_1000
value: 30.972
- type: mrr_at_3
value: 27.264
- type: mrr_at_5
value: 28.826
- type: ndcg_at_1
value: 19.963
- type: ndcg_at_10
value: 33.678999999999995
- type: ndcg_at_100
value: 38.931
- type: ndcg_at_1000
value: 41.379
- type: ndcg_at_3
value: 28.000000000000004
- type: ndcg_at_5
value: 30.637999999999998
- type: precision_at_1
value: 19.963
- type: precision_at_10
value: 5.7299999999999995
- type: precision_at_100
value: 0.902
- type: precision_at_1000
value: 0.122
- type: precision_at_3
value: 12.631
- type: precision_at_5
value: 9.057
- type: recall_at_1
value: 18.398
- type: recall_at_10
value: 49.254
- type: recall_at_100
value: 73.182
- type: recall_at_1000
value: 91.637
- type: recall_at_3
value: 34.06
- type: recall_at_5
value: 40.416000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 27.838
- type: map_at_10
value: 36.04
- type: map_at_100
value: 37.113
- type: map_at_1000
value: 37.204
- type: map_at_3
value: 33.585
- type: map_at_5
value: 34.845
- type: mrr_at_1
value: 30.982
- type: mrr_at_10
value: 39.105000000000004
- type: mrr_at_100
value: 39.98
- type: mrr_at_1000
value: 40.042
- type: mrr_at_3
value: 36.912
- type: mrr_at_5
value: 38.062000000000005
- type: ndcg_at_1
value: 30.982
- type: ndcg_at_10
value: 40.982
- type: ndcg_at_100
value: 46.092
- type: ndcg_at_1000
value: 48.25
- type: ndcg_at_3
value: 36.41
- type: ndcg_at_5
value: 38.379999999999995
- type: precision_at_1
value: 30.982
- type: precision_at_10
value: 6.534
- type: precision_at_100
value: 0.9820000000000001
- type: precision_at_1000
value: 0.124
- type: precision_at_3
value: 15.745999999999999
- type: precision_at_5
value: 10.828
- type: recall_at_1
value: 27.838
- type: recall_at_10
value: 52.971000000000004
- type: recall_at_100
value: 76.357
- type: recall_at_1000
value: 91.973
- type: recall_at_3
value: 40.157
- type: recall_at_5
value: 45.147999999999996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 19.059
- type: map_at_10
value: 27.454
- type: map_at_100
value: 28.736
- type: map_at_1000
value: 28.865000000000002
- type: map_at_3
value: 24.773999999999997
- type: map_at_5
value: 26.266000000000002
- type: mrr_at_1
value: 23.125
- type: mrr_at_10
value: 31.267
- type: mrr_at_100
value: 32.32
- type: mrr_at_1000
value: 32.394
- type: mrr_at_3
value: 28.894
- type: mrr_at_5
value: 30.281000000000002
- type: ndcg_at_1
value: 23.125
- type: ndcg_at_10
value: 32.588
- type: ndcg_at_100
value: 38.432
- type: ndcg_at_1000
value: 41.214
- type: ndcg_at_3
value: 27.938000000000002
- type: ndcg_at_5
value: 30.127
- type: precision_at_1
value: 23.125
- type: precision_at_10
value: 5.9639999999999995
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.148
- type: precision_at_3
value: 13.294
- type: precision_at_5
value: 9.628
- type: recall_at_1
value: 19.059
- type: recall_at_10
value: 44.25
- type: recall_at_100
value: 69.948
- type: recall_at_1000
value: 89.35300000000001
- type: recall_at_3
value: 31.114000000000004
- type: recall_at_5
value: 36.846000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 28.355999999999998
- type: map_at_10
value: 39.055
- type: map_at_100
value: 40.486
- type: map_at_1000
value: 40.571
- type: map_at_3
value: 35.69
- type: map_at_5
value: 37.605
- type: mrr_at_1
value: 33.302
- type: mrr_at_10
value: 42.986000000000004
- type: mrr_at_100
value: 43.957
- type: mrr_at_1000
value: 43.996
- type: mrr_at_3
value: 40.111999999999995
- type: mrr_at_5
value: 41.735
- type: ndcg_at_1
value: 33.302
- type: ndcg_at_10
value: 44.962999999999994
- type: ndcg_at_100
value: 50.917
- type: ndcg_at_1000
value: 52.622
- type: ndcg_at_3
value: 39.182
- type: ndcg_at_5
value: 41.939
- type: precision_at_1
value: 33.302
- type: precision_at_10
value: 7.779999999999999
- type: precision_at_100
value: 1.203
- type: precision_at_1000
value: 0.145
- type: precision_at_3
value: 18.035
- type: precision_at_5
value: 12.873000000000001
- type: recall_at_1
value: 28.355999999999998
- type: recall_at_10
value: 58.782000000000004
- type: recall_at_100
value: 84.02199999999999
- type: recall_at_1000
value: 95.511
- type: recall_at_3
value: 43.126999999999995
- type: recall_at_5
value: 50.14999999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 27.391
- type: map_at_10
value: 37.523
- type: map_at_100
value: 39.312000000000005
- type: map_at_1000
value: 39.54
- type: map_at_3
value: 34.231
- type: map_at_5
value: 36.062
- type: mrr_at_1
value: 32.016
- type: mrr_at_10
value: 41.747
- type: mrr_at_100
value: 42.812
- type: mrr_at_1000
value: 42.844
- type: mrr_at_3
value: 39.129999999999995
- type: mrr_at_5
value: 40.524
- type: ndcg_at_1
value: 32.016
- type: ndcg_at_10
value: 43.826
- type: ndcg_at_100
value: 50.373999999999995
- type: ndcg_at_1000
value: 52.318
- type: ndcg_at_3
value: 38.479
- type: ndcg_at_5
value: 40.944
- type: precision_at_1
value: 32.016
- type: precision_at_10
value: 8.280999999999999
- type: precision_at_100
value: 1.6760000000000002
- type: precision_at_1000
value: 0.25
- type: precision_at_3
value: 18.05
- type: precision_at_5
value: 13.083
- type: recall_at_1
value: 27.391
- type: recall_at_10
value: 56.928999999999995
- type: recall_at_100
value: 85.169
- type: recall_at_1000
value: 96.665
- type: recall_at_3
value: 42.264
- type: recall_at_5
value: 48.556
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 19.681
- type: map_at_10
value: 32.741
- type: map_at_100
value: 34.811
- type: map_at_1000
value: 35.003
- type: map_at_3
value: 27.697
- type: map_at_5
value: 30.372
- type: mrr_at_1
value: 44.951
- type: mrr_at_10
value: 56.34400000000001
- type: mrr_at_100
value: 56.961
- type: mrr_at_1000
value: 56.987
- type: mrr_at_3
value: 53.681
- type: mrr_at_5
value: 55.407
- type: ndcg_at_1
value: 44.951
- type: ndcg_at_10
value: 42.905
- type: ndcg_at_100
value: 49.95
- type: ndcg_at_1000
value: 52.917
- type: ndcg_at_3
value: 36.815
- type: ndcg_at_5
value: 38.817
- type: precision_at_1
value: 44.951
- type: precision_at_10
value: 12.989999999999998
- type: precision_at_100
value: 2.068
- type: precision_at_1000
value: 0.263
- type: precision_at_3
value: 27.275
- type: precision_at_5
value: 20.365
- type: recall_at_1
value: 19.681
- type: recall_at_10
value: 48.272999999999996
- type: recall_at_100
value: 71.87400000000001
- type: recall_at_1000
value: 87.929
- type: recall_at_3
value: 32.653999999999996
- type: recall_at_5
value: 39.364
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 10.231
- type: map_at_10
value: 22.338
- type: map_at_100
value: 31.927
- type: map_at_1000
value: 33.87
- type: map_at_3
value: 15.559999999999999
- type: map_at_5
value: 18.239
- type: mrr_at_1
value: 75.0
- type: mrr_at_10
value: 81.303
- type: mrr_at_100
value: 81.523
- type: mrr_at_1000
value: 81.53
- type: mrr_at_3
value: 80.083
- type: mrr_at_5
value: 80.758
- type: ndcg_at_1
value: 64.625
- type: ndcg_at_10
value: 48.687000000000005
- type: ndcg_at_100
value: 52.791
- type: ndcg_at_1000
value: 60.041999999999994
- type: ndcg_at_3
value: 53.757999999999996
- type: ndcg_at_5
value: 50.76500000000001
- type: precision_at_1
value: 75.0
- type: precision_at_10
value: 38.3
- type: precision_at_100
value: 12.025
- type: precision_at_1000
value: 2.3970000000000002
- type: precision_at_3
value: 55.417
- type: precision_at_5
value: 47.5
- type: recall_at_1
value: 10.231
- type: recall_at_10
value: 27.697
- type: recall_at_100
value: 57.409
- type: recall_at_1000
value: 80.547
- type: recall_at_3
value: 16.668
- type: recall_at_5
value: 20.552
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 61.365
- type: f1
value: 56.7540827912991
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 83.479
- type: map_at_10
value: 88.898
- type: map_at_100
value: 89.11
- type: map_at_1000
value: 89.12400000000001
- type: map_at_3
value: 88.103
- type: map_at_5
value: 88.629
- type: mrr_at_1
value: 89.934
- type: mrr_at_10
value: 93.91000000000001
- type: mrr_at_100
value: 93.937
- type: mrr_at_1000
value: 93.938
- type: mrr_at_3
value: 93.62700000000001
- type: mrr_at_5
value: 93.84599999999999
- type: ndcg_at_1
value: 89.934
- type: ndcg_at_10
value: 91.574
- type: ndcg_at_100
value: 92.238
- type: ndcg_at_1000
value: 92.45
- type: ndcg_at_3
value: 90.586
- type: ndcg_at_5
value: 91.16300000000001
- type: precision_at_1
value: 89.934
- type: precision_at_10
value: 10.555
- type: precision_at_100
value: 1.1159999999999999
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 33.588
- type: precision_at_5
value: 20.642
- type: recall_at_1
value: 83.479
- type: recall_at_10
value: 94.971
- type: recall_at_100
value: 97.397
- type: recall_at_1000
value: 98.666
- type: recall_at_3
value: 92.24799999999999
- type: recall_at_5
value: 93.797
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 27.16
- type: map_at_10
value: 45.593
- type: map_at_100
value: 47.762
- type: map_at_1000
value: 47.899
- type: map_at_3
value: 39.237
- type: map_at_5
value: 42.970000000000006
- type: mrr_at_1
value: 52.623
- type: mrr_at_10
value: 62.637
- type: mrr_at_100
value: 63.169
- type: mrr_at_1000
value: 63.185
- type: mrr_at_3
value: 59.928000000000004
- type: mrr_at_5
value: 61.702999999999996
- type: ndcg_at_1
value: 52.623
- type: ndcg_at_10
value: 54.701
- type: ndcg_at_100
value: 61.263
- type: ndcg_at_1000
value: 63.134
- type: ndcg_at_3
value: 49.265
- type: ndcg_at_5
value: 51.665000000000006
- type: precision_at_1
value: 52.623
- type: precision_at_10
value: 15.185
- type: precision_at_100
value: 2.202
- type: precision_at_1000
value: 0.254
- type: precision_at_3
value: 32.767
- type: precision_at_5
value: 24.722
- type: recall_at_1
value: 27.16
- type: recall_at_10
value: 63.309000000000005
- type: recall_at_100
value: 86.722
- type: recall_at_1000
value: 97.505
- type: recall_at_3
value: 45.045
- type: recall_at_5
value: 54.02400000000001
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 42.573
- type: map_at_10
value: 59.373
- type: map_at_100
value: 60.292
- type: map_at_1000
value: 60.358999999999995
- type: map_at_3
value: 56.159000000000006
- type: map_at_5
value: 58.123999999999995
- type: mrr_at_1
value: 85.14500000000001
- type: mrr_at_10
value: 89.25999999999999
- type: mrr_at_100
value: 89.373
- type: mrr_at_1000
value: 89.377
- type: mrr_at_3
value: 88.618
- type: mrr_at_5
value: 89.036
- type: ndcg_at_1
value: 85.14500000000001
- type: ndcg_at_10
value: 68.95
- type: ndcg_at_100
value: 71.95
- type: ndcg_at_1000
value: 73.232
- type: ndcg_at_3
value: 64.546
- type: ndcg_at_5
value: 66.945
- type: precision_at_1
value: 85.14500000000001
- type: precision_at_10
value: 13.865
- type: precision_at_100
value: 1.619
- type: precision_at_1000
value: 0.179
- type: precision_at_3
value: 39.703
- type: precision_at_5
value: 25.718000000000004
- type: recall_at_1
value: 42.573
- type: recall_at_10
value: 69.325
- type: recall_at_100
value: 80.932
- type: recall_at_1000
value: 89.446
- type: recall_at_3
value: 59.553999999999995
- type: recall_at_5
value: 64.294
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 95.8336
- type: ap
value: 93.78862962194073
- type: f1
value: 95.83192650728371
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 23.075000000000003
- type: map_at_10
value: 36.102000000000004
- type: map_at_100
value: 37.257
- type: map_at_1000
value: 37.3
- type: map_at_3
value: 32.144
- type: map_at_5
value: 34.359
- type: mrr_at_1
value: 23.711
- type: mrr_at_10
value: 36.671
- type: mrr_at_100
value: 37.763999999999996
- type: mrr_at_1000
value: 37.801
- type: mrr_at_3
value: 32.775
- type: mrr_at_5
value: 34.977000000000004
- type: ndcg_at_1
value: 23.711
- type: ndcg_at_10
value: 43.361
- type: ndcg_at_100
value: 48.839
- type: ndcg_at_1000
value: 49.88
- type: ndcg_at_3
value: 35.269
- type: ndcg_at_5
value: 39.224
- type: precision_at_1
value: 23.711
- type: precision_at_10
value: 6.866999999999999
- type: precision_at_100
value: 0.96
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 15.096000000000002
- type: precision_at_5
value: 11.083
- type: recall_at_1
value: 23.075000000000003
- type: recall_at_10
value: 65.756
- type: recall_at_100
value: 90.88199999999999
- type: recall_at_1000
value: 98.739
- type: recall_at_3
value: 43.691
- type: recall_at_5
value: 53.15800000000001
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 97.69493844049248
- type: f1
value: 97.55048089616261
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 88.75968992248062
- type: f1
value: 72.26321223399123
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 82.40080699394754
- type: f1
value: 79.62590029057968
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 84.49562878278414
- type: f1
value: 84.0040193313333
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 39.386760057101945
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 37.89687154075537
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 33.94151656057482
- type: mrr
value: 35.32684700746953
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 6.239999999999999
- type: map_at_10
value: 14.862
- type: map_at_100
value: 18.955
- type: map_at_1000
value: 20.694000000000003
- type: map_at_3
value: 10.683
- type: map_at_5
value: 12.674
- type: mrr_at_1
value: 50.15500000000001
- type: mrr_at_10
value: 59.697
- type: mrr_at_100
value: 60.095
- type: mrr_at_1000
value: 60.129999999999995
- type: mrr_at_3
value: 58.35900000000001
- type: mrr_at_5
value: 58.839
- type: ndcg_at_1
value: 48.452
- type: ndcg_at_10
value: 39.341
- type: ndcg_at_100
value: 35.866
- type: ndcg_at_1000
value: 45.111000000000004
- type: ndcg_at_3
value: 44.527
- type: ndcg_at_5
value: 42.946
- type: precision_at_1
value: 50.15500000000001
- type: precision_at_10
value: 29.536
- type: precision_at_100
value: 9.142
- type: precision_at_1000
value: 2.2849999999999997
- type: precision_at_3
value: 41.899
- type: precision_at_5
value: 37.647000000000006
- type: recall_at_1
value: 6.239999999999999
- type: recall_at_10
value: 19.278000000000002
- type: recall_at_100
value: 36.074
- type: recall_at_1000
value: 70.017
- type: recall_at_3
value: 12.066
- type: recall_at_5
value: 15.254000000000001
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 39.75
- type: map_at_10
value: 56.443
- type: map_at_100
value: 57.233999999999995
- type: map_at_1000
value: 57.249
- type: map_at_3
value: 52.032999999999994
- type: map_at_5
value: 54.937999999999995
- type: mrr_at_1
value: 44.728
- type: mrr_at_10
value: 58.939
- type: mrr_at_100
value: 59.489000000000004
- type: mrr_at_1000
value: 59.499
- type: mrr_at_3
value: 55.711999999999996
- type: mrr_at_5
value: 57.89
- type: ndcg_at_1
value: 44.728
- type: ndcg_at_10
value: 63.998999999999995
- type: ndcg_at_100
value: 67.077
- type: ndcg_at_1000
value: 67.40899999999999
- type: ndcg_at_3
value: 56.266000000000005
- type: ndcg_at_5
value: 60.88
- type: precision_at_1
value: 44.728
- type: precision_at_10
value: 10.09
- type: precision_at_100
value: 1.1809999999999998
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 25.145
- type: precision_at_5
value: 17.822
- type: recall_at_1
value: 39.75
- type: recall_at_10
value: 84.234
- type: recall_at_100
value: 97.055
- type: recall_at_1000
value: 99.517
- type: recall_at_3
value: 64.851
- type: recall_at_5
value: 75.343
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 72.085
- type: map_at_10
value: 86.107
- type: map_at_100
value: 86.727
- type: map_at_1000
value: 86.74
- type: map_at_3
value: 83.21
- type: map_at_5
value: 85.06
- type: mrr_at_1
value: 82.94
- type: mrr_at_10
value: 88.845
- type: mrr_at_100
value: 88.926
- type: mrr_at_1000
value: 88.927
- type: mrr_at_3
value: 87.993
- type: mrr_at_5
value: 88.62299999999999
- type: ndcg_at_1
value: 82.97
- type: ndcg_at_10
value: 89.645
- type: ndcg_at_100
value: 90.717
- type: ndcg_at_1000
value: 90.78
- type: ndcg_at_3
value: 86.99900000000001
- type: ndcg_at_5
value: 88.52600000000001
- type: precision_at_1
value: 82.97
- type: precision_at_10
value: 13.569
- type: precision_at_100
value: 1.539
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 38.043
- type: precision_at_5
value: 24.992
- type: recall_at_1
value: 72.085
- type: recall_at_10
value: 96.262
- type: recall_at_100
value: 99.77000000000001
- type: recall_at_1000
value: 99.997
- type: recall_at_3
value: 88.652
- type: recall_at_5
value: 93.01899999999999
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 55.82153952668092
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 62.094465801879295
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.688
- type: map_at_10
value: 15.201999999999998
- type: map_at_100
value: 18.096
- type: map_at_1000
value: 18.481
- type: map_at_3
value: 10.734
- type: map_at_5
value: 12.94
- type: mrr_at_1
value: 28.000000000000004
- type: mrr_at_10
value: 41.101
- type: mrr_at_100
value: 42.202
- type: mrr_at_1000
value: 42.228
- type: mrr_at_3
value: 37.683
- type: mrr_at_5
value: 39.708
- type: ndcg_at_1
value: 28.000000000000004
- type: ndcg_at_10
value: 24.976000000000003
- type: ndcg_at_100
value: 35.129
- type: ndcg_at_1000
value: 40.77
- type: ndcg_at_3
value: 23.787
- type: ndcg_at_5
value: 20.816000000000003
- type: precision_at_1
value: 28.000000000000004
- type: precision_at_10
value: 13.04
- type: precision_at_100
value: 2.761
- type: precision_at_1000
value: 0.41000000000000003
- type: precision_at_3
value: 22.6
- type: precision_at_5
value: 18.52
- type: recall_at_1
value: 5.688
- type: recall_at_10
value: 26.43
- type: recall_at_100
value: 56.02
- type: recall_at_1000
value: 83.21
- type: recall_at_3
value: 13.752
- type: recall_at_5
value: 18.777
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 85.15084859283178
- type: cos_sim_spearman
value: 80.49030614009419
- type: euclidean_pearson
value: 81.84574978672468
- type: euclidean_spearman
value: 79.89787150656818
- type: manhattan_pearson
value: 81.63076538567131
- type: manhattan_spearman
value: 79.69867352121841
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.64097921490992
- type: cos_sim_spearman
value: 77.25370084896514
- type: euclidean_pearson
value: 82.71210826468788
- type: euclidean_spearman
value: 78.50445584994826
- type: manhattan_pearson
value: 82.92580164330298
- type: manhattan_spearman
value: 78.69686891301019
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 87.24596417308994
- type: cos_sim_spearman
value: 87.79454220555091
- type: euclidean_pearson
value: 87.40242561671164
- type: euclidean_spearman
value: 88.25955597373556
- type: manhattan_pearson
value: 87.25160240485849
- type: manhattan_spearman
value: 88.155794979818
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 84.44914233422564
- type: cos_sim_spearman
value: 82.91015471820322
- type: euclidean_pearson
value: 84.7206656630327
- type: euclidean_spearman
value: 83.86408872059216
- type: manhattan_pearson
value: 84.72816725158454
- type: manhattan_spearman
value: 84.01603388572788
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.6168026237477
- type: cos_sim_spearman
value: 88.45414278092397
- type: euclidean_pearson
value: 88.57023240882022
- type: euclidean_spearman
value: 89.04102190922094
- type: manhattan_pearson
value: 88.66695535796354
- type: manhattan_spearman
value: 89.19898476680969
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 84.27925826089424
- type: cos_sim_spearman
value: 85.45291099550461
- type: euclidean_pearson
value: 83.63853036580834
- type: euclidean_spearman
value: 84.33468035821484
- type: manhattan_pearson
value: 83.72778773251596
- type: manhattan_spearman
value: 84.51583132445376
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.67375185692552
- type: cos_sim_spearman
value: 90.32542469203855
- type: euclidean_pearson
value: 89.63513717951847
- type: euclidean_spearman
value: 89.87760271003745
- type: manhattan_pearson
value: 89.28381452982924
- type: manhattan_spearman
value: 89.53568197785721
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 66.24644693819846
- type: cos_sim_spearman
value: 66.09889420525377
- type: euclidean_pearson
value: 63.72551583520747
- type: euclidean_spearman
value: 63.01385470780679
- type: manhattan_pearson
value: 64.09258157214097
- type: manhattan_spearman
value: 63.080517752822594
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 86.27321463839989
- type: cos_sim_spearman
value: 86.37572865993327
- type: euclidean_pearson
value: 86.36268020198149
- type: euclidean_spearman
value: 86.31089339478922
- type: manhattan_pearson
value: 86.4260445761947
- type: manhattan_spearman
value: 86.45885895320457
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 86.52456702387798
- type: mrr
value: 96.34556529164372
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 61.99400000000001
- type: map_at_10
value: 73.38799999999999
- type: map_at_100
value: 73.747
- type: map_at_1000
value: 73.75
- type: map_at_3
value: 70.04599999999999
- type: map_at_5
value: 72.095
- type: mrr_at_1
value: 65.0
- type: mrr_at_10
value: 74.42800000000001
- type: mrr_at_100
value: 74.722
- type: mrr_at_1000
value: 74.725
- type: mrr_at_3
value: 72.056
- type: mrr_at_5
value: 73.60600000000001
- type: ndcg_at_1
value: 65.0
- type: ndcg_at_10
value: 78.435
- type: ndcg_at_100
value: 79.922
- type: ndcg_at_1000
value: 80.00500000000001
- type: ndcg_at_3
value: 73.05199999999999
- type: ndcg_at_5
value: 75.98
- type: precision_at_1
value: 65.0
- type: precision_at_10
value: 10.5
- type: precision_at_100
value: 1.123
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 28.555999999999997
- type: precision_at_5
value: 19.0
- type: recall_at_1
value: 61.99400000000001
- type: recall_at_10
value: 92.72200000000001
- type: recall_at_100
value: 99.333
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 78.739
- type: recall_at_5
value: 85.828
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.79009900990098
- type: cos_sim_ap
value: 95.3203137438653
- type: cos_sim_f1
value: 89.12386706948641
- type: cos_sim_precision
value: 89.75659229208925
- type: cos_sim_recall
value: 88.5
- type: dot_accuracy
value: 99.67821782178218
- type: dot_ap
value: 89.94069840000675
- type: dot_f1
value: 83.45902463549521
- type: dot_precision
value: 83.9231547017189
- type: dot_recall
value: 83.0
- type: euclidean_accuracy
value: 99.78613861386138
- type: euclidean_ap
value: 95.10648259135526
- type: euclidean_f1
value: 88.77338877338877
- type: euclidean_precision
value: 92.42424242424242
- type: euclidean_recall
value: 85.39999999999999
- type: manhattan_accuracy
value: 99.7950495049505
- type: manhattan_ap
value: 95.29987661320946
- type: manhattan_f1
value: 89.21313183949972
- type: manhattan_precision
value: 93.14472252448314
- type: manhattan_recall
value: 85.6
- type: max_accuracy
value: 99.7950495049505
- type: max_ap
value: 95.3203137438653
- type: max_f1
value: 89.21313183949972
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 67.65446577183913
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 46.30749237193961
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 54.91481849959949
- type: mrr
value: 55.853506175197346
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.08196549170419
- type: cos_sim_spearman
value: 31.16661390597077
- type: dot_pearson
value: 29.892258410943466
- type: dot_spearman
value: 30.51328811965085
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.23900000000000002
- type: map_at_10
value: 2.173
- type: map_at_100
value: 14.24
- type: map_at_1000
value: 35.309000000000005
- type: map_at_3
value: 0.7100000000000001
- type: map_at_5
value: 1.163
- type: mrr_at_1
value: 92.0
- type: mrr_at_10
value: 96.0
- type: mrr_at_100
value: 96.0
- type: mrr_at_1000
value: 96.0
- type: mrr_at_3
value: 96.0
- type: mrr_at_5
value: 96.0
- type: ndcg_at_1
value: 90.0
- type: ndcg_at_10
value: 85.382
- type: ndcg_at_100
value: 68.03
- type: ndcg_at_1000
value: 61.021
- type: ndcg_at_3
value: 89.765
- type: ndcg_at_5
value: 88.444
- type: precision_at_1
value: 92.0
- type: precision_at_10
value: 88.0
- type: precision_at_100
value: 70.02000000000001
- type: precision_at_1000
value: 26.984
- type: precision_at_3
value: 94.0
- type: precision_at_5
value: 92.80000000000001
- type: recall_at_1
value: 0.23900000000000002
- type: recall_at_10
value: 2.313
- type: recall_at_100
value: 17.049
- type: recall_at_1000
value: 57.489999999999995
- type: recall_at_3
value: 0.737
- type: recall_at_5
value: 1.221
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.75
- type: map_at_10
value: 11.29
- type: map_at_100
value: 18.032999999999998
- type: map_at_1000
value: 19.746
- type: map_at_3
value: 6.555
- type: map_at_5
value: 8.706999999999999
- type: mrr_at_1
value: 34.694
- type: mrr_at_10
value: 50.55
- type: mrr_at_100
value: 51.659
- type: mrr_at_1000
value: 51.659
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 49.728
- type: ndcg_at_1
value: 32.653
- type: ndcg_at_10
value: 27.894000000000002
- type: ndcg_at_100
value: 39.769
- type: ndcg_at_1000
value: 51.495999999999995
- type: ndcg_at_3
value: 32.954
- type: ndcg_at_5
value: 31.502999999999997
- type: precision_at_1
value: 34.694
- type: precision_at_10
value: 23.265
- type: precision_at_100
value: 7.898
- type: precision_at_1000
value: 1.58
- type: precision_at_3
value: 34.694
- type: precision_at_5
value: 31.429000000000002
- type: recall_at_1
value: 2.75
- type: recall_at_10
value: 16.953
- type: recall_at_100
value: 48.68
- type: recall_at_1000
value: 85.18599999999999
- type: recall_at_3
value: 7.710999999999999
- type: recall_at_5
value: 11.484
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 82.66099999999999
- type: ap
value: 25.555698090238337
- type: f1
value: 66.48402012461622
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 72.94567062818335
- type: f1
value: 73.28139189595674
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 49.581627240203474
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.78089050485785
- type: cos_sim_ap
value: 79.64487116574168
- type: cos_sim_f1
value: 72.46563021970964
- type: cos_sim_precision
value: 70.62359128474831
- type: cos_sim_recall
value: 74.40633245382587
- type: dot_accuracy
value: 86.2609524944865
- type: dot_ap
value: 75.513046857613
- type: dot_f1
value: 68.58213616489695
- type: dot_precision
value: 65.12455516014235
- type: dot_recall
value: 72.42744063324538
- type: euclidean_accuracy
value: 87.6080348095607
- type: euclidean_ap
value: 79.00204933649795
- type: euclidean_f1
value: 72.14495342605589
- type: euclidean_precision
value: 69.85421299728193
- type: euclidean_recall
value: 74.5910290237467
- type: manhattan_accuracy
value: 87.59611372712642
- type: manhattan_ap
value: 78.78523756706264
- type: manhattan_f1
value: 71.86499137718648
- type: manhattan_precision
value: 67.39833641404806
- type: manhattan_recall
value: 76.96569920844327
- type: max_accuracy
value: 87.78089050485785
- type: max_ap
value: 79.64487116574168
- type: max_f1
value: 72.46563021970964
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.98719292117825
- type: cos_sim_ap
value: 87.58146137353202
- type: cos_sim_f1
value: 80.28543232369239
- type: cos_sim_precision
value: 79.1735289714029
- type: cos_sim_recall
value: 81.42901139513397
- type: dot_accuracy
value: 88.9199363526992
- type: dot_ap
value: 84.98499998630417
- type: dot_f1
value: 78.21951400757969
- type: dot_precision
value: 75.58523624874336
- type: dot_recall
value: 81.04404065291038
- type: euclidean_accuracy
value: 89.77374160748244
- type: euclidean_ap
value: 87.35151562835209
- type: euclidean_f1
value: 79.92160922940393
- type: euclidean_precision
value: 76.88531587933979
- type: euclidean_recall
value: 83.20757622420696
- type: manhattan_accuracy
value: 89.72717041176699
- type: manhattan_ap
value: 87.34065592142515
- type: manhattan_f1
value: 79.85603419187943
- type: manhattan_precision
value: 77.82243332115455
- type: manhattan_recall
value: 81.99876809362489
- type: max_accuracy
value: 89.98719292117825
- type: max_ap
value: 87.58146137353202
- type: max_f1
value: 80.28543232369239
- task:
type: STS
dataset:
name: MTEB AFQMC
type: C-MTEB/AFQMC
config: default
split: validation
revision: b44c3b011063adb25877c13823db83bb193913c4
metrics:
- type: cos_sim_pearson
value: 53.45954203592337
- type: cos_sim_spearman
value: 58.42154680418638
- type: euclidean_pearson
value: 56.41543791722753
- type: euclidean_spearman
value: 58.39328016640146
- type: manhattan_pearson
value: 56.318510356833876
- type: manhattan_spearman
value: 58.28423447818184
- task:
type: STS
dataset:
name: MTEB ATEC
type: C-MTEB/ATEC
config: default
split: test
revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
metrics:
- type: cos_sim_pearson
value: 50.78356460675945
- type: cos_sim_spearman
value: 55.6530411663269
- type: euclidean_pearson
value: 56.50763660417816
- type: euclidean_spearman
value: 55.733823335669065
- type: manhattan_pearson
value: 56.45323093512866
- type: manhattan_spearman
value: 55.63248619032702
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (zh)
type: mteb/amazon_reviews_multi
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.209999999999994
- type: f1
value: 46.08892432018655
- task:
type: STS
dataset:
name: MTEB BQ
type: C-MTEB/BQ
config: default
split: test
revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
metrics:
- type: cos_sim_pearson
value: 70.25573992001478
- type: cos_sim_spearman
value: 73.85247134951433
- type: euclidean_pearson
value: 72.60033082168442
- type: euclidean_spearman
value: 73.72445893756499
- type: manhattan_pearson
value: 72.59932284620231
- type: manhattan_spearman
value: 73.68002490614583
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringP2P
type: C-MTEB/CLSClusteringP2P
config: default
split: test
revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
metrics:
- type: v_measure
value: 45.21317724305628
- task:
type: Clustering
dataset:
name: MTEB CLSClusteringS2S
type: C-MTEB/CLSClusteringS2S
config: default
split: test
revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
metrics:
- type: v_measure
value: 42.49825170976724
- task:
type: Reranking
dataset:
name: MTEB CMedQAv1
type: C-MTEB/CMedQAv1-reranking
config: default
split: test
revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
metrics:
- type: map
value: 88.15661686810597
- type: mrr
value: 90.11222222222223
- task:
type: Reranking
dataset:
name: MTEB CMedQAv2
type: C-MTEB/CMedQAv2-reranking
config: default
split: test
revision: 23d186750531a14a0357ca22cd92d712fd512ea0
metrics:
- type: map
value: 88.1204726064383
- type: mrr
value: 90.20142857142858
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: map_at_1
value: 27.224999999999998
- type: map_at_10
value: 40.169
- type: map_at_100
value: 42.0
- type: map_at_1000
value: 42.109
- type: map_at_3
value: 35.76
- type: map_at_5
value: 38.221
- type: mrr_at_1
value: 40.56
- type: mrr_at_10
value: 49.118
- type: mrr_at_100
value: 50.092999999999996
- type: mrr_at_1000
value: 50.133
- type: mrr_at_3
value: 46.507
- type: mrr_at_5
value: 47.973
- type: ndcg_at_1
value: 40.56
- type: ndcg_at_10
value: 46.972
- type: ndcg_at_100
value: 54.04
- type: ndcg_at_1000
value: 55.862
- type: ndcg_at_3
value: 41.36
- type: ndcg_at_5
value: 43.704
- type: precision_at_1
value: 40.56
- type: precision_at_10
value: 10.302999999999999
- type: precision_at_100
value: 1.606
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 23.064
- type: precision_at_5
value: 16.764000000000003
- type: recall_at_1
value: 27.224999999999998
- type: recall_at_10
value: 58.05200000000001
- type: recall_at_100
value: 87.092
- type: recall_at_1000
value: 99.099
- type: recall_at_3
value: 41.373
- type: recall_at_5
value: 48.453
- task:
type: PairClassification
dataset:
name: MTEB Cmnli
type: C-MTEB/CMNLI
config: default
split: validation
revision: 41bc36f332156f7adc9e38f53777c959b2ae9766
metrics:
- type: cos_sim_accuracy
value: 77.40228502705953
- type: cos_sim_ap
value: 86.22359172956327
- type: cos_sim_f1
value: 78.96328293736501
- type: cos_sim_precision
value: 73.36945615091311
- type: cos_sim_recall
value: 85.48047696983868
- type: dot_accuracy
value: 75.53818400481059
- type: dot_ap
value: 83.70164011305312
- type: dot_f1
value: 77.67298719348754
- type: dot_precision
value: 67.49482401656314
- type: dot_recall
value: 91.46598082768296
- type: euclidean_accuracy
value: 77.94347564642213
- type: euclidean_ap
value: 86.4652108728609
- type: euclidean_f1
value: 79.15555555555555
- type: euclidean_precision
value: 75.41816641964853
- type: euclidean_recall
value: 83.28267477203647
- type: manhattan_accuracy
value: 77.45039085989175
- type: manhattan_ap
value: 86.09986583900665
- type: manhattan_f1
value: 78.93669264438988
- type: manhattan_precision
value: 72.63261296660117
- type: manhattan_recall
value: 86.43909282207154
- type: max_accuracy
value: 77.94347564642213
- type: max_ap
value: 86.4652108728609
- type: max_f1
value: 79.15555555555555
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: map_at_1
value: 69.336
- type: map_at_10
value: 77.16
- type: map_at_100
value: 77.47500000000001
- type: map_at_1000
value: 77.482
- type: map_at_3
value: 75.42999999999999
- type: map_at_5
value: 76.468
- type: mrr_at_1
value: 69.44200000000001
- type: mrr_at_10
value: 77.132
- type: mrr_at_100
value: 77.43299999999999
- type: mrr_at_1000
value: 77.44
- type: mrr_at_3
value: 75.395
- type: mrr_at_5
value: 76.459
- type: ndcg_at_1
value: 69.547
- type: ndcg_at_10
value: 80.794
- type: ndcg_at_100
value: 82.245
- type: ndcg_at_1000
value: 82.40899999999999
- type: ndcg_at_3
value: 77.303
- type: ndcg_at_5
value: 79.168
- type: precision_at_1
value: 69.547
- type: precision_at_10
value: 9.305
- type: precision_at_100
value: 0.9979999999999999
- type: precision_at_1000
value: 0.101
- type: precision_at_3
value: 27.749000000000002
- type: precision_at_5
value: 17.576
- type: recall_at_1
value: 69.336
- type: recall_at_10
value: 92.097
- type: recall_at_100
value: 98.736
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 82.64
- type: recall_at_5
value: 87.144
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: map_at_1
value: 26.817999999999998
- type: map_at_10
value: 82.67
- type: map_at_100
value: 85.304
- type: map_at_1000
value: 85.334
- type: map_at_3
value: 57.336
- type: map_at_5
value: 72.474
- type: mrr_at_1
value: 91.45
- type: mrr_at_10
value: 94.272
- type: mrr_at_100
value: 94.318
- type: mrr_at_1000
value: 94.32000000000001
- type: mrr_at_3
value: 94.0
- type: mrr_at_5
value: 94.17699999999999
- type: ndcg_at_1
value: 91.45
- type: ndcg_at_10
value: 89.404
- type: ndcg_at_100
value: 91.724
- type: ndcg_at_1000
value: 91.973
- type: ndcg_at_3
value: 88.104
- type: ndcg_at_5
value: 87.25699999999999
- type: precision_at_1
value: 91.45
- type: precision_at_10
value: 42.585
- type: precision_at_100
value: 4.838
- type: precision_at_1000
value: 0.49
- type: precision_at_3
value: 78.8
- type: precision_at_5
value: 66.66
- type: recall_at_1
value: 26.817999999999998
- type: recall_at_10
value: 90.67
- type: recall_at_100
value: 98.36200000000001
- type: recall_at_1000
value: 99.583
- type: recall_at_3
value: 59.614999999999995
- type: recall_at_5
value: 77.05199999999999
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: map_at_1
value: 47.699999999999996
- type: map_at_10
value: 57.589999999999996
- type: map_at_100
value: 58.226
- type: map_at_1000
value: 58.251
- type: map_at_3
value: 55.233
- type: map_at_5
value: 56.633
- type: mrr_at_1
value: 47.699999999999996
- type: mrr_at_10
value: 57.589999999999996
- type: mrr_at_100
value: 58.226
- type: mrr_at_1000
value: 58.251
- type: mrr_at_3
value: 55.233
- type: mrr_at_5
value: 56.633
- type: ndcg_at_1
value: 47.699999999999996
- type: ndcg_at_10
value: 62.505
- type: ndcg_at_100
value: 65.517
- type: ndcg_at_1000
value: 66.19800000000001
- type: ndcg_at_3
value: 57.643
- type: ndcg_at_5
value: 60.181
- type: precision_at_1
value: 47.699999999999996
- type: precision_at_10
value: 7.8
- type: precision_at_100
value: 0.919
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 21.532999999999998
- type: precision_at_5
value: 14.16
- type: recall_at_1
value: 47.699999999999996
- type: recall_at_10
value: 78.0
- type: recall_at_100
value: 91.9
- type: recall_at_1000
value: 97.3
- type: recall_at_3
value: 64.60000000000001
- type: recall_at_5
value: 70.8
- task:
type: Classification
dataset:
name: MTEB IFlyTek
type: C-MTEB/IFlyTek-classification
config: default
split: validation
revision: 421605374b29664c5fc098418fe20ada9bd55f8a
metrics:
- type: accuracy
value: 44.84801846864178
- type: f1
value: 37.47347897956339
- task:
type: Classification
dataset:
name: MTEB JDReview
type: C-MTEB/JDReview-classification
config: default
split: test
revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
metrics:
- type: accuracy
value: 85.81613508442777
- type: ap
value: 52.68244615477374
- type: f1
value: 80.0445640948843
- task:
type: STS
dataset:
name: MTEB LCQMC
type: C-MTEB/LCQMC
config: default
split: test
revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
metrics:
- type: cos_sim_pearson
value: 69.57786502217138
- type: cos_sim_spearman
value: 75.39106054489906
- type: euclidean_pearson
value: 73.72082954602402
- type: euclidean_spearman
value: 75.14421475913619
- type: manhattan_pearson
value: 73.62463076633642
- type: manhattan_spearman
value: 75.01301565104112
- task:
type: Reranking
dataset:
name: MTEB MMarcoReranking
type: C-MTEB/Mmarco-reranking
config: default
split: dev
revision: None
metrics:
- type: map
value: 29.143797057999134
- type: mrr
value: 28.08174603174603
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: map_at_1
value: 70.492
- type: map_at_10
value: 79.501
- type: map_at_100
value: 79.728
- type: map_at_1000
value: 79.735
- type: map_at_3
value: 77.77
- type: map_at_5
value: 78.851
- type: mrr_at_1
value: 72.822
- type: mrr_at_10
value: 80.001
- type: mrr_at_100
value: 80.19
- type: mrr_at_1000
value: 80.197
- type: mrr_at_3
value: 78.484
- type: mrr_at_5
value: 79.42099999999999
- type: ndcg_at_1
value: 72.822
- type: ndcg_at_10
value: 83.013
- type: ndcg_at_100
value: 84.013
- type: ndcg_at_1000
value: 84.20400000000001
- type: ndcg_at_3
value: 79.728
- type: ndcg_at_5
value: 81.542
- type: precision_at_1
value: 72.822
- type: precision_at_10
value: 9.917
- type: precision_at_100
value: 1.042
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 29.847
- type: precision_at_5
value: 18.871
- type: recall_at_1
value: 70.492
- type: recall_at_10
value: 93.325
- type: recall_at_100
value: 97.822
- type: recall_at_1000
value: 99.319
- type: recall_at_3
value: 84.636
- type: recall_at_5
value: 88.93100000000001
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (zh-CN)
type: mteb/amazon_massive_intent
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.88298587760592
- type: f1
value: 73.89001762017176
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (zh-CN)
type: mteb/amazon_massive_scenario
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 80.76328177538669
- type: f1
value: 80.24718532423358
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: map_at_1
value: 49.6
- type: map_at_10
value: 55.620999999999995
- type: map_at_100
value: 56.204
- type: map_at_1000
value: 56.251
- type: map_at_3
value: 54.132999999999996
- type: map_at_5
value: 54.933
- type: mrr_at_1
value: 49.7
- type: mrr_at_10
value: 55.67100000000001
- type: mrr_at_100
value: 56.254000000000005
- type: mrr_at_1000
value: 56.301
- type: mrr_at_3
value: 54.18300000000001
- type: mrr_at_5
value: 54.983000000000004
- type: ndcg_at_1
value: 49.6
- type: ndcg_at_10
value: 58.645
- type: ndcg_at_100
value: 61.789
- type: ndcg_at_1000
value: 63.219
- type: ndcg_at_3
value: 55.567
- type: ndcg_at_5
value: 57.008
- type: precision_at_1
value: 49.6
- type: precision_at_10
value: 6.819999999999999
- type: precision_at_100
value: 0.836
- type: precision_at_1000
value: 0.095
- type: precision_at_3
value: 19.900000000000002
- type: precision_at_5
value: 12.64
- type: recall_at_1
value: 49.6
- type: recall_at_10
value: 68.2
- type: recall_at_100
value: 83.6
- type: recall_at_1000
value: 95.3
- type: recall_at_3
value: 59.699999999999996
- type: recall_at_5
value: 63.2
- task:
type: Classification
dataset:
name: MTEB MultilingualSentiment
type: C-MTEB/MultilingualSentiment-classification
config: default
split: validation
revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
metrics:
- type: accuracy
value: 74.45666666666666
- type: f1
value: 74.32582402190089
- task:
type: PairClassification
dataset:
name: MTEB Ocnli
type: C-MTEB/OCNLI
config: default
split: validation
revision: 66e76a618a34d6d565d5538088562851e6daa7ec
metrics:
- type: cos_sim_accuracy
value: 80.67135896047645
- type: cos_sim_ap
value: 87.60421240712051
- type: cos_sim_f1
value: 82.1304131408661
- type: cos_sim_precision
value: 77.68361581920904
- type: cos_sim_recall
value: 87.11721224920802
- type: dot_accuracy
value: 79.04710341093666
- type: dot_ap
value: 85.6370059719336
- type: dot_f1
value: 80.763723150358
- type: dot_precision
value: 73.69337979094077
- type: dot_recall
value: 89.33474128827878
- type: euclidean_accuracy
value: 81.05035192203573
- type: euclidean_ap
value: 87.7880240053663
- type: euclidean_f1
value: 82.50244379276637
- type: euclidean_precision
value: 76.7970882620564
- type: euclidean_recall
value: 89.1235480464625
- type: manhattan_accuracy
value: 80.61721710882512
- type: manhattan_ap
value: 87.43568120591175
- type: manhattan_f1
value: 81.89526184538653
- type: manhattan_precision
value: 77.5992438563327
- type: manhattan_recall
value: 86.6948257655755
- type: max_accuracy
value: 81.05035192203573
- type: max_ap
value: 87.7880240053663
- type: max_f1
value: 82.50244379276637
- task:
type: Classification
dataset:
name: MTEB OnlineShopping
type: C-MTEB/OnlineShopping-classification
config: default
split: test
revision: e610f2ebd179a8fda30ae534c3878750a96db120
metrics:
- type: accuracy
value: 93.5
- type: ap
value: 91.31357903446782
- type: f1
value: 93.48088994006616
- task:
type: STS
dataset:
name: MTEB PAWSX
type: C-MTEB/PAWSX
config: default
split: test
revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
metrics:
- type: cos_sim_pearson
value: 36.93293453538077
- type: cos_sim_spearman
value: 42.45972506308574
- type: euclidean_pearson
value: 42.34945133152159
- type: euclidean_spearman
value: 42.331610303674644
- type: manhattan_pearson
value: 42.31455070249498
- type: manhattan_spearman
value: 42.19887982891834
- task:
type: STS
dataset:
name: MTEB QBQTC
type: C-MTEB/QBQTC
config: default
split: test
revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7
metrics:
- type: cos_sim_pearson
value: 33.683290790043785
- type: cos_sim_spearman
value: 35.149171171202994
- type: euclidean_pearson
value: 32.33806561267862
- type: euclidean_spearman
value: 34.483576387347966
- type: manhattan_pearson
value: 32.47629754599608
- type: manhattan_spearman
value: 34.66434471867615
- task:
type: STS
dataset:
name: MTEB STS22 (zh)
type: mteb/sts22-crosslingual-sts
config: zh
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 66.46322760516104
- type: cos_sim_spearman
value: 67.398478319726
- type: euclidean_pearson
value: 64.7223480293625
- type: euclidean_spearman
value: 66.83118568812951
- type: manhattan_pearson
value: 64.88440039828305
- type: manhattan_spearman
value: 66.80429458952257
- task:
type: STS
dataset:
name: MTEB STSB
type: C-MTEB/STSB
config: default
split: test
revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
metrics:
- type: cos_sim_pearson
value: 79.08991383232105
- type: cos_sim_spearman
value: 79.39715677296854
- type: euclidean_pearson
value: 78.63201279320496
- type: euclidean_spearman
value: 79.40262660785731
- type: manhattan_pearson
value: 78.98138363146906
- type: manhattan_spearman
value: 79.79968413014194
- task:
type: Reranking
dataset:
name: MTEB T2Reranking
type: C-MTEB/T2Reranking
config: default
split: dev
revision: 76631901a18387f85eaa53e5450019b87ad58ef9
metrics:
- type: map
value: 67.43289278789972
- type: mrr
value: 77.53012460908535
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: map_at_1
value: 27.733999999999998
- type: map_at_10
value: 78.24799999999999
- type: map_at_100
value: 81.765
- type: map_at_1000
value: 81.824
- type: map_at_3
value: 54.92
- type: map_at_5
value: 67.61399999999999
- type: mrr_at_1
value: 90.527
- type: mrr_at_10
value: 92.843
- type: mrr_at_100
value: 92.927
- type: mrr_at_1000
value: 92.93
- type: mrr_at_3
value: 92.45100000000001
- type: mrr_at_5
value: 92.693
- type: ndcg_at_1
value: 90.527
- type: ndcg_at_10
value: 85.466
- type: ndcg_at_100
value: 88.846
- type: ndcg_at_1000
value: 89.415
- type: ndcg_at_3
value: 86.768
- type: ndcg_at_5
value: 85.46000000000001
- type: precision_at_1
value: 90.527
- type: precision_at_10
value: 42.488
- type: precision_at_100
value: 5.024
- type: precision_at_1000
value: 0.516
- type: precision_at_3
value: 75.907
- type: precision_at_5
value: 63.727000000000004
- type: recall_at_1
value: 27.733999999999998
- type: recall_at_10
value: 84.346
- type: recall_at_100
value: 95.536
- type: recall_at_1000
value: 98.42999999999999
- type: recall_at_3
value: 56.455
- type: recall_at_5
value: 70.755
- task:
type: Classification
dataset:
name: MTEB TNews
type: C-MTEB/TNews-classification
config: default
split: validation
revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
metrics:
- type: accuracy
value: 49.952000000000005
- type: f1
value: 48.264617195258054
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringP2P
type: C-MTEB/ThuNewsClusteringP2P
config: default
split: test
revision: 5798586b105c0434e4f0fe5e767abe619442cf93
metrics:
- type: v_measure
value: 68.23769904483508
- task:
type: Clustering
dataset:
name: MTEB ThuNewsClusteringS2S
type: C-MTEB/ThuNewsClusteringS2S
config: default
split: test
revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
metrics:
- type: v_measure
value: 62.50294403136556
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: map_at_1
value: 54.0
- type: map_at_10
value: 63.668
- type: map_at_100
value: 64.217
- type: map_at_1000
value: 64.23100000000001
- type: map_at_3
value: 61.7
- type: map_at_5
value: 62.870000000000005
- type: mrr_at_1
value: 54.0
- type: mrr_at_10
value: 63.668
- type: mrr_at_100
value: 64.217
- type: mrr_at_1000
value: 64.23100000000001
- type: mrr_at_3
value: 61.7
- type: mrr_at_5
value: 62.870000000000005
- type: ndcg_at_1
value: 54.0
- type: ndcg_at_10
value: 68.11399999999999
- type: ndcg_at_100
value: 70.723
- type: ndcg_at_1000
value: 71.123
- type: ndcg_at_3
value: 64.074
- type: ndcg_at_5
value: 66.178
- type: precision_at_1
value: 54.0
- type: precision_at_10
value: 8.200000000000001
- type: precision_at_100
value: 0.941
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 23.633000000000003
- type: precision_at_5
value: 15.2
- type: recall_at_1
value: 54.0
- type: recall_at_10
value: 82.0
- type: recall_at_100
value: 94.1
- type: recall_at_1000
value: 97.3
- type: recall_at_3
value: 70.89999999999999
- type: recall_at_5
value: 76.0
- task:
type: Classification
dataset:
name: MTEB Waimai
type: C-MTEB/waimai-classification
config: default
split: test
revision: 339287def212450dcaa9df8c22bf93e9980c7023
metrics:
- type: accuracy
value: 86.63000000000001
- type: ap
value: 69.99457882599567
- type: f1
value: 85.07735617998541
- task:
type: Clustering
dataset:
name: MTEB 8TagsClustering
type: PL-MTEB/8tags-clustering
config: default
split: test
revision: None
metrics:
- type: v_measure
value: 44.594104491193555
- task:
type: Classification
dataset:
name: MTEB AllegroReviews
type: PL-MTEB/allegro-reviews
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 63.97614314115309
- type: f1
value: 52.15634261679283
- task:
type: Retrieval
dataset:
name: MTEB ArguAna-PL
type: clarin-knext/arguana-pl
config: default
split: test
revision: 63fc86750af76253e8c760fc9e534bbf24d260a2
metrics:
- type: map_at_1
value: 32.646
- type: map_at_10
value: 47.963
- type: map_at_100
value: 48.789
- type: map_at_1000
value: 48.797000000000004
- type: map_at_3
value: 43.196
- type: map_at_5
value: 46.016
- type: mrr_at_1
value: 33.073
- type: mrr_at_10
value: 48.126000000000005
- type: mrr_at_100
value: 48.946
- type: mrr_at_1000
value: 48.953
- type: mrr_at_3
value: 43.374
- type: mrr_at_5
value: 46.147
- type: ndcg_at_1
value: 32.646
- type: ndcg_at_10
value: 56.481
- type: ndcg_at_100
value: 59.922
- type: ndcg_at_1000
value: 60.07
- type: ndcg_at_3
value: 46.675
- type: ndcg_at_5
value: 51.76500000000001
- type: precision_at_1
value: 32.646
- type: precision_at_10
value: 8.371
- type: precision_at_100
value: 0.9860000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.919
- type: precision_at_5
value: 13.825999999999999
- type: recall_at_1
value: 32.646
- type: recall_at_10
value: 83.71300000000001
- type: recall_at_100
value: 98.578
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 56.757000000000005
- type: recall_at_5
value: 69.132
- task:
type: Classification
dataset:
name: MTEB CBD
type: PL-MTEB/cbd
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 68.56
- type: ap
value: 23.310493680488513
- type: f1
value: 58.85369533105693
- task:
type: PairClassification
dataset:
name: MTEB CDSC-E
type: PL-MTEB/cdsce-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 88.5
- type: cos_sim_ap
value: 72.42140924378361
- type: cos_sim_f1
value: 66.0919540229885
- type: cos_sim_precision
value: 72.78481012658227
- type: cos_sim_recall
value: 60.526315789473685
- type: dot_accuracy
value: 88.5
- type: dot_ap
value: 72.42140924378361
- type: dot_f1
value: 66.0919540229885
- type: dot_precision
value: 72.78481012658227
- type: dot_recall
value: 60.526315789473685
- type: euclidean_accuracy
value: 88.5
- type: euclidean_ap
value: 72.42140924378361
- type: euclidean_f1
value: 66.0919540229885
- type: euclidean_precision
value: 72.78481012658227
- type: euclidean_recall
value: 60.526315789473685
- type: manhattan_accuracy
value: 88.5
- type: manhattan_ap
value: 72.49745515311696
- type: manhattan_f1
value: 66.0968660968661
- type: manhattan_precision
value: 72.04968944099379
- type: manhattan_recall
value: 61.05263157894737
- type: max_accuracy
value: 88.5
- type: max_ap
value: 72.49745515311696
- type: max_f1
value: 66.0968660968661
- task:
type: STS
dataset:
name: MTEB CDSC-R
type: PL-MTEB/cdscr-sts
config: default
split: test
revision: None
metrics:
- type: cos_sim_pearson
value: 90.32269765590145
- type: cos_sim_spearman
value: 89.73666311491672
- type: euclidean_pearson
value: 88.2933868516544
- type: euclidean_spearman
value: 89.73666311491672
- type: manhattan_pearson
value: 88.33474590219448
- type: manhattan_spearman
value: 89.8548364866583
- task:
type: Retrieval
dataset:
name: MTEB DBPedia-PL
type: clarin-knext/dbpedia-pl
config: default
split: test
revision: 76afe41d9af165cc40999fcaa92312b8b012064a
metrics:
- type: map_at_1
value: 7.632999999999999
- type: map_at_10
value: 16.426
- type: map_at_100
value: 22.651
- type: map_at_1000
value: 24.372
- type: map_at_3
value: 11.706
- type: map_at_5
value: 13.529
- type: mrr_at_1
value: 60.75000000000001
- type: mrr_at_10
value: 68.613
- type: mrr_at_100
value: 69.001
- type: mrr_at_1000
value: 69.021
- type: mrr_at_3
value: 67.0
- type: mrr_at_5
value: 67.925
- type: ndcg_at_1
value: 49.875
- type: ndcg_at_10
value: 36.978
- type: ndcg_at_100
value: 40.031
- type: ndcg_at_1000
value: 47.566
- type: ndcg_at_3
value: 41.148
- type: ndcg_at_5
value: 38.702
- type: precision_at_1
value: 60.75000000000001
- type: precision_at_10
value: 29.7
- type: precision_at_100
value: 9.278
- type: precision_at_1000
value: 2.099
- type: precision_at_3
value: 44.0
- type: precision_at_5
value: 37.6
- type: recall_at_1
value: 7.632999999999999
- type: recall_at_10
value: 22.040000000000003
- type: recall_at_100
value: 44.024
- type: recall_at_1000
value: 67.848
- type: recall_at_3
value: 13.093
- type: recall_at_5
value: 15.973
- task:
type: Retrieval
dataset:
name: MTEB FiQA-PL
type: clarin-knext/fiqa-pl
config: default
split: test
revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e
metrics:
- type: map_at_1
value: 15.473
- type: map_at_10
value: 24.579
- type: map_at_100
value: 26.387
- type: map_at_1000
value: 26.57
- type: map_at_3
value: 21.278
- type: map_at_5
value: 23.179
- type: mrr_at_1
value: 30.709999999999997
- type: mrr_at_10
value: 38.994
- type: mrr_at_100
value: 39.993
- type: mrr_at_1000
value: 40.044999999999995
- type: mrr_at_3
value: 36.342999999999996
- type: mrr_at_5
value: 37.846999999999994
- type: ndcg_at_1
value: 30.709999999999997
- type: ndcg_at_10
value: 31.608999999999998
- type: ndcg_at_100
value: 38.807
- type: ndcg_at_1000
value: 42.208
- type: ndcg_at_3
value: 28.086
- type: ndcg_at_5
value: 29.323
- type: precision_at_1
value: 30.709999999999997
- type: precision_at_10
value: 8.688
- type: precision_at_100
value: 1.608
- type: precision_at_1000
value: 0.22100000000000003
- type: precision_at_3
value: 18.724
- type: precision_at_5
value: 13.950999999999999
- type: recall_at_1
value: 15.473
- type: recall_at_10
value: 38.361000000000004
- type: recall_at_100
value: 65.2
- type: recall_at_1000
value: 85.789
- type: recall_at_3
value: 25.401
- type: recall_at_5
value: 30.875999999999998
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA-PL
type: clarin-knext/hotpotqa-pl
config: default
split: test
revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907
metrics:
- type: map_at_1
value: 38.096000000000004
- type: map_at_10
value: 51.44499999999999
- type: map_at_100
value: 52.325
- type: map_at_1000
value: 52.397000000000006
- type: map_at_3
value: 48.626999999999995
- type: map_at_5
value: 50.342
- type: mrr_at_1
value: 76.19200000000001
- type: mrr_at_10
value: 81.191
- type: mrr_at_100
value: 81.431
- type: mrr_at_1000
value: 81.443
- type: mrr_at_3
value: 80.30199999999999
- type: mrr_at_5
value: 80.85900000000001
- type: ndcg_at_1
value: 76.19200000000001
- type: ndcg_at_10
value: 60.9
- type: ndcg_at_100
value: 64.14699999999999
- type: ndcg_at_1000
value: 65.647
- type: ndcg_at_3
value: 56.818000000000005
- type: ndcg_at_5
value: 59.019999999999996
- type: precision_at_1
value: 76.19200000000001
- type: precision_at_10
value: 12.203
- type: precision_at_100
value: 1.478
- type: precision_at_1000
value: 0.168
- type: precision_at_3
value: 34.616
- type: precision_at_5
value: 22.515
- type: recall_at_1
value: 38.096000000000004
- type: recall_at_10
value: 61.013
- type: recall_at_100
value: 73.90299999999999
- type: recall_at_1000
value: 83.91
- type: recall_at_3
value: 51.92400000000001
- type: recall_at_5
value: 56.286
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO-PL
type: clarin-knext/msmarco-pl
config: default
split: test
revision: 8634c07806d5cce3a6138e260e59b81760a0a640
metrics:
- type: map_at_1
value: 1.548
- type: map_at_10
value: 11.049000000000001
- type: map_at_100
value: 28.874
- type: map_at_1000
value: 34.931
- type: map_at_3
value: 4.162
- type: map_at_5
value: 6.396
- type: mrr_at_1
value: 90.69800000000001
- type: mrr_at_10
value: 92.093
- type: mrr_at_100
value: 92.345
- type: mrr_at_1000
value: 92.345
- type: mrr_at_3
value: 91.86
- type: mrr_at_5
value: 91.86
- type: ndcg_at_1
value: 74.031
- type: ndcg_at_10
value: 63.978
- type: ndcg_at_100
value: 53.101
- type: ndcg_at_1000
value: 60.675999999999995
- type: ndcg_at_3
value: 71.421
- type: ndcg_at_5
value: 68.098
- type: precision_at_1
value: 90.69800000000001
- type: precision_at_10
value: 71.86
- type: precision_at_100
value: 31.395
- type: precision_at_1000
value: 5.981
- type: precision_at_3
value: 84.49600000000001
- type: precision_at_5
value: 79.07
- type: recall_at_1
value: 1.548
- type: recall_at_10
value: 12.149000000000001
- type: recall_at_100
value: 40.794999999999995
- type: recall_at_1000
value: 67.974
- type: recall_at_3
value: 4.244
- type: recall_at_5
value: 6.608
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (pl)
type: mteb/amazon_massive_intent
config: pl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.55413584398119
- type: f1
value: 69.65610882318181
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (pl)
type: mteb/amazon_massive_scenario
config: pl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.37188971082716
- type: f1
value: 75.64847309941361
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus-PL
type: clarin-knext/nfcorpus-pl
config: default
split: test
revision: 9a6f9567fda928260afed2de480d79c98bf0bec0
metrics:
- type: map_at_1
value: 4.919
- type: map_at_10
value: 10.834000000000001
- type: map_at_100
value: 13.38
- type: map_at_1000
value: 14.581
- type: map_at_3
value: 8.198
- type: map_at_5
value: 9.428
- type: mrr_at_1
value: 41.176
- type: mrr_at_10
value: 50.083
- type: mrr_at_100
value: 50.559
- type: mrr_at_1000
value: 50.604000000000006
- type: mrr_at_3
value: 47.936
- type: mrr_at_5
value: 49.407000000000004
- type: ndcg_at_1
value: 39.628
- type: ndcg_at_10
value: 30.098000000000003
- type: ndcg_at_100
value: 27.061
- type: ndcg_at_1000
value: 35.94
- type: ndcg_at_3
value: 35.135
- type: ndcg_at_5
value: 33.335
- type: precision_at_1
value: 41.176
- type: precision_at_10
value: 22.259999999999998
- type: precision_at_100
value: 6.712
- type: precision_at_1000
value: 1.9060000000000001
- type: precision_at_3
value: 33.23
- type: precision_at_5
value: 29.04
- type: recall_at_1
value: 4.919
- type: recall_at_10
value: 14.196
- type: recall_at_100
value: 26.948
- type: recall_at_1000
value: 59.211000000000006
- type: recall_at_3
value: 9.44
- type: recall_at_5
value: 11.569
- task:
type: Retrieval
dataset:
name: MTEB NQ-PL
type: clarin-knext/nq-pl
config: default
split: test
revision: f171245712cf85dd4700b06bef18001578d0ca8d
metrics:
- type: map_at_1
value: 25.35
- type: map_at_10
value: 37.884
- type: map_at_100
value: 38.955
- type: map_at_1000
value: 39.007999999999996
- type: map_at_3
value: 34.239999999999995
- type: map_at_5
value: 36.398
- type: mrr_at_1
value: 28.737000000000002
- type: mrr_at_10
value: 39.973
- type: mrr_at_100
value: 40.844
- type: mrr_at_1000
value: 40.885
- type: mrr_at_3
value: 36.901
- type: mrr_at_5
value: 38.721
- type: ndcg_at_1
value: 28.708
- type: ndcg_at_10
value: 44.204
- type: ndcg_at_100
value: 48.978
- type: ndcg_at_1000
value: 50.33
- type: ndcg_at_3
value: 37.36
- type: ndcg_at_5
value: 40.912
- type: precision_at_1
value: 28.708
- type: precision_at_10
value: 7.367
- type: precision_at_100
value: 1.0030000000000001
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 17.034
- type: precision_at_5
value: 12.293999999999999
- type: recall_at_1
value: 25.35
- type: recall_at_10
value: 61.411
- type: recall_at_100
value: 82.599
- type: recall_at_1000
value: 92.903
- type: recall_at_3
value: 43.728
- type: recall_at_5
value: 51.854
- task:
type: Classification
dataset:
name: MTEB PAC
type: laugustyniak/abusive-clauses-pl
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 69.04141326382856
- type: ap
value: 77.49422763833996
- type: f1
value: 66.73472657783407
- task:
type: PairClassification
dataset:
name: MTEB PPC
type: PL-MTEB/ppc-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 81.0
- type: cos_sim_ap
value: 91.47194213011349
- type: cos_sim_f1
value: 84.73767885532592
- type: cos_sim_precision
value: 81.49847094801224
- type: cos_sim_recall
value: 88.24503311258279
- type: dot_accuracy
value: 81.0
- type: dot_ap
value: 91.47194213011349
- type: dot_f1
value: 84.73767885532592
- type: dot_precision
value: 81.49847094801224
- type: dot_recall
value: 88.24503311258279
- type: euclidean_accuracy
value: 81.0
- type: euclidean_ap
value: 91.47194213011349
- type: euclidean_f1
value: 84.73767885532592
- type: euclidean_precision
value: 81.49847094801224
- type: euclidean_recall
value: 88.24503311258279
- type: manhattan_accuracy
value: 81.0
- type: manhattan_ap
value: 91.46464475050571
- type: manhattan_f1
value: 84.48687350835321
- type: manhattan_precision
value: 81.31699846860643
- type: manhattan_recall
value: 87.91390728476821
- type: max_accuracy
value: 81.0
- type: max_ap
value: 91.47194213011349
- type: max_f1
value: 84.73767885532592
- task:
type: PairClassification
dataset:
name: MTEB PSC
type: PL-MTEB/psc-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 97.6808905380334
- type: cos_sim_ap
value: 99.27948611836348
- type: cos_sim_f1
value: 96.15975422427034
- type: cos_sim_precision
value: 96.90402476780186
- type: cos_sim_recall
value: 95.42682926829268
- type: dot_accuracy
value: 97.6808905380334
- type: dot_ap
value: 99.2794861183635
- type: dot_f1
value: 96.15975422427034
- type: dot_precision
value: 96.90402476780186
- type: dot_recall
value: 95.42682926829268
- type: euclidean_accuracy
value: 97.6808905380334
- type: euclidean_ap
value: 99.2794861183635
- type: euclidean_f1
value: 96.15975422427034
- type: euclidean_precision
value: 96.90402476780186
- type: euclidean_recall
value: 95.42682926829268
- type: manhattan_accuracy
value: 97.6808905380334
- type: manhattan_ap
value: 99.28715055268721
- type: manhattan_f1
value: 96.14791987673343
- type: manhattan_precision
value: 97.19626168224299
- type: manhattan_recall
value: 95.1219512195122
- type: max_accuracy
value: 97.6808905380334
- type: max_ap
value: 99.28715055268721
- type: max_f1
value: 96.15975422427034
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-IN
type: PL-MTEB/polemo2_in
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 86.16343490304708
- type: f1
value: 83.3442579486744
- task:
type: Classification
dataset:
name: MTEB PolEmo2.0-OUT
type: PL-MTEB/polemo2_out
config: default
split: test
revision: None
metrics:
- type: accuracy
value: 68.40080971659918
- type: f1
value: 53.13720751142237
- task:
type: Retrieval
dataset:
name: MTEB Quora-PL
type: clarin-knext/quora-pl
config: default
split: test
revision: 0be27e93455051e531182b85e85e425aba12e9d4
metrics:
- type: map_at_1
value: 63.322
- type: map_at_10
value: 76.847
- type: map_at_100
value: 77.616
- type: map_at_1000
value: 77.644
- type: map_at_3
value: 73.624
- type: map_at_5
value: 75.603
- type: mrr_at_1
value: 72.88
- type: mrr_at_10
value: 80.376
- type: mrr_at_100
value: 80.604
- type: mrr_at_1000
value: 80.61
- type: mrr_at_3
value: 78.92
- type: mrr_at_5
value: 79.869
- type: ndcg_at_1
value: 72.89999999999999
- type: ndcg_at_10
value: 81.43
- type: ndcg_at_100
value: 83.394
- type: ndcg_at_1000
value: 83.685
- type: ndcg_at_3
value: 77.62599999999999
- type: ndcg_at_5
value: 79.656
- type: precision_at_1
value: 72.89999999999999
- type: precision_at_10
value: 12.548
- type: precision_at_100
value: 1.4869999999999999
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 34.027
- type: precision_at_5
value: 22.654
- type: recall_at_1
value: 63.322
- type: recall_at_10
value: 90.664
- type: recall_at_100
value: 97.974
- type: recall_at_1000
value: 99.636
- type: recall_at_3
value: 80.067
- type: recall_at_5
value: 85.526
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS-PL
type: clarin-knext/scidocs-pl
config: default
split: test
revision: 45452b03f05560207ef19149545f168e596c9337
metrics:
- type: map_at_1
value: 3.95
- type: map_at_10
value: 9.658999999999999
- type: map_at_100
value: 11.384
- type: map_at_1000
value: 11.677
- type: map_at_3
value: 7.055
- type: map_at_5
value: 8.244
- type: mrr_at_1
value: 19.5
- type: mrr_at_10
value: 28.777
- type: mrr_at_100
value: 29.936
- type: mrr_at_1000
value: 30.009999999999998
- type: mrr_at_3
value: 25.55
- type: mrr_at_5
value: 27.284999999999997
- type: ndcg_at_1
value: 19.5
- type: ndcg_at_10
value: 16.589000000000002
- type: ndcg_at_100
value: 23.879
- type: ndcg_at_1000
value: 29.279
- type: ndcg_at_3
value: 15.719
- type: ndcg_at_5
value: 13.572000000000001
- type: precision_at_1
value: 19.5
- type: precision_at_10
value: 8.62
- type: precision_at_100
value: 1.924
- type: precision_at_1000
value: 0.322
- type: precision_at_3
value: 14.6
- type: precision_at_5
value: 11.78
- type: recall_at_1
value: 3.95
- type: recall_at_10
value: 17.477999999999998
- type: recall_at_100
value: 38.99
- type: recall_at_1000
value: 65.417
- type: recall_at_3
value: 8.883000000000001
- type: recall_at_5
value: 11.933
- task:
type: PairClassification
dataset:
name: MTEB SICK-E-PL
type: PL-MTEB/sicke-pl-pairclassification
config: default
split: test
revision: None
metrics:
- type: cos_sim_accuracy
value: 83.48960456583775
- type: cos_sim_ap
value: 76.31522115825375
- type: cos_sim_f1
value: 70.35573122529645
- type: cos_sim_precision
value: 70.9934735315446
- type: cos_sim_recall
value: 69.72934472934473
- type: dot_accuracy
value: 83.48960456583775
- type: dot_ap
value: 76.31522115825373
- type: dot_f1
value: 70.35573122529645
- type: dot_precision
value: 70.9934735315446
- type: dot_recall
value: 69.72934472934473
- type: euclidean_accuracy
value: 83.48960456583775
- type: euclidean_ap
value: 76.31522115825373
- type: euclidean_f1
value: 70.35573122529645
- type: euclidean_precision
value: 70.9934735315446
- type: euclidean_recall
value: 69.72934472934473
- type: manhattan_accuracy
value: 83.46922136159804
- type: manhattan_ap
value: 76.18474601388084
- type: manhattan_f1
value: 70.34779490856937
- type: manhattan_precision
value: 70.83032490974729
- type: manhattan_recall
value: 69.87179487179486
- type: max_accuracy
value: 83.48960456583775
- type: max_ap
value: 76.31522115825375
- type: max_f1
value: 70.35573122529645
- task:
type: STS
dataset:
name: MTEB SICK-R-PL
type: PL-MTEB/sickr-pl-sts
config: default
split: test
revision: None
metrics:
- type: cos_sim_pearson
value: 77.95374883876302
- type: cos_sim_spearman
value: 73.77630219171942
- type: euclidean_pearson
value: 75.81927069594934
- type: euclidean_spearman
value: 73.7763211303831
- type: manhattan_pearson
value: 76.03126859057528
- type: manhattan_spearman
value: 73.96528138013369
- task:
type: STS
dataset:
name: MTEB STS22 (pl)
type: mteb/sts22-crosslingual-sts
config: pl
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 37.388282764841826
- type: cos_sim_spearman
value: 40.83477184710897
- type: euclidean_pearson
value: 26.754737044177805
- type: euclidean_spearman
value: 40.83477184710897
- type: manhattan_pearson
value: 26.760453110872458
- type: manhattan_spearman
value: 41.034477441383856
- task:
type: Retrieval
dataset:
name: MTEB SciFact-PL
type: clarin-knext/scifact-pl
config: default
split: test
revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e
metrics:
- type: map_at_1
value: 49.15
- type: map_at_10
value: 61.690999999999995
- type: map_at_100
value: 62.348000000000006
- type: map_at_1000
value: 62.38
- type: map_at_3
value: 58.824
- type: map_at_5
value: 60.662000000000006
- type: mrr_at_1
value: 51.333
- type: mrr_at_10
value: 62.731
- type: mrr_at_100
value: 63.245
- type: mrr_at_1000
value: 63.275000000000006
- type: mrr_at_3
value: 60.667
- type: mrr_at_5
value: 61.93300000000001
- type: ndcg_at_1
value: 51.333
- type: ndcg_at_10
value: 67.168
- type: ndcg_at_100
value: 69.833
- type: ndcg_at_1000
value: 70.56700000000001
- type: ndcg_at_3
value: 62.40599999999999
- type: ndcg_at_5
value: 65.029
- type: precision_at_1
value: 51.333
- type: precision_at_10
value: 9.333
- type: precision_at_100
value: 1.0699999999999998
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.333
- type: precision_at_5
value: 17.067
- type: recall_at_1
value: 49.15
- type: recall_at_10
value: 82.533
- type: recall_at_100
value: 94.167
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 69.917
- type: recall_at_5
value: 76.356
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID-PL
type: clarin-knext/trec-covid-pl
config: default
split: test
revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd
metrics:
- type: map_at_1
value: 0.261
- type: map_at_10
value: 2.1260000000000003
- type: map_at_100
value: 12.171999999999999
- type: map_at_1000
value: 26.884999999999998
- type: map_at_3
value: 0.695
- type: map_at_5
value: 1.134
- type: mrr_at_1
value: 96.0
- type: mrr_at_10
value: 96.952
- type: mrr_at_100
value: 96.952
- type: mrr_at_1000
value: 96.952
- type: mrr_at_3
value: 96.667
- type: mrr_at_5
value: 96.667
- type: ndcg_at_1
value: 92.0
- type: ndcg_at_10
value: 81.193
- type: ndcg_at_100
value: 61.129
- type: ndcg_at_1000
value: 51.157
- type: ndcg_at_3
value: 85.693
- type: ndcg_at_5
value: 84.129
- type: precision_at_1
value: 96.0
- type: precision_at_10
value: 85.39999999999999
- type: precision_at_100
value: 62.03999999999999
- type: precision_at_1000
value: 22.224
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 88.0
- type: recall_at_1
value: 0.261
- type: recall_at_10
value: 2.262
- type: recall_at_100
value: 14.981
- type: recall_at_1000
value: 46.837
- type: recall_at_3
value: 0.703
- type: recall_at_5
value: 1.172
- task:
type: Clustering
dataset:
name: MTEB AlloProfClusteringP2P
type: lyon-nlp/alloprof
config: default
split: test
revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
metrics:
- type: v_measure
value: 70.55290063940157
- type: v_measure
value: 55.41500719337263
- task:
type: Reranking
dataset:
name: MTEB AlloprofReranking
type: lyon-nlp/mteb-fr-reranking-alloprof-s2p
config: default
split: test
revision: 666fdacebe0291776e86f29345663dfaf80a0db9
metrics:
- type: map
value: 73.48697375332002
- type: mrr
value: 75.01836585523822
- task:
type: Retrieval
dataset:
name: MTEB AlloprofRetrieval
type: lyon-nlp/alloprof
config: default
split: test
revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
metrics:
- type: map_at_1
value: 38.454
- type: map_at_10
value: 51.605000000000004
- type: map_at_100
value: 52.653000000000006
- type: map_at_1000
value: 52.697
- type: map_at_3
value: 48.304
- type: map_at_5
value: 50.073
- type: mrr_at_1
value: 43.307
- type: mrr_at_10
value: 54.400000000000006
- type: mrr_at_100
value: 55.147999999999996
- type: mrr_at_1000
value: 55.174
- type: mrr_at_3
value: 51.77
- type: mrr_at_5
value: 53.166999999999994
- type: ndcg_at_1
value: 43.307
- type: ndcg_at_10
value: 57.891000000000005
- type: ndcg_at_100
value: 62.161
- type: ndcg_at_1000
value: 63.083
- type: ndcg_at_3
value: 51.851
- type: ndcg_at_5
value: 54.605000000000004
- type: precision_at_1
value: 43.307
- type: precision_at_10
value: 9.033
- type: precision_at_100
value: 1.172
- type: precision_at_1000
value: 0.127
- type: precision_at_3
value: 22.798
- type: precision_at_5
value: 15.492
- type: recall_at_1
value: 38.454
- type: recall_at_10
value: 74.166
- type: recall_at_100
value: 92.43599999999999
- type: recall_at_1000
value: 99.071
- type: recall_at_3
value: 58.087
- type: recall_at_5
value: 64.568
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (fr)
type: mteb/amazon_reviews_multi
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 53.474
- type: f1
value: 50.38275392350236
- task:
type: Retrieval
dataset:
name: MTEB BSARDRetrieval
type: maastrichtlawtech/bsard
config: default
split: test
revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59
metrics:
- type: map_at_1
value: 2.252
- type: map_at_10
value: 4.661
- type: map_at_100
value: 5.271
- type: map_at_1000
value: 5.3629999999999995
- type: map_at_3
value: 3.604
- type: map_at_5
value: 4.3020000000000005
- type: mrr_at_1
value: 2.252
- type: mrr_at_10
value: 4.661
- type: mrr_at_100
value: 5.271
- type: mrr_at_1000
value: 5.3629999999999995
- type: mrr_at_3
value: 3.604
- type: mrr_at_5
value: 4.3020000000000005
- type: ndcg_at_1
value: 2.252
- type: ndcg_at_10
value: 6.3020000000000005
- type: ndcg_at_100
value: 10.342
- type: ndcg_at_1000
value: 13.475999999999999
- type: ndcg_at_3
value: 4.0649999999999995
- type: ndcg_at_5
value: 5.344
- type: precision_at_1
value: 2.252
- type: precision_at_10
value: 1.171
- type: precision_at_100
value: 0.333
- type: precision_at_1000
value: 0.059000000000000004
- type: precision_at_3
value: 1.802
- type: precision_at_5
value: 1.712
- type: recall_at_1
value: 2.252
- type: recall_at_10
value: 11.712
- type: recall_at_100
value: 33.333
- type: recall_at_1000
value: 59.458999999999996
- type: recall_at_3
value: 5.405
- type: recall_at_5
value: 8.559
- task:
type: Clustering
dataset:
name: MTEB HALClusteringS2S
type: lyon-nlp/clustering-hal-s2s
config: default
split: test
revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915
metrics:
- type: v_measure
value: 28.301882091023288
- task:
type: Clustering
dataset:
name: MTEB MLSUMClusteringP2P
type: mlsum
config: default
split: test
revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7
metrics:
- type: v_measure
value: 45.26992995191701
- type: v_measure
value: 42.773174876871145
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (fr)
type: mteb/mtop_domain
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.47635452552458
- type: f1
value: 93.19922617577213
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (fr)
type: mteb/mtop_intent
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 80.2317569683683
- type: f1
value: 56.18060418621901
- task:
type: Classification
dataset:
name: MTEB MasakhaNEWSClassification (fra)
type: masakhane/masakhanews
config: fra
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: accuracy
value: 85.18957345971565
- type: f1
value: 80.829981537394
- task:
type: Clustering
dataset:
name: MTEB MasakhaNEWSClusteringP2P (fra)
type: masakhane/masakhanews
config: fra
split: test
revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
metrics:
- type: v_measure
value: 71.04138999801822
- type: v_measure
value: 71.7056263158008
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (fr)
type: mteb/amazon_massive_intent
config: fr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 76.65097511768661
- type: f1
value: 73.82441070598712
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (fr)
type: mteb/amazon_massive_scenario
config: fr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 79.09885675857431
- type: f1
value: 78.28407777434224
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (fr)
type: jinaai/mintakaqa
config: fr
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: map_at_1
value: 25.307000000000002
- type: map_at_10
value: 36.723
- type: map_at_100
value: 37.713
- type: map_at_1000
value: 37.769000000000005
- type: map_at_3
value: 33.77
- type: map_at_5
value: 35.463
- type: mrr_at_1
value: 25.307000000000002
- type: mrr_at_10
value: 36.723
- type: mrr_at_100
value: 37.713
- type: mrr_at_1000
value: 37.769000000000005
- type: mrr_at_3
value: 33.77
- type: mrr_at_5
value: 35.463
- type: ndcg_at_1
value: 25.307000000000002
- type: ndcg_at_10
value: 42.559999999999995
- type: ndcg_at_100
value: 47.457
- type: ndcg_at_1000
value: 49.162
- type: ndcg_at_3
value: 36.461
- type: ndcg_at_5
value: 39.504
- type: precision_at_1
value: 25.307000000000002
- type: precision_at_10
value: 6.106
- type: precision_at_100
value: 0.8420000000000001
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 14.741999999999999
- type: precision_at_5
value: 10.319
- type: recall_at_1
value: 25.307000000000002
- type: recall_at_10
value: 61.056999999999995
- type: recall_at_100
value: 84.152
- type: recall_at_1000
value: 98.03399999999999
- type: recall_at_3
value: 44.226
- type: recall_at_5
value: 51.597
- task:
type: PairClassification
dataset:
name: MTEB OpusparcusPC (fr)
type: GEM/opusparcus
config: fr
split: test
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
metrics:
- type: cos_sim_accuracy
value: 99.90069513406156
- type: cos_sim_ap
value: 100.0
- type: cos_sim_f1
value: 99.95032290114257
- type: cos_sim_precision
value: 100.0
- type: cos_sim_recall
value: 99.90069513406156
- type: dot_accuracy
value: 99.90069513406156
- type: dot_ap
value: 100.0
- type: dot_f1
value: 99.95032290114257
- type: dot_precision
value: 100.0
- type: dot_recall
value: 99.90069513406156
- type: euclidean_accuracy
value: 99.90069513406156
- type: euclidean_ap
value: 100.0
- type: euclidean_f1
value: 99.95032290114257
- type: euclidean_precision
value: 100.0
- type: euclidean_recall
value: 99.90069513406156
- type: manhattan_accuracy
value: 99.90069513406156
- type: manhattan_ap
value: 100.0
- type: manhattan_f1
value: 99.95032290114257
- type: manhattan_precision
value: 100.0
- type: manhattan_recall
value: 99.90069513406156
- type: max_accuracy
value: 99.90069513406156
- type: max_ap
value: 100.0
- type: max_f1
value: 99.95032290114257
- task:
type: PairClassification
dataset:
name: MTEB PawsX (fr)
type: paws-x
config: fr
split: test
revision: 8a04d940a42cd40658986fdd8e3da561533a3646
metrics:
- type: cos_sim_accuracy
value: 70.8
- type: cos_sim_ap
value: 73.7671529695957
- type: cos_sim_f1
value: 68.80964339527875
- type: cos_sim_precision
value: 62.95955882352941
- type: cos_sim_recall
value: 75.85825027685493
- type: dot_accuracy
value: 70.8
- type: dot_ap
value: 73.78345265366947
- type: dot_f1
value: 68.80964339527875
- type: dot_precision
value: 62.95955882352941
- type: dot_recall
value: 75.85825027685493
- type: euclidean_accuracy
value: 70.8
- type: euclidean_ap
value: 73.7671529695957
- type: euclidean_f1
value: 68.80964339527875
- type: euclidean_precision
value: 62.95955882352941
- type: euclidean_recall
value: 75.85825027685493
- type: manhattan_accuracy
value: 70.75
- type: manhattan_ap
value: 73.78996383615953
- type: manhattan_f1
value: 68.79432624113475
- type: manhattan_precision
value: 63.39869281045751
- type: manhattan_recall
value: 75.1937984496124
- type: max_accuracy
value: 70.8
- type: max_ap
value: 73.78996383615953
- type: max_f1
value: 68.80964339527875
- task:
type: STS
dataset:
name: MTEB SICKFr
type: Lajavaness/SICK-fr
config: default
split: test
revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a
metrics:
- type: cos_sim_pearson
value: 84.03253762760392
- type: cos_sim_spearman
value: 79.68280105762004
- type: euclidean_pearson
value: 80.98265050044444
- type: euclidean_spearman
value: 79.68233242682867
- type: manhattan_pearson
value: 80.9678911810704
- type: manhattan_spearman
value: 79.70264097683109
- task:
type: STS
dataset:
name: MTEB STS22 (fr)
type: mteb/sts22-crosslingual-sts
config: fr
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 80.56896987572884
- type: cos_sim_spearman
value: 81.84352499523287
- type: euclidean_pearson
value: 80.40831759421305
- type: euclidean_spearman
value: 81.84352499523287
- type: manhattan_pearson
value: 80.74333857561238
- type: manhattan_spearman
value: 82.41503246733892
- task:
type: STS
dataset:
name: MTEB STSBenchmarkMultilingualSTS (fr)
type: stsb_multi_mt
config: fr
split: test
revision: 93d57ef91790589e3ce9c365164337a8a78b7632
metrics:
- type: cos_sim_pearson
value: 82.71826762276979
- type: cos_sim_spearman
value: 82.25433354916042
- type: euclidean_pearson
value: 81.87115571724316
- type: euclidean_spearman
value: 82.25322342890107
- type: manhattan_pearson
value: 82.11174867527224
- type: manhattan_spearman
value: 82.55905365203084
- task:
type: Summarization
dataset:
name: MTEB SummEvalFr
type: lyon-nlp/summarization-summeval-fr-p2p
config: default
split: test
revision: b385812de6a9577b6f4d0f88c6a6e35395a94054
metrics:
- type: cos_sim_pearson
value: 30.659441623392887
- type: cos_sim_spearman
value: 30.501134097353315
- type: dot_pearson
value: 30.659444768851056
- type: dot_spearman
value: 30.501134097353315
- task:
type: Reranking
dataset:
name: MTEB SyntecReranking
type: lyon-nlp/mteb-fr-reranking-syntec-s2p
config: default
split: test
revision: b205c5084a0934ce8af14338bf03feb19499c84d
metrics:
- type: map
value: 94.03333333333333
- type: mrr
value: 94.03333333333333
- task:
type: Retrieval
dataset:
name: MTEB SyntecRetrieval
type: lyon-nlp/mteb-fr-retrieval-syntec-s2p
config: default
split: test
revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff
metrics:
- type: map_at_1
value: 79.0
- type: map_at_10
value: 87.61
- type: map_at_100
value: 87.655
- type: map_at_1000
value: 87.655
- type: map_at_3
value: 87.167
- type: map_at_5
value: 87.36699999999999
- type: mrr_at_1
value: 79.0
- type: mrr_at_10
value: 87.61
- type: mrr_at_100
value: 87.655
- type: mrr_at_1000
value: 87.655
- type: mrr_at_3
value: 87.167
- type: mrr_at_5
value: 87.36699999999999
- type: ndcg_at_1
value: 79.0
- type: ndcg_at_10
value: 90.473
- type: ndcg_at_100
value: 90.694
- type: ndcg_at_1000
value: 90.694
- type: ndcg_at_3
value: 89.464
- type: ndcg_at_5
value: 89.851
- type: precision_at_1
value: 79.0
- type: precision_at_10
value: 9.9
- type: precision_at_100
value: 1.0
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 32.0
- type: precision_at_5
value: 19.400000000000002
- type: recall_at_1
value: 79.0
- type: recall_at_10
value: 99.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: recall_at_3
value: 96.0
- type: recall_at_5
value: 97.0
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fr)
type: jinaai/xpqa
config: fr
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: map_at_1
value: 39.395
- type: map_at_10
value: 59.123999999999995
- type: map_at_100
value: 60.704
- type: map_at_1000
value: 60.760000000000005
- type: map_at_3
value: 53.187
- type: map_at_5
value: 56.863
- type: mrr_at_1
value: 62.083
- type: mrr_at_10
value: 68.87299999999999
- type: mrr_at_100
value: 69.46900000000001
- type: mrr_at_1000
value: 69.48299999999999
- type: mrr_at_3
value: 66.8
- type: mrr_at_5
value: 67.928
- type: ndcg_at_1
value: 62.083
- type: ndcg_at_10
value: 65.583
- type: ndcg_at_100
value: 70.918
- type: ndcg_at_1000
value: 71.72800000000001
- type: ndcg_at_3
value: 60.428000000000004
- type: ndcg_at_5
value: 61.853
- type: precision_at_1
value: 62.083
- type: precision_at_10
value: 15.033
- type: precision_at_100
value: 1.9529999999999998
- type: precision_at_1000
value: 0.207
- type: precision_at_3
value: 36.315
- type: precision_at_5
value: 25.955000000000002
- type: recall_at_1
value: 39.395
- type: recall_at_10
value: 74.332
- type: recall_at_100
value: 94.729
- type: recall_at_1000
value: 99.75500000000001
- type: recall_at_3
value: 57.679
- type: recall_at_5
value: 65.036
---
## gte-Qwen2-1.5B-instruct
**gte-Qwen2-1.5B-instruct** is the latest model in the gte (General Text Embedding) model family. The model is built on the [Qwen2-1.5B](https://huggingface.co/Qwen/Qwen2-1.5B) LLM and uses the same training data and strategies as the [gte-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) model.
The model incorporates several key advancements:
- Integration of bidirectional attention mechanisms, enriching its contextual understanding.
- Instruction tuning, applied solely on the query side for streamlined efficiency.
- Comprehensive training across a vast, multilingual text corpus spanning diverse domains and scenarios. This training leverages both weakly supervised and supervised data, ensuring the model's applicability across numerous languages and a wide array of downstream tasks.
## Model Information
- Model Size: 1.5B
- Embedding Dimension: 1536
- Max Input Tokens: 32k
## Requirements
```
transformers>=4.39.2
flash_attn>=2.5.6
```
## Usage
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)
# In case you want to reduce the maximum length:
model.max_seq_length = 8192
queries = [
"how much protein should a female eat",
"summit define",
]
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
query_embeddings = model.encode(queries, prompt_name="query")
document_embeddings = model.encode(documents)
scores = (query_embeddings @ document_embeddings.T) * 100
print(scores.tolist())
```
See [config_sentence_transformers.json](config_sentence_transformers.json) for all pre-built prompt names. Otherwise, you can use `model.encode(queries, prompt="Instruct: ...\nQuery: ")` with a custom prompt of your choice.
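For example, a minimal sketch of encoding with a custom retrieval instruction (the task description below is simply the one used in the Transformers example further down; any other one-sentence instruction can be substituted):
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-1.5B-instruct", trust_remote_code=True)

# The custom instruction is prepended to every query; documents are encoded without a prompt.
task = "Given a web search query, retrieve relevant passages that answer the query"
query_embeddings = model.encode(
    ["how much protein should a female eat"],
    prompt=f"Instruct: {task}\nQuery: ",
)
print(query_embeddings.shape)  # (1, 1536), matching the embedding dimension listed above
```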
### Transformers
```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def last_token_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
if left_padding:
return last_hidden_states[:, -1]
else:
sequence_lengths = attention_mask.sum(dim=1) - 1
batch_size = last_hidden_states.shape[0]
return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]
def get_detailed_instruct(task_description: str, query: str) -> str:
return f'Instruct: {task_description}\nQuery: {query}'
# Each query must come with a one-sentence instruction that describes the task
task = 'Given a web search query, retrieve relevant passages that answer the query'
queries = [
get_detailed_instruct(task, 'how much protein should a female eat'),
get_detailed_instruct(task, 'summit define')
]
# No need to add instruction for retrieval documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
input_texts = queries + documents
tokenizer = AutoTokenizer.from_pretrained('Alibaba-NLP/gte-Qwen2-1.5B-instruct', trust_remote_code=True)
model = AutoModel.from_pretrained('Alibaba-NLP/gte-Qwen2-1.5B-instruct', trust_remote_code=True)
max_length = 8192
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
## Evaluation
### MTEB & C-MTEB
You can use the [scripts/eval_mteb.py](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct/blob/main/scripts/eval_mteb.py) to reproduce the following results of **gte-Qwen2-1.5B-instruct** on MTEB (English) and C-MTEB (Chinese):
| Model Name | MTEB(56) | C-MTEB(35) | MTEB-fr(26) | MTEB-pl(26) |
|:----:|:---------:|:----------:|:----------:|:----------:|
| [bge-base-en-1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 | - | - | - |
| [bge-large-en-1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 | - | - | - |
| [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 | - | - | - |
| [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 | - | - | - |
| [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 | - | - | - |
| [acge_text_embedding](https://huggingface.co/aspire/acge_text_embedding) | - | 69.07 | - | - |
| [stella-mrl-large-zh-v3.5-1792d](https://huggingface.co/infgrad/stella-mrl-large-zh-v3.5-1792d) | - | 68.55 | - | - |
| [gte-large-zh](https://huggingface.co/thenlper/gte-large-zh) | - | 66.72 | - | - |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 | 56.21 | - | - |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 | 58.81 | - | - |
| [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 | 60.81 | - | - |
| [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 | 69.52 | - | - |
| [NV-Embed-v1](https://huggingface.co/nvidia/NV-Embed-v1) | 69.32 | - | - | - |
| [**gte-Qwen2-7B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | **70.24** | **72.05** | **68.25** | **67.86** |
| [**gte-Qwen2-1.5B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) | **67.16** | **67.65** | **66.60** | **64.04** |
### GTE Models
The gte series has consistently released two types of models: encoder-only models (based on the BERT architecture) and decoder-only models (based on the LLM architecture).
| Models | Language | Max Sequence Length | Dimension | Model Size (Memory Usage, fp32) |
|:-------------------------------------------------------------------------------------:|:--------:|:-----: |:---------:|:-------------------------------:|
| [GTE-large-zh](https://huggingface.co/thenlper/gte-large-zh) | Chinese | 512 | 1024 | 1.25GB |
| [GTE-base-zh](https://huggingface.co/thenlper/gte-base-zh) | Chinese | 512 | 512 | 0.41GB |
| [GTE-small-zh](https://huggingface.co/thenlper/gte-small-zh) | Chinese | 512 | 512 | 0.12GB |
| [GTE-large](https://huggingface.co/thenlper/gte-large) | English | 512 | 1024 | 1.25GB |
| [GTE-base](https://huggingface.co/thenlper/gte-base) | English | 512 | 512 | 0.21GB |
| [GTE-small](https://huggingface.co/thenlper/gte-small) | English | 512 | 384 | 0.10GB |
| [GTE-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | English | 8192 | 1024 | 1.74GB |
| [GTE-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | English | 8192 | 768 | 0.51GB |
| [GTE-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | Multilingual | 32000 | 4096 | 26.45GB |
| [GTE-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | Multilingual | 32000 | 3584 | 26.45GB |
| [GTE-Qwen2-1.5B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) | Multilingual | 32000 | 1536 | 6.62GB |
## Cloud API Services
In addition to the open-source [GTE](https://huggingface.co/collections/Alibaba-NLP/gte-models-6680f0b13f885cb431e6d469) series models, the GTE series is also available as commercial API services on Alibaba Cloud.
- [Embedding Models](https://help.aliyun.com/zh/model-studio/developer-reference/general-text-embedding/): Three versions of the text embedding models are available: text-embedding-v1/v2/v3, with v3 being the latest API service.
- [ReRank Models](https://help.aliyun.com/zh/model-studio/developer-reference/general-text-sorting-model/): The gte-rerank model service is available.
Note that the models behind the commercial APIs are not entirely identical to the open-source models.
## Citation
If you find our paper or models helpful, please consider citing:
```
@article{li2023towards,
title={Towards general text embeddings with multi-stage contrastive learning},
author={Li, Zehan and Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Pengjun and Zhang, Meishan},
journal={arXiv preprint arXiv:2308.03281},
year={2023}
}
```
| [
"SUMMARIZATION"
] | [
"BIOSSES",
"SCIFACT"
] |
EleutherAI/pythia-2.8b-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-11-20T03:56:10 | 2023-07-10T01:35:41 | 629 | 5 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-2.8B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-2.8B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-2.8B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-2.8B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-2.8B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-2.8B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-2.8B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-2.8B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
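The same pattern applies to this card's model; a minimal sketch that loads the final checkpoint from the `main` branch (no `revision` argument needed):
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Final (main-branch) checkpoint of Pythia-2.8B
model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-2.8b-v0")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-2.8b-v0")

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
print(tokenizer.decode(tokens[0]))
```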
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-2.8B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
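As a quick consistency check of the figures quoted above (no new numbers, just the stated ones multiplied out):
```python
# 143,000 steps at 2,097,152 tokens per step equals the quoted 299,892,736,000 tokens,
# and 143 checkpoints spaced every 2,097,152,000 tokens cover the same total.
assert 143_000 * 2_097_152 == 299_892_736_000
assert 143 * 2_097_152_000 == 299_892_736_000
```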
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
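As a rough sketch of how such an evaluation could be reproduced, assuming a recent version of the harness whose Python API exposes `simple_evaluate` (the interface used for the original runs may differ):
```python
import lm_eval

# Evaluate this model on two of the plotted tasks; results["results"] is a dict of per-task metrics.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-2.8b-v0",
    tasks=["lambada_openai", "piqa"],
)
print(results["results"])
```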
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-410m-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T18:39:39 | 2023-07-10T01:34:52 | 623 | 7 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-410M
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-410M for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-410M as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-410M has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-410M will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-410M to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-410M may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-410M.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-410M.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
IVN-RIN/bioBIT | IVN-RIN | fill-mask | [
"transformers",
"pytorch",
"safetensors",
"bert",
"fill-mask",
"Biomedical Language Modeling",
"it",
"dataset:IVN-RIN/BioBERT_Italian",
"arxiv:1901.08746",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-12-03T11:17:50 | 2024-05-24T11:57:03 | 623 | 1 | ---
datasets:
- IVN-RIN/BioBERT_Italian
language:
- it
tags:
- Biomedical Language Modeling
widget:
- text: L'asma allergica è una patologia dell'[MASK] respiratorio causata dalla presenza
di allergeni responsabili dell'infiammazione dell'albero bronchiale.
example_title: Example 1
- text: Il pancreas produce diversi [MASK] molto importanti tra i quali l'insulina
e il glucagone.
example_title: Example 2
- text: Il GABA è un amminoacido ed è il principale neurotrasmettitore inibitorio
del [MASK].
example_title: Example 3
---
🤗 + 📚🩺🇮🇹 = **BioBIT**
From this repository you can download the **BioBIT** (Biomedical Bert for ITalian) checkpoint.
**BioBIT** stems from [Italian XXL BERT](https://huggingface.co/dbmdz/bert-base-italian-xxl-cased), obtained from a recent Wikipedia dump and various Italian texts from the OPUS and OSCAR corpora collections, summing to a final corpus size of 81 GB and 13B tokens.
To pretrain **BioBIT**, we followed the general approach outlined in [BioBERT paper](https://arxiv.org/abs/1901.08746), built on the foundation of the BERT architecture. The pretraining objective is a combination of **MLM** (Masked Language Modelling) and **NSP** (Next Sentence Prediction). The MLM objective is based on randomly masking 15% of the input sequence, trying then to predict the missing tokens; for the NSP objective, instead, the model is given a couple of sentences and has to guess if the second comes after the first in the original document.
Since no Italian equivalent exists for the millions of abstracts and full-text scientific papers used by English BERT-based biomedical models, in this work we leveraged machine translation to obtain an Italian biomedical corpus based on PubMed abstracts and train **BioBIT**. More details are in the paper.
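Because the MLM objective is exactly what drives the fill-mask widget examples above, here is a minimal usage sketch with the `transformers` fill-mask pipeline (the sentence is the first widget example from this card):
```python
from transformers import pipeline

# Fill-mask pipeline with BioBIT; [MASK] marks the token to predict.
fill_mask = pipeline("fill-mask", model="IVN-RIN/bioBIT")
predictions = fill_mask(
    "L'asma allergica è una patologia dell'[MASK] respiratorio causata dalla presenza "
    "di allergeni responsabili dell'infiammazione dell'albero bronchiale."
)
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```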
**BioBIT** has been evaluated on 3 downstream tasks: **NER** (Named Entity Recognition), extractive **QA** (Question Answering), **RE** (Relation Extraction).
Here are the results, summarized:
- NER:
- [BC2GM](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb32) = 82.14%
- [BC4CHEMD](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb35) = 80.70%
- [BC5CDR(CDR)](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb31) = 82.15%
- [BC5CDR(DNER)](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb31) = 76.27%
- [NCBI_DISEASE](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb33) = 65.06%
- [SPECIES-800](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb34) = 61.86%
- QA:
- [BioASQ 4b](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb30) = 68.49%
- [BioASQ 5b](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb30) = 78.33%
- [BioASQ 6b](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb30) = 75.73%
- RE:
- [CHEMPROT](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb36) = 38.16%
- [BioRED](http://refhub.elsevier.com/S1532-0464(23)00152-1/sb37) = 67.15%
[Check the full paper](https://www.sciencedirect.com/science/article/pii/S1532046423001521) for further details, and feel free to contact us if you have some inquiry! | [
"NAMED_ENTITY_RECOGNITION",
"RELATION_EXTRACTION",
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"BC5CDR",
"BIORED",
"CHEMPROT",
"NCBI DISEASE"
] |
RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf | RichardErkhov | null | [
"gguf",
"arxiv:2303.11897",
"endpoints_compatible",
"region:us"
] | 2024-08-26T16:48:05 | 2024-08-26T19:23:32 | 615 | 0 | ---
{}
---
Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
llama2_tifa_question_generation - GGUF
- Model creator: https://huggingface.co/tifa-benchmark/
- Original model: https://huggingface.co/tifa-benchmark/llama2_tifa_question_generation/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [llama2_tifa_question_generation.Q2_K.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q2_K.gguf) | Q2_K | 2.36GB |
| [llama2_tifa_question_generation.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.IQ3_XS.gguf) | IQ3_XS | 2.6GB |
| [llama2_tifa_question_generation.IQ3_S.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.IQ3_S.gguf) | IQ3_S | 2.75GB |
| [llama2_tifa_question_generation.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q3_K_S.gguf) | Q3_K_S | 2.75GB |
| [llama2_tifa_question_generation.IQ3_M.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.IQ3_M.gguf) | IQ3_M | 2.9GB |
| [llama2_tifa_question_generation.Q3_K.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q3_K.gguf) | Q3_K | 3.07GB |
| [llama2_tifa_question_generation.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q3_K_M.gguf) | Q3_K_M | 3.07GB |
| [llama2_tifa_question_generation.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q3_K_L.gguf) | Q3_K_L | 3.35GB |
| [llama2_tifa_question_generation.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.IQ4_XS.gguf) | IQ4_XS | 3.4GB |
| [llama2_tifa_question_generation.Q4_0.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q4_0.gguf) | Q4_0 | 3.56GB |
| [llama2_tifa_question_generation.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.IQ4_NL.gguf) | IQ4_NL | 3.58GB |
| [llama2_tifa_question_generation.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q4_K_S.gguf) | Q4_K_S | 3.59GB |
| [llama2_tifa_question_generation.Q4_K.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q4_K.gguf) | Q4_K | 3.8GB |
| [llama2_tifa_question_generation.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q4_K_M.gguf) | Q4_K_M | 3.8GB |
| [llama2_tifa_question_generation.Q4_1.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q4_1.gguf) | Q4_1 | 3.95GB |
| [llama2_tifa_question_generation.Q5_0.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q5_0.gguf) | Q5_0 | 4.33GB |
| [llama2_tifa_question_generation.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q5_K_S.gguf) | Q5_K_S | 4.33GB |
| [llama2_tifa_question_generation.Q5_K.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q5_K.gguf) | Q5_K | 4.45GB |
| [llama2_tifa_question_generation.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q5_K_M.gguf) | Q5_K_M | 4.45GB |
| [llama2_tifa_question_generation.Q5_1.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q5_1.gguf) | Q5_1 | 4.72GB |
| [llama2_tifa_question_generation.Q6_K.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q6_K.gguf) | Q6_K | 5.15GB |
| [llama2_tifa_question_generation.Q8_0.gguf](https://huggingface.co/RichardErkhov/tifa-benchmark_-_llama2_tifa_question_generation-gguf/blob/main/llama2_tifa_question_generation.Q8_0.gguf) | Q8_0 | 6.67GB |
Original model description:
---
license: apache-2.0
inference: true
widget:
- text: "<s>[INST] <<SYS>>\nGiven an image description, generate one or two multiple-choice questions that verifies if the image description is correct.\nClassify each concept into a type (object, human, animal, food, activity, attribute, counting, color, material, spatial, location, shape, other), and then generate a question for each type.\n\n<</SYS>>\n\nDescription: a blue rabbit and a red plane [/INST] Entities:"
pipeline_tag: text-generation
tags:
- text-generation-inference
- llama2
- text-to-image
datasets:
- TIFA
language:
- en
---
Project page: <https://tifa-benchmark.github.io/>
This is the text parsing and question generation model for the ICCV 2023 paper [TIFA: Accurate and Interpretable Text-to-Image Faithfulness Evaluation with Question Answering](https://arxiv.org/abs/2303.11897)
We introduce TIFA (Text-to-Image Faithfulness evaluation with question Answering), an automatic evaluation metric that measures the faithfulness of a generated image to its text input via visual question answering (VQA). Specifically, given a text input, we automatically generate several question-answer pairs using a language model. We calculate image faithfulness by checking whether existing VQA models can answer these questions using the generated image.
Specifically, this fine-tuned LLaMA 2 model is the substitute for the GPT-3 model in the paper. It can parse an arbitrary prompt into visual entities, attributes, relations, etc. and generate question-answer tuples for each of them. See examples below.
# QuickStart
All codes are from <https://github.com/Yushi-Hu/tifa>. Clone this repo to easily use this model together with other modules (e.g. VQA) provided in TIFA.
Please follow the prompt format, which will give the best performance.
```python
import torch
import transformers
# prepare the LLaMA 2 model
model_name = "tifa-benchmark/llama2_tifa_question_generation"
pipeline = transformers.pipeline(
"text-generation",
model=model_name,
torch_dtype=torch.float16,
device_map="auto",
)
# formating prompt following LLaMA 2 style
def create_qg_prompt(caption):
INTRO_BLURB = "Given an image description, generate one or two multiple-choice questions that verifies if the image description is correct.\nClassify each concept into a type (object, human, animal, food, activity, attribute, counting, color, material, spatial, location, shape, other), and then generate a question for each type.\n"
formated_prompt = f"<s>[INST] <<SYS>>\n{INTRO_BLURB}\n<</SYS>>\n\n"
formated_prompt += f"Description: {caption} [/INST] Entities:"
return formated_prompt
test_caption = "a blue rabbit and a red plane"
# create prompt
prompt = create_qg_prompt(test_caption)
# text completion
sequences = pipeline(
prompt, do_sample=False, num_beams=5, num_return_sequences=1, max_length=512)
output = sequences[0]['generated_text'][len(prompt):]
output = output.split('\n\n')[0]
# output
print(output)
#### Expected output ###
# rabbit, plane
# Activites:
# Colors: blue, red
# Counting:
# Other attributes:
# About rabbit (animal):
# Q: is this a rabbit?
# Choices: yes, no
# A: yes
# About rabbit (animal):
# Q: what animal is in the picture?
# Choices: rabbit, dog, cat, fish
# A: rabbit
# About plane (object):
# Q: is this a plane?
# Choices: yes, no
# A: yes
# About plane (object):
# Q: what type of vehicle is this?
# Choices: plane, car, motorcycle, bus
# A: plane
# About blue (color):
# Q: is the rabbit blue?
# Choices: yes, no
# A: yes
# About blue (color):
# Q: what color is the rabbit?
# Choices: blue, red, yellow, green
# A: blue
# About red (color):
# Q: is the plane red?
# Choices: yes, no
# A: yes
# About red (color):
# Q: what color is the plane?
# Choices: red, blue, yellow, green
# A: red
```
# Use this LM under tifascore package
The tifascore package provides extra functions, e.g. for parsing this output. First install tifascore according to <https://github.com/Yushi-Hu/tifa>; then usage is as below:
```python
from tifascore import get_llama2_pipeline, get_llama2_question_and_answers
pipeline = get_llama2_pipeline("tifa-benchmark/llama2_tifa_question_generation")
print(get_llama2_question_and_answers(pipeline, "a blue rabbit and a red plane"))
#### Expected output ###
# [{'caption': 'a blue rabbit and a red plane', 'element': 'rabbit', 'question': 'what animal is in the picture?', 'choices': ['rabbit', 'dog', 'cat', 'fish'], 'answer': 'rabbit', 'element_type': 'animal/human'}, {'caption': 'a blue rabbit and a red plane', 'element': 'plane', 'question': 'is this a plane?', 'choices': ['yes', 'no'], 'answer': 'yes', 'element_type': 'object'}, {'caption': 'a blue rabbit and a red plane', 'element': 'plane', 'question': 'what type of vehicle is this?', 'choices': ['plane', 'car', 'motorcycle', 'bus'], 'answer': 'plane', 'element_type': 'object'}, {'caption': 'a blue rabbit and a red plane', 'element': 'blue', 'question': 'is the rabbit blue?', 'choices': ['yes', 'no'], 'answer': 'yes', 'element_type': 'color'}, {'caption': 'a blue rabbit and a red plane', 'element': 'blue', 'question': 'what color is the rabbit?', 'choices': ['blue', 'red', 'yellow', 'green'], 'answer': 'blue', 'element_type': 'color'}, {'caption': 'a blue rabbit and a red plane', 'element': 'red', 'question': 'is the plane red?', 'choices': ['yes', 'no'], 'answer': 'yes', 'element_type': 'color'}, {'caption': 'a blue rabbit and a red plane', 'element': 'red', 'question': 'what color is the plane?', 'choices': ['red', 'blue', 'yellow', 'green'], 'answer': 'red', 'element_type': 'color'}]
```
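Once you have these question-answer tuples for a prompt, the TIFA score of a generated image is simply the fraction of questions a VQA model answers correctly. A minimal sketch of that scoring step (the `vqa` callable is a placeholder for whichever VQA module from the TIFA repo you use, not its actual API):
```python
def tifa_score(qa_pairs, image, vqa):
    """Fraction of generated questions answered correctly on the image.

    qa_pairs: list of dicts as returned by get_llama2_question_and_answers
    image:    the generated image being evaluated
    vqa:      placeholder callable (image, question, choices) -> answer string
    """
    if not qa_pairs:
        return 0.0
    correct = 0
    for qa in qa_pairs:
        prediction = vqa(image, qa["question"], qa["choices"])
        correct += int(prediction.strip().lower() == qa["answer"].strip().lower())
    return correct / len(qa_pairs)
```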
## Bibtex
```
@article{hu2023tifa,
title={Tifa: Accurate and interpretable text-to-image faithfulness evaluation with question answering},
author={Hu, Yushi and Liu, Benlin and Kasai, Jungo and Wang, Yizhong and Ostendorf, Mari and Krishna, Ranjay and Smith, Noah A},
journal={arXiv preprint arXiv:2303.11897},
year={2023}
}
```
| [
"QUESTION_ANSWERING"
] | [
"BLURB"
] |
EleutherAI/pythia-70m-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-11-01T00:24:53 | 2023-07-10T01:32:46 | 613 | 8 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-70M-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-70M-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-70M-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-70M-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-70M-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-70M-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-70M-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-70M-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-70M-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
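As a concrete illustration of the checkpoint schedule above, the sketch below picks an intermediate branch and works out roughly how many tokens that checkpoint had seen (the step value is only an example):
```python
from transformers import GPTNeoXForCausalLM

# Checkpoints are exposed as branches named "step<N>" (main == step143000) and
# are spaced every 2,097,152,000 tokens, i.e. every 1000 steps at the 2M-token
# batch size used for this model.
step = 3000                      # example intermediate checkpoint
tokens_seen = step * 2_097_152   # tokens per step at a 2M-token batch size
print(f"step{step}: ~{tokens_seen:,} tokens seen")  # ~6,291,456,000 tokens

model = GPTNeoXForCausalLM.from_pretrained(
    "EleutherAI/pythia-70m-deduped",
    revision=f"step{step}",
)
```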
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-160m-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T17:40:11 | 2023-07-09T16:03:26 | 596 | 8 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-160M
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-160M for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-160M as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-160M has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-160M will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-160M to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-160M may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-160M.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
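The snippet above uses `generate`'s defaults, which produce a very short greedy continuation. Standard Transformers generation arguments can be passed as well; a small sketch continuing from the code above (parameter values are illustrative):
```python
# Continuing from the snippet above: a longer, sampled continuation.
tokens = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the continuation
    do_sample=True,                       # sample instead of greedy decoding
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,  # avoid the missing-pad-token warning
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```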
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-160M.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
THUDM/cogvlm2-video-llama3-chat | THUDM | text-generation | [
"transformers",
"safetensors",
"text-generation",
"chat",
"cogvlm2",
"cogvlm--video",
"conversational",
"custom_code",
"en",
"license:other",
"autotrain_compatible",
"region:us"
] | 2024-07-03T02:21:55 | 2024-07-24T09:53:20 | 589 | 44 | ---
language:
- en
license: other
license_name: cogvlm2
license_link: https://huggingface.co/THUDM/cogvlm2-video-llama3-chat/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- chat
- cogvlm2
- cogvlm--video
inference: false
---
# CogVLM2-Video-Llama3-Chat
[Chinese version of this README](README_zh.md)
## Introduction
CogVLM2-Video achieves state-of-the-art performance on multiple video question answering tasks. It can achieve video
understanding within one minute. We provide two example videos to demonstrate CogVLM2-Video's video understanding and
video temporal grounding capabilities.
<table>
<tr>
<td>
<video width="100%" controls>
<source src="https://github.com/THUDM/CogVLM2/raw/main/resources/videos/lion.mp4" type="video/mp4">
</video>
</td>
<td>
<video width="100%" controls>
<source src="https://github.com/THUDM/CogVLM2/raw/main/resources/videos/basketball.mp4" type="video/mp4">
</video>
</td>
</tr>
</table>
## Benchmark
The following diagram shows the performance of CogVLM2-Video on
the [MVBench](https://github.com/OpenGVLab/Ask-Anything), [VideoChatGPT-Bench](https://github.com/mbzuai-oryx/Video-ChatGPT)
and Zero-shot VideoQA datasets (MSVD-QA, MSRVTT-QA, ActivityNet-QA). Here, VCG-* refers to VideoChatGPT-Bench, ZS-*
refers to the Zero-shot VideoQA datasets, and MV-* refers to the main categories of MVBench.

Performance on VideoChatGPT-Bench and the Zero-shot VideoQA datasets:
| Models | VCG-AVG | VCG-CI | VCG-DO | VCG-CU | VCG-TU | VCG-CO | ZS-AVG |
|-----------------------|----------|----------|----------|----------|----------|----------|-----------|
| IG-VLM GPT4V | 3.17 | 3.40 | 2.80 | 3.61 | 2.89 | 3.13 | 65.70 |
| ST-LLM | 3.15 | 3.23 | 3.05 | 3.74 | 2.93 | 2.81 | 62.90 |
| ShareGPT4Video | N/A | N/A | N/A | N/A | N/A | N/A | 46.50 |
| VideoGPT+ | 3.28 | 3.27 | 3.18 | 3.74 | 2.83 | **3.39** | 61.20 |
| VideoChat2_HD_mistral | 3.10 | 3.40 | 2.91 | 3.72 | 2.65 | 2.84 | 57.70 |
| PLLaVA-34B | 3.32 | **3.60** | 3.20 | **3.90** | 2.67 | 3.25 | **68.10** |
| CogVLM2-Video | **3.41** | 3.49 | **3.46** | 3.87 | **2.98** | 3.23 | 66.60 |
Performance on MVBench dataset:
| Models | AVG | AA | AC | AL | AP | AS | CO | CI | EN | ER | FA | FP | MA | MC | MD | OE | OI | OS | ST | SC | UA |
|-----------------------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|
| IG-VLM GPT4V | 43.7 | 72.0 | 39.0 | 40.5 | 63.5 | 55.5 | 52.0 | 11.0 | 31.0 | 59.0 | 46.5 | 47.5 | 22.5 | 12.0 | 12.0 | 18.5 | 59.0 | 29.5 | 83.5 | 45.0 | 73.5 |
| ST-LLM | 54.9 | 84.0 | 36.5 | 31.0 | 53.5 | 66.0 | 46.5 | 58.5 | 34.5 | 41.5 | 44.0 | 44.5 | 78.5 | 56.5 | 42.5 | 80.5 | 73.5 | 38.5 | 86.5 | 43.0 | 58.5 |
| ShareGPT4Video | 51.2 | 79.5 | 35.5 | 41.5 | 39.5 | 49.5 | 46.5 | 51.5 | 28.5 | 39.0 | 40.0 | 25.5 | 75.0 | 62.5 | 50.5 | 82.5 | 54.5 | 32.5 | 84.5 | 51.0 | 54.5 |
| VideoGPT+ | 58.7 | 83.0 | 39.5 | 34.0 | 60.0 | 69.0 | 50.0 | 60.0 | 29.5 | 44.0 | 48.5 | 53.0 | 90.5 | 71.0 | 44.0 | 85.5 | 75.5 | 36.0 | 89.5 | 45.0 | 66.5 |
| VideoChat2_HD_mistral | **62.3** | 79.5 | **60.0** | **87.5** | 50.0 | 68.5 | **93.5** | 71.5 | 36.5 | 45.0 | 49.5 | **87.0** | 40.0 | **76.0** | **92.0** | 53.0 | 62.0 | **45.5** | 36.0 | 44.0 | 69.5 |
| PLLaVA-34B | 58.1 | 82.0 | 40.5 | 49.5 | 53.0 | 67.5 | 66.5 | 59.0 | **39.5** | **63.5** | 47.0 | 50.0 | 70.0 | 43.0 | 37.5 | 68.5 | 67.5 | 36.5 | 91.0 | 51.5 | **79.0** |
| CogVLM2-Video | **62.3** | **85.5** | 41.5 | 31.5 | **65.5** | **79.5** | 58.5 | **77.0** | 28.5 | 42.5 | **54.0** | 57.0 | **91.5** | 73.0 | 48.0 | **91.0** | **78.0** | 36.0 | **91.5** | **47.0** | 68.5 |
## Evaluation details
We follow previous work when evaluating the performance of our model, crafting a task-specific
prompt for each benchmark:
``` python
# For MVBench
prompt = f"Carefully watch the video and pay attention to the cause and sequence of events, the detail and movement of objects, and the action and pose of persons. Based on your observations, select the best option that accurately addresses the question.\n " + f"{prompt.replace('Short Answer.', '')}\n" + "Short Answer:"
# For VideoChatGPT-Bench
prompt = f"Carefully watch the video and pay attention to the cause and sequence of events, the detail and movement of objects, and the action and pose of persons. Based on your observations, comprehensively answer the following question. Your answer should be long and cover all the related aspects\n " + f"{prompt.replace('Short Answer.', '')}\n" + "Answer:"
# For Zero-shot VideoQA
prompt = f"The input consists of a sequence of key frames from a video. Answer the question comprehensively including all the possible verbs and nouns that can discribe the events, followed by significant events, characters, or objects that appear throughout the frames.\n " + f"{prompt.replace('Short Answer.', '')}\n" + "Answer:"
```
For the evaluation code, please refer to
the [evaluation script](https://github.com/magic-research/PLLaVA/blob/main/README.md) in PLLaVA.
## Using This Model
This repository hosts the `chat` version of the model, which supports single-round chat.
You can quickly install the Python package dependencies and run model inference by following
our [GitHub repository](https://github.com/THUDM/CogVLM2/tree/main/video_demo).
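As a rough loading sketch only (the full video pipeline, including frame extraction and the chat template, lives in the GitHub demo above, and its exact inference API may differ):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code pulls in the custom CogVLM2 modeling code shipped with
# this repository; this only loads the model, it does not run the video demo.
model_id = "THUDM/cogvlm2-video-llama3-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
```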
## License
This model is released under the
CogVLM2 [LICENSE](./LICENSE).
For models built with Meta Llama 3, please also adhere to
the [LLAMA3_LICENSE](./LLAMA3_LICENSE).
## Training details
Please refer to our technical report for the training recipe and hyperparameters.
| [
"QUESTION_ANSWERING"
] | [
"CRAFT"
] |
EleutherAI/pythia-1b-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T18:27:56 | 2023-07-10T01:35:25 | 577 | 6 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-1B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-1B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-1B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-1B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
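A quick sanity check of that statement, as a minimal sketch comparing the two tokenizers on a sample string:
```python
from transformers import AutoTokenizer

pythia_tok = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m-deduped")
neox_tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

text = "Hello, I am"
# Same tokenizer => the same token ids for the same input.
assert pythia_tok(text)["input_ids"] == neox_tok(text)["input_ids"]
print(pythia_tok.tokenize(text))
```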
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-160m-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-18T02:59:41 | 2023-07-10T01:30:40 | 573 | 6 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-160M-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
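To see exactly which checkpoint branches are hosted for this repository, they can be listed with `huggingface_hub` (a small sketch; branch names follow the `step<N>` pattern used in the Quickstart below):
```python
from huggingface_hub import list_repo_refs

# Lists all branches of the model repo, one per saved checkpoint.
refs = list_repo_refs("EleutherAI/pythia-160m-deduped-v0")
print(sorted(branch.name for branch in refs.branches))
```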
You may also further fine-tune and adapt Pythia-160M-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-160M-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-160M-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-160M-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-160M-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-160M-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-160M-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-160M-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
jinaai/jina-embedding-l-en-v1 | jinaai | sentence-similarity | [
"sentence-transformers",
"pytorch",
"t5",
"finetuner",
"mteb",
"feature-extraction",
"sentence-similarity",
"custom_code",
"en",
"dataset:jinaai/negation-dataset",
"arxiv:2307.11224",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-07-09T08:54:06 | 2025-01-06T16:30:42 | 561 | 24 | ---
datasets:
- jinaai/negation-dataset
language: en
license: apache-2.0
pipeline_tag: sentence-similarity
tags:
- finetuner
- mteb
- sentence-transformers
- feature-extraction
- sentence-similarity
model-index:
- name: jina-triplets-large
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 68.92537313432835
- type: ap
value: 29.723758877632513
- type: f1
value: 61.909704211663794
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 69.13669999999999
- type: ap
value: 65.30216072238086
- type: f1
value: 67.1890891071034
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 31.384
- type: f1
value: 30.016752348953723
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: arguana
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.613
- type: map_at_10
value: 37.897
- type: map_at_100
value: 39.093
- type: map_at_1000
value: 39.109
- type: map_at_3
value: 32.824
- type: map_at_5
value: 35.679
- type: mrr_at_1
value: 23.826
- type: mrr_at_10
value: 37.997
- type: mrr_at_100
value: 39.186
- type: mrr_at_1000
value: 39.202
- type: mrr_at_3
value: 32.918
- type: mrr_at_5
value: 35.748999999999995
- type: ndcg_at_1
value: 23.613
- type: ndcg_at_10
value: 46.482
- type: ndcg_at_100
value: 51.55499999999999
- type: ndcg_at_1000
value: 51.974
- type: ndcg_at_3
value: 35.964
- type: ndcg_at_5
value: 41.144999999999996
- type: precision_at_1
value: 23.613
- type: precision_at_10
value: 7.417999999999999
- type: precision_at_100
value: 0.963
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 15.031
- type: precision_at_5
value: 11.55
- type: recall_at_1
value: 23.613
- type: recall_at_10
value: 74.182
- type: recall_at_100
value: 96.30199999999999
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 45.092
- type: recall_at_5
value: 57.752
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 40.51285742156528
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 31.5825964077496
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.830281630546835
- type: mrr
value: 75.93072593765115
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 87.26764516732737
- type: cos_sim_spearman
value: 84.42541766631741
- type: euclidean_pearson
value: 48.71357447655235
- type: euclidean_spearman
value: 49.2023259276511
- type: manhattan_pearson
value: 48.36366272727299
- type: manhattan_spearman
value: 48.457128224924354
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.3409090909091
- type: f1
value: 85.25262617676835
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 33.560193912974974
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 28.4426572644577
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: BeIR/cqadupstack
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.822999999999997
- type: map_at_10
value: 39.088
- type: map_at_100
value: 40.561
- type: map_at_1000
value: 40.69
- type: map_at_3
value: 35.701
- type: map_at_5
value: 37.556
- type: mrr_at_1
value: 33.906
- type: mrr_at_10
value: 44.527
- type: mrr_at_100
value: 45.403999999999996
- type: mrr_at_1000
value: 45.452
- type: mrr_at_3
value: 41.726
- type: mrr_at_5
value: 43.314
- type: ndcg_at_1
value: 33.906
- type: ndcg_at_10
value: 45.591
- type: ndcg_at_100
value: 51.041000000000004
- type: ndcg_at_1000
value: 53.1
- type: ndcg_at_3
value: 40.324
- type: ndcg_at_5
value: 42.723
- type: precision_at_1
value: 33.906
- type: precision_at_10
value: 8.655
- type: precision_at_100
value: 1.418
- type: precision_at_1000
value: 0.19499999999999998
- type: precision_at_3
value: 19.123
- type: precision_at_5
value: 13.963000000000001
- type: recall_at_1
value: 27.822999999999997
- type: recall_at_10
value: 58.63699999999999
- type: recall_at_100
value: 80.874
- type: recall_at_1000
value: 93.82000000000001
- type: recall_at_3
value: 44.116
- type: recall_at_5
value: 50.178999999999995
- type: map_at_1
value: 26.823999999999998
- type: map_at_10
value: 37.006
- type: map_at_100
value: 38.256
- type: map_at_1000
value: 38.397999999999996
- type: map_at_3
value: 34.011
- type: map_at_5
value: 35.643
- type: mrr_at_1
value: 34.268
- type: mrr_at_10
value: 43.374
- type: mrr_at_100
value: 44.096000000000004
- type: mrr_at_1000
value: 44.144
- type: mrr_at_3
value: 41.008
- type: mrr_at_5
value: 42.359
- type: ndcg_at_1
value: 34.268
- type: ndcg_at_10
value: 43.02
- type: ndcg_at_100
value: 47.747
- type: ndcg_at_1000
value: 50.019999999999996
- type: ndcg_at_3
value: 38.687
- type: ndcg_at_5
value: 40.647
- type: precision_at_1
value: 34.268
- type: precision_at_10
value: 8.261000000000001
- type: precision_at_100
value: 1.376
- type: precision_at_1000
value: 0.189
- type: precision_at_3
value: 19.108
- type: precision_at_5
value: 13.489999999999998
- type: recall_at_1
value: 26.823999999999998
- type: recall_at_10
value: 53.84100000000001
- type: recall_at_100
value: 73.992
- type: recall_at_1000
value: 88.524
- type: recall_at_3
value: 40.711000000000006
- type: recall_at_5
value: 46.477000000000004
- type: map_at_1
value: 34.307
- type: map_at_10
value: 45.144
- type: map_at_100
value: 46.351
- type: map_at_1000
value: 46.414
- type: map_at_3
value: 42.315000000000005
- type: map_at_5
value: 43.991
- type: mrr_at_1
value: 39.06
- type: mrr_at_10
value: 48.612
- type: mrr_at_100
value: 49.425000000000004
- type: mrr_at_1000
value: 49.458999999999996
- type: mrr_at_3
value: 46.144
- type: mrr_at_5
value: 47.654999999999994
- type: ndcg_at_1
value: 39.06
- type: ndcg_at_10
value: 50.647
- type: ndcg_at_100
value: 55.620000000000005
- type: ndcg_at_1000
value: 56.976000000000006
- type: ndcg_at_3
value: 45.705
- type: ndcg_at_5
value: 48.269
- type: precision_at_1
value: 39.06
- type: precision_at_10
value: 8.082
- type: precision_at_100
value: 1.161
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 20.376
- type: precision_at_5
value: 14.069
- type: recall_at_1
value: 34.307
- type: recall_at_10
value: 63.497
- type: recall_at_100
value: 85.038
- type: recall_at_1000
value: 94.782
- type: recall_at_3
value: 50.209
- type: recall_at_5
value: 56.525000000000006
- type: map_at_1
value: 26.448
- type: map_at_10
value: 34.86
- type: map_at_100
value: 36.004999999999995
- type: map_at_1000
value: 36.081
- type: map_at_3
value: 32.527
- type: map_at_5
value: 33.955
- type: mrr_at_1
value: 28.701
- type: mrr_at_10
value: 36.909
- type: mrr_at_100
value: 37.89
- type: mrr_at_1000
value: 37.945
- type: mrr_at_3
value: 34.576
- type: mrr_at_5
value: 35.966
- type: ndcg_at_1
value: 28.701
- type: ndcg_at_10
value: 39.507999999999996
- type: ndcg_at_100
value: 45.056000000000004
- type: ndcg_at_1000
value: 47.034
- type: ndcg_at_3
value: 34.985
- type: ndcg_at_5
value: 37.384
- type: precision_at_1
value: 28.701
- type: precision_at_10
value: 5.921
- type: precision_at_100
value: 0.914
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 14.689
- type: precision_at_5
value: 10.237
- type: recall_at_1
value: 26.448
- type: recall_at_10
value: 51.781
- type: recall_at_100
value: 77.142
- type: recall_at_1000
value: 92.10000000000001
- type: recall_at_3
value: 39.698
- type: recall_at_5
value: 45.469
- type: map_at_1
value: 14.174000000000001
- type: map_at_10
value: 22.019
- type: map_at_100
value: 23.18
- type: map_at_1000
value: 23.304
- type: map_at_3
value: 19.332
- type: map_at_5
value: 20.816000000000003
- type: mrr_at_1
value: 17.785999999999998
- type: mrr_at_10
value: 26.233
- type: mrr_at_100
value: 27.254
- type: mrr_at_1000
value: 27.328000000000003
- type: mrr_at_3
value: 23.653
- type: mrr_at_5
value: 25.095
- type: ndcg_at_1
value: 17.785999999999998
- type: ndcg_at_10
value: 27.236
- type: ndcg_at_100
value: 32.932
- type: ndcg_at_1000
value: 36.134
- type: ndcg_at_3
value: 22.33
- type: ndcg_at_5
value: 24.573999999999998
- type: precision_at_1
value: 17.785999999999998
- type: precision_at_10
value: 5.286
- type: precision_at_100
value: 0.9369999999999999
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 11.07
- type: precision_at_5
value: 8.308
- type: recall_at_1
value: 14.174000000000001
- type: recall_at_10
value: 39.135
- type: recall_at_100
value: 64.095
- type: recall_at_1000
value: 87.485
- type: recall_at_3
value: 25.496999999999996
- type: recall_at_5
value: 31.148999999999997
- type: map_at_1
value: 24.371000000000002
- type: map_at_10
value: 33.074999999999996
- type: map_at_100
value: 34.486
- type: map_at_1000
value: 34.608
- type: map_at_3
value: 30.483
- type: map_at_5
value: 31.972
- type: mrr_at_1
value: 29.548000000000002
- type: mrr_at_10
value: 38.431
- type: mrr_at_100
value: 39.347
- type: mrr_at_1000
value: 39.4
- type: mrr_at_3
value: 35.980000000000004
- type: mrr_at_5
value: 37.413999999999994
- type: ndcg_at_1
value: 29.548000000000002
- type: ndcg_at_10
value: 38.552
- type: ndcg_at_100
value: 44.598
- type: ndcg_at_1000
value: 47.0
- type: ndcg_at_3
value: 34.109
- type: ndcg_at_5
value: 36.263
- type: precision_at_1
value: 29.548000000000002
- type: precision_at_10
value: 6.92
- type: precision_at_100
value: 1.179
- type: precision_at_1000
value: 0.159
- type: precision_at_3
value: 16.137
- type: precision_at_5
value: 11.511000000000001
- type: recall_at_1
value: 24.371000000000002
- type: recall_at_10
value: 49.586999999999996
- type: recall_at_100
value: 75.15899999999999
- type: recall_at_1000
value: 91.06
- type: recall_at_3
value: 37.09
- type: recall_at_5
value: 42.588
- type: map_at_1
value: 24.517
- type: map_at_10
value: 32.969
- type: map_at_100
value: 34.199
- type: map_at_1000
value: 34.322
- type: map_at_3
value: 30.270999999999997
- type: map_at_5
value: 31.863000000000003
- type: mrr_at_1
value: 30.479
- type: mrr_at_10
value: 38.633
- type: mrr_at_100
value: 39.522
- type: mrr_at_1000
value: 39.583
- type: mrr_at_3
value: 36.454
- type: mrr_at_5
value: 37.744
- type: ndcg_at_1
value: 30.479
- type: ndcg_at_10
value: 38.269
- type: ndcg_at_100
value: 43.91
- type: ndcg_at_1000
value: 46.564
- type: ndcg_at_3
value: 34.03
- type: ndcg_at_5
value: 36.155
- type: precision_at_1
value: 30.479
- type: precision_at_10
value: 6.815
- type: precision_at_100
value: 1.138
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 16.058
- type: precision_at_5
value: 11.416
- type: recall_at_1
value: 24.517
- type: recall_at_10
value: 48.559000000000005
- type: recall_at_100
value: 73.307
- type: recall_at_1000
value: 91.508
- type: recall_at_3
value: 36.563
- type: recall_at_5
value: 42.375
- type: map_at_1
value: 24.336166666666664
- type: map_at_10
value: 32.80791666666667
- type: map_at_100
value: 34.043416666666666
- type: map_at_1000
value: 34.162749999999996
- type: map_at_3
value: 30.187083333333337
- type: map_at_5
value: 31.637833333333337
- type: mrr_at_1
value: 28.669583333333343
- type: mrr_at_10
value: 36.88616666666667
- type: mrr_at_100
value: 37.80233333333333
- type: mrr_at_1000
value: 37.86141666666666
- type: mrr_at_3
value: 34.537416666666665
- type: mrr_at_5
value: 35.84275
- type: ndcg_at_1
value: 28.669583333333343
- type: ndcg_at_10
value: 37.956916666666665
- type: ndcg_at_100
value: 43.39475
- type: ndcg_at_1000
value: 45.79925
- type: ndcg_at_3
value: 33.43683333333334
- type: ndcg_at_5
value: 35.52575
- type: precision_at_1
value: 28.669583333333343
- type: precision_at_10
value: 6.603833333333335
- type: precision_at_100
value: 1.1079166666666667
- type: precision_at_1000
value: 0.15208333333333335
- type: precision_at_3
value: 15.338750000000001
- type: precision_at_5
value: 10.88775
- type: recall_at_1
value: 24.336166666666664
- type: recall_at_10
value: 49.19358333333333
- type: recall_at_100
value: 73.07583333333334
- type: recall_at_1000
value: 89.81675
- type: recall_at_3
value: 36.54091666666667
- type: recall_at_5
value: 41.919250000000005
- type: map_at_1
value: 23.388
- type: map_at_10
value: 29.408
- type: map_at_100
value: 30.452
- type: map_at_1000
value: 30.546
- type: map_at_3
value: 27.139000000000003
- type: map_at_5
value: 28.402
- type: mrr_at_1
value: 25.46
- type: mrr_at_10
value: 31.966
- type: mrr_at_100
value: 32.879999999999995
- type: mrr_at_1000
value: 32.944
- type: mrr_at_3
value: 29.755
- type: mrr_at_5
value: 30.974
- type: ndcg_at_1
value: 25.46
- type: ndcg_at_10
value: 33.449
- type: ndcg_at_100
value: 38.67
- type: ndcg_at_1000
value: 41.035
- type: ndcg_at_3
value: 29.048000000000002
- type: ndcg_at_5
value: 31.127
- type: precision_at_1
value: 25.46
- type: precision_at_10
value: 5.199
- type: precision_at_100
value: 0.8670000000000001
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 12.168
- type: precision_at_5
value: 8.62
- type: recall_at_1
value: 23.388
- type: recall_at_10
value: 43.428
- type: recall_at_100
value: 67.245
- type: recall_at_1000
value: 84.75399999999999
- type: recall_at_3
value: 31.416
- type: recall_at_5
value: 36.451
- type: map_at_1
value: 17.136000000000003
- type: map_at_10
value: 24.102999999999998
- type: map_at_100
value: 25.219
- type: map_at_1000
value: 25.344
- type: map_at_3
value: 22.004
- type: map_at_5
value: 23.145
- type: mrr_at_1
value: 20.613
- type: mrr_at_10
value: 27.753
- type: mrr_at_100
value: 28.698
- type: mrr_at_1000
value: 28.776000000000003
- type: mrr_at_3
value: 25.711000000000002
- type: mrr_at_5
value: 26.795
- type: ndcg_at_1
value: 20.613
- type: ndcg_at_10
value: 28.510999999999996
- type: ndcg_at_100
value: 33.924
- type: ndcg_at_1000
value: 36.849
- type: ndcg_at_3
value: 24.664
- type: ndcg_at_5
value: 26.365
- type: precision_at_1
value: 20.613
- type: precision_at_10
value: 5.069
- type: precision_at_100
value: 0.918
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 11.574
- type: precision_at_5
value: 8.211
- type: recall_at_1
value: 17.136000000000003
- type: recall_at_10
value: 38.232
- type: recall_at_100
value: 62.571
- type: recall_at_1000
value: 83.23
- type: recall_at_3
value: 27.468999999999998
- type: recall_at_5
value: 31.852999999999998
- type: map_at_1
value: 25.580000000000002
- type: map_at_10
value: 33.449
- type: map_at_100
value: 34.58
- type: map_at_1000
value: 34.692
- type: map_at_3
value: 30.660999999999998
- type: map_at_5
value: 32.425
- type: mrr_at_1
value: 30.037000000000003
- type: mrr_at_10
value: 37.443
- type: mrr_at_100
value: 38.32
- type: mrr_at_1000
value: 38.384
- type: mrr_at_3
value: 34.778999999999996
- type: mrr_at_5
value: 36.458
- type: ndcg_at_1
value: 30.037000000000003
- type: ndcg_at_10
value: 38.46
- type: ndcg_at_100
value: 43.746
- type: ndcg_at_1000
value: 46.28
- type: ndcg_at_3
value: 33.52
- type: ndcg_at_5
value: 36.175000000000004
- type: precision_at_1
value: 30.037000000000003
- type: precision_at_10
value: 6.418
- type: precision_at_100
value: 1.0210000000000001
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 15.018999999999998
- type: precision_at_5
value: 10.877
- type: recall_at_1
value: 25.580000000000002
- type: recall_at_10
value: 49.830000000000005
- type: recall_at_100
value: 73.04899999999999
- type: recall_at_1000
value: 90.751
- type: recall_at_3
value: 36.370999999999995
- type: recall_at_5
value: 43.104
- type: map_at_1
value: 24.071
- type: map_at_10
value: 33.384
- type: map_at_100
value: 35.004999999999995
- type: map_at_1000
value: 35.215999999999994
- type: map_at_3
value: 30.459000000000003
- type: map_at_5
value: 31.769
- type: mrr_at_1
value: 28.854000000000003
- type: mrr_at_10
value: 37.512
- type: mrr_at_100
value: 38.567
- type: mrr_at_1000
value: 38.618
- type: mrr_at_3
value: 35.211
- type: mrr_at_5
value: 36.13
- type: ndcg_at_1
value: 28.854000000000003
- type: ndcg_at_10
value: 39.216
- type: ndcg_at_100
value: 45.214
- type: ndcg_at_1000
value: 47.573
- type: ndcg_at_3
value: 34.597
- type: ndcg_at_5
value: 36.063
- type: precision_at_1
value: 28.854000000000003
- type: precision_at_10
value: 7.648000000000001
- type: precision_at_100
value: 1.545
- type: precision_at_1000
value: 0.241
- type: precision_at_3
value: 16.667
- type: precision_at_5
value: 11.818
- type: recall_at_1
value: 24.071
- type: recall_at_10
value: 50.802
- type: recall_at_100
value: 77.453
- type: recall_at_1000
value: 92.304
- type: recall_at_3
value: 36.846000000000004
- type: recall_at_5
value: 41.14
- type: map_at_1
value: 23.395
- type: map_at_10
value: 29.189999999999998
- type: map_at_100
value: 30.226999999999997
- type: map_at_1000
value: 30.337999999999997
- type: map_at_3
value: 27.342
- type: map_at_5
value: 28.116999999999997
- type: mrr_at_1
value: 25.323
- type: mrr_at_10
value: 31.241000000000003
- type: mrr_at_100
value: 32.225
- type: mrr_at_1000
value: 32.304
- type: mrr_at_3
value: 29.452
- type: mrr_at_5
value: 30.209000000000003
- type: ndcg_at_1
value: 25.323
- type: ndcg_at_10
value: 33.024
- type: ndcg_at_100
value: 38.279
- type: ndcg_at_1000
value: 41.026
- type: ndcg_at_3
value: 29.243000000000002
- type: ndcg_at_5
value: 30.564000000000004
- type: precision_at_1
value: 25.323
- type: precision_at_10
value: 4.972
- type: precision_at_100
value: 0.8210000000000001
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 12.076
- type: precision_at_5
value: 8.133
- type: recall_at_1
value: 23.395
- type: recall_at_10
value: 42.994
- type: recall_at_100
value: 66.985
- type: recall_at_1000
value: 87.483
- type: recall_at_3
value: 32.505
- type: recall_at_5
value: 35.721000000000004
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: climate-fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.322000000000001
- type: map_at_10
value: 14.491000000000001
- type: map_at_100
value: 16.066
- type: map_at_1000
value: 16.238
- type: map_at_3
value: 12.235
- type: map_at_5
value: 13.422999999999998
- type: mrr_at_1
value: 19.479
- type: mrr_at_10
value: 29.38
- type: mrr_at_100
value: 30.520999999999997
- type: mrr_at_1000
value: 30.570999999999998
- type: mrr_at_3
value: 26.395000000000003
- type: mrr_at_5
value: 27.982000000000003
- type: ndcg_at_1
value: 19.479
- type: ndcg_at_10
value: 21.215
- type: ndcg_at_100
value: 27.966
- type: ndcg_at_1000
value: 31.324
- type: ndcg_at_3
value: 17.194000000000003
- type: ndcg_at_5
value: 18.593
- type: precision_at_1
value: 19.479
- type: precision_at_10
value: 6.5280000000000005
- type: precision_at_100
value: 1.359
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 12.703999999999999
- type: precision_at_5
value: 9.655
- type: recall_at_1
value: 8.322000000000001
- type: recall_at_10
value: 26.165
- type: recall_at_100
value: 49.573
- type: recall_at_1000
value: 68.501
- type: recall_at_3
value: 16.179
- type: recall_at_5
value: 20.175
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: dbpedia-entity
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.003
- type: map_at_10
value: 16.087
- type: map_at_100
value: 21.363
- type: map_at_1000
value: 22.64
- type: map_at_3
value: 12.171999999999999
- type: map_at_5
value: 13.866
- type: mrr_at_1
value: 61.25000000000001
- type: mrr_at_10
value: 68.626
- type: mrr_at_100
value: 69.134
- type: mrr_at_1000
value: 69.144
- type: mrr_at_3
value: 67.042
- type: mrr_at_5
value: 67.929
- type: ndcg_at_1
value: 49.0
- type: ndcg_at_10
value: 34.132
- type: ndcg_at_100
value: 37.545
- type: ndcg_at_1000
value: 44.544
- type: ndcg_at_3
value: 38.946999999999996
- type: ndcg_at_5
value: 36.317
- type: precision_at_1
value: 61.25000000000001
- type: precision_at_10
value: 26.325
- type: precision_at_100
value: 8.173
- type: precision_at_1000
value: 1.778
- type: precision_at_3
value: 41.667
- type: precision_at_5
value: 34.300000000000004
- type: recall_at_1
value: 8.003
- type: recall_at_10
value: 20.577
- type: recall_at_100
value: 41.884
- type: recall_at_1000
value: 64.36500000000001
- type: recall_at_3
value: 13.602
- type: recall_at_5
value: 16.41
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 45.835
- type: f1
value: 41.66455981281837
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: fever
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 55.717000000000006
- type: map_at_10
value: 66.34100000000001
- type: map_at_100
value: 66.776
- type: map_at_1000
value: 66.794
- type: map_at_3
value: 64.386
- type: map_at_5
value: 65.566
- type: mrr_at_1
value: 60.141
- type: mrr_at_10
value: 70.928
- type: mrr_at_100
value: 71.29299999999999
- type: mrr_at_1000
value: 71.30199999999999
- type: mrr_at_3
value: 69.07900000000001
- type: mrr_at_5
value: 70.244
- type: ndcg_at_1
value: 60.141
- type: ndcg_at_10
value: 71.90100000000001
- type: ndcg_at_100
value: 73.836
- type: ndcg_at_1000
value: 74.214
- type: ndcg_at_3
value: 68.203
- type: ndcg_at_5
value: 70.167
- type: precision_at_1
value: 60.141
- type: precision_at_10
value: 9.268
- type: precision_at_100
value: 1.03
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 27.028000000000002
- type: precision_at_5
value: 17.342
- type: recall_at_1
value: 55.717000000000006
- type: recall_at_10
value: 84.66799999999999
- type: recall_at_100
value: 93.28
- type: recall_at_1000
value: 95.887
- type: recall_at_3
value: 74.541
- type: recall_at_5
value: 79.389
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: fiqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.744
- type: map_at_10
value: 29.554000000000002
- type: map_at_100
value: 31.180000000000003
- type: map_at_1000
value: 31.372
- type: map_at_3
value: 25.6
- type: map_at_5
value: 27.642
- type: mrr_at_1
value: 35.802
- type: mrr_at_10
value: 44.812999999999995
- type: mrr_at_100
value: 45.56
- type: mrr_at_1000
value: 45.606
- type: mrr_at_3
value: 42.181000000000004
- type: mrr_at_5
value: 43.516
- type: ndcg_at_1
value: 35.802
- type: ndcg_at_10
value: 37.269999999999996
- type: ndcg_at_100
value: 43.575
- type: ndcg_at_1000
value: 46.916000000000004
- type: ndcg_at_3
value: 33.511
- type: ndcg_at_5
value: 34.504000000000005
- type: precision_at_1
value: 35.802
- type: precision_at_10
value: 10.448
- type: precision_at_100
value: 1.7129999999999999
- type: precision_at_1000
value: 0.231
- type: precision_at_3
value: 22.531000000000002
- type: precision_at_5
value: 16.512
- type: recall_at_1
value: 17.744
- type: recall_at_10
value: 44.616
- type: recall_at_100
value: 68.51899999999999
- type: recall_at_1000
value: 88.495
- type: recall_at_3
value: 30.235
- type: recall_at_5
value: 35.821999999999996
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: hotpotqa
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.315
- type: map_at_10
value: 45.932
- type: map_at_100
value: 46.708
- type: map_at_1000
value: 46.778999999999996
- type: map_at_3
value: 43.472
- type: map_at_5
value: 45.022
- type: mrr_at_1
value: 66.631
- type: mrr_at_10
value: 73.083
- type: mrr_at_100
value: 73.405
- type: mrr_at_1000
value: 73.421
- type: mrr_at_3
value: 71.756
- type: mrr_at_5
value: 72.616
- type: ndcg_at_1
value: 66.631
- type: ndcg_at_10
value: 54.949000000000005
- type: ndcg_at_100
value: 57.965
- type: ndcg_at_1000
value: 59.467000000000006
- type: ndcg_at_3
value: 51.086
- type: ndcg_at_5
value: 53.272
- type: precision_at_1
value: 66.631
- type: precision_at_10
value: 11.178
- type: precision_at_100
value: 1.3559999999999999
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 31.582
- type: precision_at_5
value: 20.678
- type: recall_at_1
value: 33.315
- type: recall_at_10
value: 55.888000000000005
- type: recall_at_100
value: 67.812
- type: recall_at_1000
value: 77.839
- type: recall_at_3
value: 47.373
- type: recall_at_5
value: 51.695
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 66.424
- type: ap
value: 61.132235499939256
- type: f1
value: 66.07094958225315
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: msmarco
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.575
- type: map_at_10
value: 33.509
- type: map_at_100
value: 34.725
- type: map_at_1000
value: 34.775
- type: map_at_3
value: 29.673
- type: map_at_5
value: 31.805
- type: mrr_at_1
value: 22.235
- type: mrr_at_10
value: 34.1
- type: mrr_at_100
value: 35.254999999999995
- type: mrr_at_1000
value: 35.299
- type: mrr_at_3
value: 30.334
- type: mrr_at_5
value: 32.419
- type: ndcg_at_1
value: 22.235
- type: ndcg_at_10
value: 40.341
- type: ndcg_at_100
value: 46.161
- type: ndcg_at_1000
value: 47.400999999999996
- type: ndcg_at_3
value: 32.482
- type: ndcg_at_5
value: 36.269
- type: precision_at_1
value: 22.235
- type: precision_at_10
value: 6.422999999999999
- type: precision_at_100
value: 0.9329999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 13.835
- type: precision_at_5
value: 10.226
- type: recall_at_1
value: 21.575
- type: recall_at_10
value: 61.448
- type: recall_at_100
value: 88.289
- type: recall_at_1000
value: 97.76899999999999
- type: recall_at_3
value: 39.971000000000004
- type: recall_at_5
value: 49.053000000000004
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 92.83401732786137
- type: f1
value: 92.47678691291068
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.08983128134975
- type: f1
value: 59.782936393820904
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 72.73032952252858
- type: f1
value: 70.72684765888265
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.08473436449226
- type: f1
value: 77.31457411257054
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 30.11980959210532
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 25.2587629106119
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.48268319779204
- type: mrr
value: 32.501885728964304
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: nfcorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.284
- type: map_at_10
value: 11.509
- type: map_at_100
value: 14.624
- type: map_at_1000
value: 16.035
- type: map_at_3
value: 8.347999999999999
- type: map_at_5
value: 9.919
- type: mrr_at_1
value: 43.344
- type: mrr_at_10
value: 52.303999999999995
- type: mrr_at_100
value: 52.994
- type: mrr_at_1000
value: 53.032999999999994
- type: mrr_at_3
value: 50.361
- type: mrr_at_5
value: 51.754
- type: ndcg_at_1
value: 41.176
- type: ndcg_at_10
value: 32.244
- type: ndcg_at_100
value: 29.916999999999998
- type: ndcg_at_1000
value: 38.753
- type: ndcg_at_3
value: 36.856
- type: ndcg_at_5
value: 35.394999999999996
- type: precision_at_1
value: 43.034
- type: precision_at_10
value: 24.118000000000002
- type: precision_at_100
value: 7.926
- type: precision_at_1000
value: 2.045
- type: precision_at_3
value: 34.675
- type: precision_at_5
value: 31.146
- type: recall_at_1
value: 5.284
- type: recall_at_10
value: 15.457
- type: recall_at_100
value: 30.914
- type: recall_at_1000
value: 63.788999999999994
- type: recall_at_3
value: 9.596
- type: recall_at_5
value: 12.391
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: nq
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.537999999999997
- type: map_at_10
value: 43.99
- type: map_at_100
value: 45.003
- type: map_at_1000
value: 45.04
- type: map_at_3
value: 39.814
- type: map_at_5
value: 42.166
- type: mrr_at_1
value: 33.256
- type: mrr_at_10
value: 46.487
- type: mrr_at_100
value: 47.264
- type: mrr_at_1000
value: 47.29
- type: mrr_at_3
value: 43.091
- type: mrr_at_5
value: 45.013999999999996
- type: ndcg_at_1
value: 33.256
- type: ndcg_at_10
value: 51.403
- type: ndcg_at_100
value: 55.706999999999994
- type: ndcg_at_1000
value: 56.586000000000006
- type: ndcg_at_3
value: 43.559
- type: ndcg_at_5
value: 47.426
- type: precision_at_1
value: 33.256
- type: precision_at_10
value: 8.540000000000001
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11800000000000001
- type: precision_at_3
value: 19.834
- type: precision_at_5
value: 14.143
- type: recall_at_1
value: 29.537999999999997
- type: recall_at_10
value: 71.5
- type: recall_at_100
value: 90.25
- type: recall_at_1000
value: 96.82600000000001
- type: recall_at_3
value: 51.108
- type: recall_at_5
value: 60.006
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: quora
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.526
- type: map_at_10
value: 84.342
- type: map_at_100
value: 84.985
- type: map_at_1000
value: 85.003
- type: map_at_3
value: 81.472
- type: map_at_5
value: 83.292
- type: mrr_at_1
value: 81.17
- type: mrr_at_10
value: 87.33999999999999
- type: mrr_at_100
value: 87.445
- type: mrr_at_1000
value: 87.446
- type: mrr_at_3
value: 86.387
- type: mrr_at_5
value: 87.042
- type: ndcg_at_1
value: 81.19
- type: ndcg_at_10
value: 88.088
- type: ndcg_at_100
value: 89.35
- type: ndcg_at_1000
value: 89.462
- type: ndcg_at_3
value: 85.319
- type: ndcg_at_5
value: 86.858
- type: precision_at_1
value: 81.19
- type: precision_at_10
value: 13.33
- type: precision_at_100
value: 1.528
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.31
- type: precision_at_5
value: 24.512
- type: recall_at_1
value: 70.526
- type: recall_at_10
value: 95.166
- type: recall_at_100
value: 99.479
- type: recall_at_1000
value: 99.984
- type: recall_at_3
value: 87.124
- type: recall_at_5
value: 91.53
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 45.049073872893494
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 55.13810914528368
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: scidocs
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.593
- type: map_at_10
value: 10.907
- type: map_at_100
value: 12.888
- type: map_at_1000
value: 13.167000000000002
- type: map_at_3
value: 7.936
- type: map_at_5
value: 9.31
- type: mrr_at_1
value: 22.7
- type: mrr_at_10
value: 32.509
- type: mrr_at_100
value: 33.69
- type: mrr_at_1000
value: 33.747
- type: mrr_at_3
value: 29.599999999999998
- type: mrr_at_5
value: 31.155
- type: ndcg_at_1
value: 22.7
- type: ndcg_at_10
value: 18.445
- type: ndcg_at_100
value: 26.241999999999997
- type: ndcg_at_1000
value: 31.409
- type: ndcg_at_3
value: 17.864
- type: ndcg_at_5
value: 15.232999999999999
- type: precision_at_1
value: 22.7
- type: precision_at_10
value: 9.43
- type: precision_at_100
value: 2.061
- type: precision_at_1000
value: 0.331
- type: precision_at_3
value: 16.467000000000002
- type: precision_at_5
value: 13.08
- type: recall_at_1
value: 4.593
- type: recall_at_10
value: 19.115
- type: recall_at_100
value: 41.82
- type: recall_at_1000
value: 67.167
- type: recall_at_3
value: 9.983
- type: recall_at_5
value: 13.218
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 82.94432059816452
- type: cos_sim_spearman
value: 79.19993315048852
- type: euclidean_pearson
value: 72.43261099671753
- type: euclidean_spearman
value: 71.51531114998619
- type: manhattan_pearson
value: 71.83604124130447
- type: manhattan_spearman
value: 71.24460392842295
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.25401068481673
- type: cos_sim_spearman
value: 74.5249604699309
- type: euclidean_pearson
value: 71.1324859629043
- type: euclidean_spearman
value: 58.77041705276752
- type: manhattan_pearson
value: 71.01471521586141
- type: manhattan_spearman
value: 58.69949381017865
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 82.85731544223766
- type: cos_sim_spearman
value: 83.15607264736185
- type: euclidean_pearson
value: 75.8803249521361
- type: euclidean_spearman
value: 76.4862168799065
- type: manhattan_pearson
value: 75.80451454386811
- type: manhattan_spearman
value: 76.35986831074699
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.40669043798857
- type: cos_sim_spearman
value: 78.08686090667834
- type: euclidean_pearson
value: 74.48574712193803
- type: euclidean_spearman
value: 70.79423012045118
- type: manhattan_pearson
value: 74.39099211477354
- type: manhattan_spearman
value: 70.73135427277684
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.03027014209859
- type: cos_sim_spearman
value: 86.91082847840946
- type: euclidean_pearson
value: 69.13187603971996
- type: euclidean_spearman
value: 70.0370035340552
- type: manhattan_pearson
value: 69.2586635812031
- type: manhattan_spearman
value: 70.18638387118486
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.41190748361883
- type: cos_sim_spearman
value: 83.64850851235231
- type: euclidean_pearson
value: 71.60523243575282
- type: euclidean_spearman
value: 72.26134033805099
- type: manhattan_pearson
value: 71.50771482066683
- type: manhattan_spearman
value: 72.13707967973161
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 90.42838477648627
- type: cos_sim_spearman
value: 90.15798155439076
- type: euclidean_pearson
value: 77.09619972244516
- type: euclidean_spearman
value: 75.5953488548861
- type: manhattan_pearson
value: 77.36892406451771
- type: manhattan_spearman
value: 75.76625156149356
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 65.76151154879307
- type: cos_sim_spearman
value: 64.8846800918359
- type: euclidean_pearson
value: 50.23302700257155
- type: euclidean_spearman
value: 58.89455187289583
- type: manhattan_pearson
value: 50.05498582284945
- type: manhattan_spearman
value: 58.75893793871576
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.72381109169437
- type: cos_sim_spearman
value: 84.59820928231167
- type: euclidean_pearson
value: 74.85450857429493
- type: euclidean_spearman
value: 73.83634052565915
- type: manhattan_pearson
value: 74.97349743979106
- type: manhattan_spearman
value: 73.9636470375881
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 80.96736259172798
- type: mrr
value: 94.48378781712114
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: scifact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 46.344
- type: map_at_10
value: 54.962
- type: map_at_100
value: 55.772
- type: map_at_1000
value: 55.81700000000001
- type: map_at_3
value: 51.832
- type: map_at_5
value: 53.718999999999994
- type: mrr_at_1
value: 49.0
- type: mrr_at_10
value: 56.721
- type: mrr_at_100
value: 57.287
- type: mrr_at_1000
value: 57.330000000000005
- type: mrr_at_3
value: 54.056000000000004
- type: mrr_at_5
value: 55.822
- type: ndcg_at_1
value: 49.0
- type: ndcg_at_10
value: 59.757000000000005
- type: ndcg_at_100
value: 63.149
- type: ndcg_at_1000
value: 64.43100000000001
- type: ndcg_at_3
value: 54.105000000000004
- type: ndcg_at_5
value: 57.196999999999996
- type: precision_at_1
value: 49.0
- type: precision_at_10
value: 8.200000000000001
- type: precision_at_100
value: 1.0070000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 20.889
- type: precision_at_5
value: 14.399999999999999
- type: recall_at_1
value: 46.344
- type: recall_at_10
value: 72.722
- type: recall_at_100
value: 88.167
- type: recall_at_1000
value: 98.333
- type: recall_at_3
value: 57.994
- type: recall_at_5
value: 65.506
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83366336633664
- type: cos_sim_ap
value: 96.09329747251944
- type: cos_sim_f1
value: 91.66255550074001
- type: cos_sim_precision
value: 90.45764362220059
- type: cos_sim_recall
value: 92.9
- type: dot_accuracy
value: 99.32871287128712
- type: dot_ap
value: 63.95436644147969
- type: dot_f1
value: 60.61814556331008
- type: dot_precision
value: 60.437375745526836
- type: dot_recall
value: 60.8
- type: euclidean_accuracy
value: 99.66534653465347
- type: euclidean_ap
value: 85.85143979761818
- type: euclidean_f1
value: 81.57033805888769
- type: euclidean_precision
value: 89.68824940047962
- type: euclidean_recall
value: 74.8
- type: manhattan_accuracy
value: 99.65742574257426
- type: manhattan_ap
value: 85.55693926348405
- type: manhattan_f1
value: 81.13804004214963
- type: manhattan_precision
value: 85.74610244988864
- type: manhattan_recall
value: 77.0
- type: max_accuracy
value: 99.83366336633664
- type: max_ap
value: 96.09329747251944
- type: max_f1
value: 91.66255550074001
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 45.23573510003245
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.37478638401161
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 50.375920467392476
- type: mrr
value: 51.17302223919871
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.768864092288343
- type: cos_sim_spearman
value: 29.854278347043266
- type: dot_pearson
value: 20.51281723837505
- type: dot_spearman
value: 21.799102540913665
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: trec-covid
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.2
- type: map_at_10
value: 1.202
- type: map_at_100
value: 6.729
- type: map_at_1000
value: 15.928
- type: map_at_3
value: 0.492
- type: map_at_5
value: 0.712
- type: mrr_at_1
value: 76.0
- type: mrr_at_10
value: 84.75
- type: mrr_at_100
value: 84.75
- type: mrr_at_1000
value: 84.75
- type: mrr_at_3
value: 83.0
- type: mrr_at_5
value: 84.5
- type: ndcg_at_1
value: 71.0
- type: ndcg_at_10
value: 57.253
- type: ndcg_at_100
value: 44.383
- type: ndcg_at_1000
value: 38.666
- type: ndcg_at_3
value: 64.324
- type: ndcg_at_5
value: 60.791
- type: precision_at_1
value: 76.0
- type: precision_at_10
value: 59.599999999999994
- type: precision_at_100
value: 45.440000000000005
- type: precision_at_1000
value: 17.458000000000002
- type: precision_at_3
value: 69.333
- type: precision_at_5
value: 63.2
- type: recall_at_1
value: 0.2
- type: recall_at_10
value: 1.4949999999999999
- type: recall_at_100
value: 10.266
- type: recall_at_1000
value: 35.853
- type: recall_at_3
value: 0.5349999999999999
- type: recall_at_5
value: 0.8109999999999999
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: webis-touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.0140000000000002
- type: map_at_10
value: 8.474
- type: map_at_100
value: 14.058000000000002
- type: map_at_1000
value: 15.381
- type: map_at_3
value: 4.508
- type: map_at_5
value: 5.87
- type: mrr_at_1
value: 22.448999999999998
- type: mrr_at_10
value: 37.242
- type: mrr_at_100
value: 38.291
- type: mrr_at_1000
value: 38.311
- type: mrr_at_3
value: 32.312999999999995
- type: mrr_at_5
value: 34.762
- type: ndcg_at_1
value: 20.408
- type: ndcg_at_10
value: 20.729
- type: ndcg_at_100
value: 33.064
- type: ndcg_at_1000
value: 44.324999999999996
- type: ndcg_at_3
value: 21.251
- type: ndcg_at_5
value: 20.28
- type: precision_at_1
value: 22.448999999999998
- type: precision_at_10
value: 18.98
- type: precision_at_100
value: 7.224
- type: precision_at_1000
value: 1.471
- type: precision_at_3
value: 22.448999999999998
- type: precision_at_5
value: 20.816000000000003
- type: recall_at_1
value: 2.0140000000000002
- type: recall_at_10
value: 13.96
- type: recall_at_100
value: 44.187
- type: recall_at_1000
value: 79.328
- type: recall_at_3
value: 5.345
- type: recall_at_5
value: 7.979
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.1312
- type: ap
value: 12.606776505497608
- type: f1
value: 52.4112415600534
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 58.16072439162422
- type: f1
value: 58.29152785435414
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 40.421119289825924
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.48012159504083
- type: cos_sim_ap
value: 72.31974877212102
- type: cos_sim_f1
value: 67.96846573681019
- type: cos_sim_precision
value: 62.89562289562289
- type: cos_sim_recall
value: 73.93139841688654
- type: dot_accuracy
value: 78.52416999463551
- type: dot_ap
value: 43.65271285411479
- type: dot_f1
value: 46.94641449960599
- type: dot_precision
value: 37.456774599182644
- type: dot_recall
value: 62.875989445910285
- type: euclidean_accuracy
value: 83.90057817249806
- type: euclidean_ap
value: 65.96278727778665
- type: euclidean_f1
value: 63.35733232284957
- type: euclidean_precision
value: 60.770535497940394
- type: euclidean_recall
value: 66.17414248021109
- type: manhattan_accuracy
value: 83.96614412588663
- type: manhattan_ap
value: 66.03670273156699
- type: manhattan_f1
value: 63.49128406579917
- type: manhattan_precision
value: 59.366391184573
- type: manhattan_recall
value: 68.23218997361478
- type: max_accuracy
value: 85.48012159504083
- type: max_ap
value: 72.31974877212102
- type: max_f1
value: 67.96846573681019
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.97038848139093
- type: cos_sim_ap
value: 85.982764495556
- type: cos_sim_f1
value: 78.73283281450284
- type: cos_sim_precision
value: 75.07857791436754
- type: cos_sim_recall
value: 82.7610101632276
- type: dot_accuracy
value: 83.21108394458028
- type: dot_ap
value: 70.97956937273386
- type: dot_f1
value: 66.53083038279111
- type: dot_precision
value: 58.7551622418879
- type: dot_recall
value: 76.67847243609486
- type: euclidean_accuracy
value: 84.31520937633407
- type: euclidean_ap
value: 74.67323411319909
- type: euclidean_f1
value: 67.21935410935676
- type: euclidean_precision
value: 65.82773636430733
- type: euclidean_recall
value: 68.67108099784416
- type: manhattan_accuracy
value: 84.35013777312066
- type: manhattan_ap
value: 74.66508905354597
- type: manhattan_f1
value: 67.28264162375038
- type: manhattan_precision
value: 66.19970193740686
- type: manhattan_recall
value: 68.40160147828766
- type: max_accuracy
value: 88.97038848139093
- type: max_ap
value: 85.982764495556
- type: max_f1
value: 78.73283281450284
---
<br><br>
<p align="center">
<img src="https://huggingface.co/datasets/jinaai/documentation-images/resolve/main/logo.webp" alt="Jina AI: Your Search Foundation, Supercharged!" width="150px">
</p>
<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a></b>
</p>
## Intended Usage & Model Info
`jina-embedding-l-en-v1` is a language model that has been trained using Jina AI's Linnaeus-Clean dataset.
This dataset consists of 380 million sentence pairs, including query-document pairs.
These pairs were obtained from various domains and were carefully selected through a thorough cleaning process.
The Linnaeus-Full dataset, from which the Linnaeus-Clean dataset is derived, originally contained 1.6 billion sentence pairs.
The model has a range of use cases, including information retrieval, semantic textual similarity, text reranking, and more.
With a size of 330 million parameters,
the model supports single-GPU inference while delivering better performance than our small and base models.
Additionally, we provide the following options:
- [`jina-embedding-t-en-v1`](https://huggingface.co/jinaai/jina-embedding-t-en-v1): 14 million parameters.
- [`jina-embedding-s-en-v1`](https://huggingface.co/jinaai/jina-embedding-s-en-v1): 35 million parameters.
- [`jina-embedding-b-en-v1`](https://huggingface.co/jinaai/jina-embedding-b-en-v1): 110 million parameters.
- [`jina-embedding-l-en-v1`](https://huggingface.co/jinaai/jina-embedding-l-en-v1): 330 million parameters **(you are here)**.
- `jina-embedding-1b-en-v1`: 1.2 billion parameters, 10 times bert-base (soon).
- `jina-embedding-6b-en-v1`: 6 billion parameters, 30 times bert-base (soon).
## Data & Parameters
Please checkout our [technical blog](https://arxiv.org/abs/2307.11224).
## Metrics
We compared the model against `all-minilm-l6-v2`/`all-mpnet-base-v2` from sbert and `text-embedding-ada-002` from OpenAI:
|Name|Parameters|Dimension|
|------------------------------|-----|------|
|all-minilm-l6-v2|23m |384|
|all-mpnet-base-v2 |110m |768|
|ada-embedding-002|Unknown/OpenAI API |1536|
|jina-embedding-t-en-v1|14m |312|
|jina-embedding-s-en-v1|35m |512|
|jina-embedding-b-en-v1|110m |768|
|jina-embedding-l-en-v1|330m |1024|
|Name|STS12|STS13|STS14|STS15|STS16|STS17|TRECCOVID|Quora|SciFact|
|------------------------------|-----|-----|-----|-----|-----|-----|--------|-----|-----|
|all-minilm-l6-v2|0.724|0.806|0.756|0.854|0.79 |0.876|0.473 |0.876|0.645 |
|all-mpnet-base-v2|0.726|**0.835**|0.78 |0.857|0.8 |**0.906**|0.513 |0.875|0.656 |
|ada-embedding-002|0.698|0.833|0.761|0.861|**0.86** |0.903|**0.685** |0.876|**0.726** |
|jina-embedding-t-en-v1|0.717|0.773|0.731|0.829|0.777|0.860|0.482 |0.840|0.522 |
|jina-embedding-s-en-v1|0.743|0.786|0.738|0.837|0.80|0.875|0.523 |0.857|0.524 |
|jina-embedding-b-en-v1|**0.751**|0.809|0.761|0.856|0.812|0.890|0.606 |0.876|0.594 |
|jina-embedding-l-en-v1|0.745|0.832|**0.781**|**0.869**|0.837|0.902|0.573 |**0.881**|0.598 |
## Usage
Use with Jina AI Finetuner:
```python
# pip install finetuner
import finetuner
model = finetuner.build_model('jinaai/jina-embedding-l-en-v1')
embeddings = finetuner.encode(
model=model,
data=['how is the weather today', 'What is the current weather like today?']
)
print(finetuner.cos_sim(embeddings[0], embeddings[1]))
```
Use with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim
sentences = ['how is the weather today', 'What is the current weather like today?']
model = SentenceTransformer('jinaai/jina-embedding-l-en-v1')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```
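If you prefer to call the model through the `transformers` library directly, a minimal sketch is shown below. It assumes the checkpoint loads as a plain `T5EncoderModel` and that sentence embeddings are obtained by mean pooling the encoder's token embeddings (the pooling the sentence-transformers wrapper is assumed to apply); treat it as illustrative rather than canonical usage.
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, T5EncoderModel

# Assumption: the checkpoint can be loaded as a bare T5 encoder and mean-pooled.
tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embedding-l-en-v1')
model = T5EncoderModel.from_pretrained('jinaai/jina-embedding-l-en-v1')
model.eval()

sentences = ['how is the weather today', 'What is the current weather like today?']
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, hidden)

# Mean pooling over non-padding tokens.
mask = batch['attention_mask'].unsqueeze(-1).type_as(token_embeddings)
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

print(F.cosine_similarity(embeddings[0], embeddings[1], dim=0))
```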
## Fine-tuning
Please consider using [Finetuner](https://github.com/jina-ai/finetuner) if you want to fine-tune this model on your own data.
## Plans
1. The development of `jina-embedding-s-en-v2` is currently underway with two main objectives: improving performance and increasing the maximum sequence length.
2. We are currently working on a bilingual embedding model that combines English and German. The upcoming models will be called `jina-embedding-s/b/l-de-v1`.
## Contact
Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
## Citation
If you find Jina Embeddings useful in your research, please cite the following paper:
```bibtex
@misc{günther2023jina,
title={Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models},
author={Michael Günther and Louis Milliken and Jonathan Geuter and Georgios Mastrapas and Bo Wang and Han Xiao},
year={2023},
eprint={2307.11224},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
"SUMMARIZATION"
] | [
"BIOSSES",
"LINNAEUS",
"SCIFACT"
] |
Model-SafeTensors/Llama3-OpenBioLLM-70B | Model-SafeTensors | null | [
"pytorch",
"safetensors",
"llama",
"llama-3",
"Mixtral",
"instruct",
"finetune",
"chatml",
"DPO",
"RLHF",
"gpt4",
"distillation",
"heathcare",
"medical",
"clinical",
"med",
"lifescience",
"Pharmaceutical",
"Pharma",
"en",
"arxiv:2305.18290",
"arxiv:2303.13375",
"arxiv:2212.13138",
"arxiv:2305.09617",
"arxiv:2402.07023",
"base_model:meta-llama/Meta-Llama-3-70B-Instruct",
"base_model:finetune:meta-llama/Meta-Llama-3-70B-Instruct",
"license:llama3",
"region:us"
] | 2024-12-24T21:17:35 | 2024-12-24T22:47:05 | 561 | 0 | ---
base_model: meta-llama/Meta-Llama-3-70B-Instruct
language:
- en
license: llama3
tags:
- llama-3
- llama
- Mixtral
- instruct
- finetune
- chatml
- DPO
- RLHF
- gpt4
- distillation
- heathcare
- medical
- clinical
- med
- lifescience
- Pharmaceutical
- Pharma
widget:
- example_title: OpenBioLLM-70B
messages:
- role: system
content: You are an expert and experienced from the healthcare and biomedical
domain with extensive medical knowledge and practical experience.
- role: user
content: How long does it take for newborn jaundice to go away?
output:
text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition
in newborns where the yellowing of the skin and eyes occurs due to an elevated
level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when
red blood cells break down. In most cases, newborn jaundice resolves on its
own without any specific treatment.
The duration of newborn jaundice can vary depending on several factors such
as the underlying cause, gestational age at birth, and individual variations
in bilirubin metabolism. Here are some general guidelines:
1. Physiological jaundice: This is the most common type of newborn jaundice
and usually appears within 24-72 hours after birth. It tends to peak between
the second and fifth day of life and gradually improves over the next week or
two. By the time the baby is one week old, the jaundice should have mostly resolved.
2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and
may appear later than physiological jaundice, typically between the fifth and
fourteenth day of life. It tends to persist for a longer duration but usually
resolves within six weeks after birth. 3. Pathological jaundice: This type of
jaundice is less common and occurs due to an underlying medical condition that
affects bilirubin metabolism or liver function. The duration of pathological
jaundice depends on the specific cause and may require treatment.
It''s important for parents to monitor their newborn''s jaundice closely and
seek medical advice if the jaundice progresses rapidly, becomes severe, or is
accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness.
In these cases, further evaluation and management may be necessary. Remember
that each baby is unique, and the timing of jaundice resolution can vary. If
you have concerns about your newborn''s jaundice, it''s always best to consult
with a healthcare professional for personalized advice and guidance.'
model-index:
- name: OpenBioLLM-70B
results: []
---
<div align="center">
<img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div>
<div align="center">
<h1>Advancing Open-source Large Language Models in Medical Domain</h1>
</div>
<p align="center" style="margin-top: 0px;">
<a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="OpenChat Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style=" margin-right: 5px;">Online Demo</span>
</a> |
<a href="https://github.com/openlifescience-ai">
<img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style=" margin-right: 5px;">GitHub</span>
</a> |
<a href="#">
<img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text" style="margin-right: 5px;">Paper</span>
</a> |
<a href="https://discord.gg/A5Fjf5zC69">
<img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
<span class="link-text">Discord</span>
</a>
</p>
Introducing OpenBioLLM-70B: A State-of-the-Art Open Source Biomedical Large Language Model
OpenBioLLM-70B is an advanced open source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks.
🏥 **Biomedical Specialization**: OpenBioLLM-70B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency.
🎓 **Superior Performance**: With 70 billion parameters, OpenBioLLM-70B outperforms other open source biomedical language models of similar scale. It has also demonstrated better results compared to larger proprietary & open-source models like GPT-4, Gemini, Meditron-70B, Med-PaLM-1 & Med-PaLM-2 on biomedical benchmarks.
🧠 **Advanced Training Techniques**: OpenBioLLM-70B builds upon the powerful foundation of the [Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) model. It incorporates the DPO dataset and fine-tuning recipe along with a custom diverse medical instruction dataset. Key components of the training pipeline include:
<div align="center">
<img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png">
</div>
- **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290)
- **Fine-tuning dataset**: Custom Medical Instruct dataset (We plan to release a sample training dataset in our upcoming paper; please stay updated)
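For readers new to DPO, the objective referenced in the first bullet above is, in the notation of the cited paper (general background rather than a detail specific to this training run):

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x, y_w, y_l) \sim \mathcal{D}}\left[ \log \sigma\!\left( \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)} \right) \right]
$$

where `y_w` and `y_l` are the preferred and rejected responses for a prompt `x`, `π_ref` is the frozen reference model, and `β` controls how far the fine-tuned policy may drift from the reference.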
This combination of cutting-edge techniques enables OpenBioLLM-70B to align with key capabilities and preferences for biomedical applications.
⚙️ **Release Details**:
- **Model Size**: 70 billion parameters
- **Quantization**: Optimized quantized versions available [Here](https://huggingface.co/aaditya/OpenBioLLM-70B-GGUF)
- **Language(s) (NLP):** en
- **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs
- **License:** Meta-Llama License
- **Fine-tuned from models:** [Meta-Llama-3-70B-Instruct](meta-llama/Meta-Llama-3-70B-Instruct)
- **Resources for more information:**
- Paper: Coming soon
The model can be fine-tuned for more specialized tasks and datasets as needed.
OpenBioLLM-70B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences.
We are excited to share OpenBioLLM-70B with researchers and developers around the world.
### Use with transformers
**Important: Please use the exact chat template provided by the Llama-3 instruct version; otherwise, performance will degrade. The model output can be verbose in rare cases; consider setting temperature = 0 (greedy decoding) to reduce this.**
See the snippet below for usage with Transformers:
```python
import transformers
import torch

model_id = "aaditya/OpenBioLLM-Llama3-70B"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs. who's willing to help answer the user's query with explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts. Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."},
    {"role": "user", "content": "How can I split a 3mg or 4mg warfarin pill so I can get a 2.5mg pill?"},
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    eos_token_id=terminators,
    do_sample=False,  # greedy decoding (temperature = 0), as recommended above
)
print(outputs[0]["generated_text"][len(prompt):])
```
## **Training procedure**
### **Training hyperparameters**
<details>
<summary>Click to see details</summary>
- learning_rate: 0.0002
- lr_scheduler: cosine
- train_batch_size: 12
- eval_batch_size: 8
- GPU: H100 80GB SXM5
- num_devices: 8
- optimizer: adamw_bnb_8bit
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
</details>
### **Peft hyperparameters**
<details>
<summary>Click to see details</summary>
- adapter: qlora
- lora_r: 128
- lora_alpha: 256
- lora_dropout: 0.05
- lora_target_linear: true
- lora_target_modules:
- q_proj
- v_proj
- k_proj
- o_proj
- gate_proj
- down_proj
- up_proj
</details>
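For illustration, the QLoRA settings listed above could be expressed with the `peft` library roughly as follows. This is a hedged reconstruction from the hyperparameters in this card, not the actual Axolotl configuration used for training:
```python
# Approximate peft configuration matching the listed hyperparameters; illustrative only.
from peft import LoraConfig

lora_config = LoraConfig(
    r=128,                # lora_r
    lora_alpha=256,
    lora_dropout=0.05,
    target_modules=[
        "q_proj", "v_proj", "k_proj", "o_proj",
        "gate_proj", "down_proj", "up_proj",
    ],
    task_type="CAUSAL_LM",
)
```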
### **Training results**
### **Framework versions**
- Transformers 4.39.3
- Pytorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
- Axolotl
- lm-evaluation-harness (for evaluation)
# Benchmark Results
🔥 OpenBioLLM-70B demonstrates superior performance compared to larger models, such as GPT-4, Gemini, Meditron-70B, Med-PaLM-1 & Med-PaLM-2 across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 86.06%, despite having a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge.
🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. All results presented are in the zero-shot setting, except for Med-PaLM-1 and Med-PaLM-2, for which we use the 5-shot accuracy reported in their papers because zero-shot numbers are not available.
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg |
|--------------------|-------------|------------------|---------|--------------|-----------------|------------------|--------------|----------|---------|-------|
| **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** |
| Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 |
| **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 |
| Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 |
| **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 |
| Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 |
| GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 |
| Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 |
| gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 |
| Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 |
| Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 |
| MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 |
| BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 |
| AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 |
| ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 |
<div align="center">
<img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png">
</div>
## Detailed Medical Subjectwise accuracy

# Use Cases & Examples
🚨 **The results below are from the quantized version of OpenBioLLM-70B.**
# Summarize Clinical Notes
OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries.

# Answer Medical Questions
OpenBioLLM-70B can provide answers to a wide range of medical questions.


<details>
<summary>Click to see details</summary>



</details>
# Clinical Entity Recognition
OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text. By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research.



# Biomarkers Extraction

# Classification
OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, and medical document categorization.

# De-Identification
OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, ensuring patient privacy and compliance with data protection regulations like HIPAA.

**Advisory Notice!**
While OpenBioLLM-70B leverages high-quality data sources, its outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The model's performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments.
Therefore, we strongly advise against using OpenBioLLM-70B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Its use should be limited to research, development, and exploratory applications by qualified individuals who understand its limitations.
OpenBioLLM-70B is intended solely as a research tool to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor.
Appropriately adapting and validating OpenBioLLM-70B for specific medical use cases would require significant additional work, potentially including:
- Thorough testing and evaluation in relevant clinical scenarios
- Alignment with evidence-based guidelines and best practices
- Mitigation of potential biases and failure modes
- Integration with human oversight and interpretation
- Compliance with regulatory and ethical standards
Always consult a qualified healthcare provider for personal medical needs.
# Citation
If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows:
```
@misc{OpenBioLLMs,
author = {Ankit Pal and Malaikannan Sankarasubbu},
title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences},
year = {2024},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}}
}
```
The accompanying paper is currently in progress and will be released soon.
<div align="center">
<h2> 💌 Contact </h2>
</div>
We look forward to hearing from you and collaborating on this exciting project!
**Contributors:**
- [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com]
- Saama AI Labs
- Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset.
# References
We thank the [Meta Team](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models!
Result sources
- [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems](https://arxiv.org/abs/2303.13375)
- [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138)
- [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617)
- [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023) | [
"QUESTION_ANSWERING"
] | [
"MEDQA",
"PUBMEDQA"
] |
silma-ai/SILMA-Kashif-2B-Instruct-v1.0 | silma-ai | text-generation | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"rag",
"qa",
"conversational",
"ar",
"en",
"license:gemma",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2025-01-26T16:01:49 | 2025-02-04T16:34:05 | 560 | 14 | ---
language:
- ar
- en
library_name: transformers
license: gemma
metrics:
- bleu
- bertscore
- rouge
- exact_match
pipeline_tag: text-generation
tags:
- rag
- qa
extra_gated_button_content: Acknowledge license
model-index:
- name: SILMA-Kashif-2B-Instruct-v1.0
results:
- task:
type: text-generation
dataset:
name: SILMA RAGQA Benchmark Dataset V1.0
type: silma-ai/silma-rag-qa-benchmark-v1.0
metrics:
- type: Average of Exact Match, BLEU, ROUGE, and BERTScore.
value: 0.347
name: SILMA RAGQA Benchmark Score
source:
url: https://huggingface.co/datasets/silma-ai/silma-rag-qa-benchmark-v1.0
name: SILMA RAGQA Benchmark
- task:
type: text-generation
dataset:
name: OALL (All)
type: OALL/Open-Arabic-LLM-Leaderboard
metrics:
- type: loglikelihood_acc_norm
value: 44.61
name: acc_norm
source:
url: https://huggingface.co/spaces/OALL/Open-Arabic-LLM-Leaderboard
name: Open Arabic LLM Leaderboard
---
## SILMA Kashif v1.0 (The Arabic RAG Model)
* **SILMA Kashif 2B Instruct v1.0** is the premier release within the SILMA Kashif Family of models, specifically designed for **RAG** (Retrieval-Augmented Generation) tasks
* Kashif excels in a specific task, answering questions based on contextual pieces in both Arabic and English. In addition, the model is also capable of performing Entity Extraction tasks as a minor skill
* SILMA Kashif 2B v1.0 stands out as the top-performing open model in RAG within the 3-9 billion parameter range based on our evaluations using [SILMA RAGQA Benchmark](https://huggingface.co/datasets/silma-ai/silma-rag-qa-benchmark-v1.0)
* SILMA Kashif is built on the powerful foundational models of Google Gemma, merging their strengths to deliver unmatched performance for users
* Kashif is an open-weight model, free to use in accordance with our open license
* Finally, the model comes with a context length of 12k
**Important note:** Kashif is a specialized model which should ONLY be used in RAG setups. If you are looking for a general purpose model, please refer to [SILMA 9B Instruct v1.0](https://huggingface.co/silma-ai/SILMA-9B-Instruct-v1.0)
## Model Skills and Capabilities
The model underwent intensive training to master a wide range of tasks and excel in performance.
- The ability to answer questions in Arabic and English
- The ability to deal with short and long contexts
- The ability to provide short and long answers effectively
- The ability to answer complex numerical questions
- The ability to answer questions based on tabular data
- Answering multi-hop questions: The ability to answer a single question using pieces of data from multiple paragraphs
- Negative rejection: The ability to identify and exclude inaccurate answers, and provide a more accurate statement such as "The answer cannot be found in the given context"
- Multi-domains: The ability to answer questions based on texts from different fields such as finance, medical, legal, etc.
- The ability to deal with ambiguous contexts
- The ability to extract entities from text
- Ability to deal with diverse and complex prompts
## Model Evaluation

|Dataset | exact_match | rouge1 | bleu | bertscore|
|---|---|---|---|---|
|ragbench-finqa-en-test | 0.000 | 0.587 | 0.321 | 0.760|
|ragbench-tatqa-ar-test | 0.000 | 0.484 | 0.130 | 0.774|
|ragbench-tatqa-en-test | 0.059 | 0.646 | 0.423 | 0.808|
|rag-instruct-benchmark-tester-en | 0.370 | 0.683 | 0.196 | 0.791|
|ragbench-expertqa-en-test |0.000 | 0.465 | 0.151 | 0.677|
|ragbench-msmarco-ar-test |0.000 | 0.144 | 0.096 | 0.781|
|sciq-ar-test |0.170 | 0.000 | 0.000 | 0.753|
|ragbench-covidqa-en-test |0.020 | 0.521 | 0.242 | 0.734|
|ragbench-emanual-ar-test |0.000 | 0.237 | 0.159 | 0.806|
|ragbench-finqa-ar-test |0.000 | 0.377 | 0.109 | 0.780|
|xquad-r-validation-en |0.120 | 0.326 | 0.041 | 0.603|
|ragbench-emanual-en-test |0.000 | 0.565 | 0.288 | 0.722|
|xquad-r-ar-validation |0.070 | 0.130 | 0.042 | 0.698|
|boolq-ar-test |0.450 | 0.000 | 0.000 | 0.700|
|ragbench-hotpotqa-en-test |0.060 | 0.732 | 0.503 | 0.837|
|ragbench-covidqa-ar-test |0.000 | 0.179 | 0.104 | 0.783|
|ragbench-msmarco-en-test |0.020 | 0.491 | 0.207 | 0.729|
|**Benchmark Average Scores** |0.079 | 0.386 | 0.177 | 0.749|
SILMA RAG QA Benchmark Score: 0.3478
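As a quick sanity check, the reported score appears to be the simple mean of the four per-metric averages in the table above; this averaging scheme is our reading of the numbers, not an official formula:
```python
# Reproducing the reported benchmark score from the per-metric averages (assumed formula).
avg_scores = {"exact_match": 0.079, "rouge1": 0.386, "bleu": 0.177, "bertscore": 0.749}
print(round(sum(avg_scores.values()) / len(avg_scores), 4))  # ~0.3478
```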
## SILMA AI
[silma.ai](https://silma.ai) is a leading GenAI startup that excels in building and tailoring cutting-edge Large Language Models (LLMs) and AI technologies for the Arabic language
### Usage
Below we share some code snippets on how to get quickly started with running the model. First, install the Transformers library with:
```sh
pip install -U transformers
```
Then, copy the snippet from the section below
#### Running with the `pipeline` API
```python
import torch
from transformers import pipeline
pipe = pipeline(
"text-generation",
model="silma-ai/SILMA-Kashif-2B-Instruct-v1.0",
model_kwargs={"torch_dtype": torch.bfloat16},
device="cuda", # replace with "mps" to run on a Mac device
)
messages = [
{"role": "user", "content":
"""
أجب على السؤال بناءً على السياق أدناه
السياق:
تشمل الاتفاقيات رسوم حمل سنوية ثابت قدها 30 مليون جنيه إسترليني للقنوات نظراً لأن كلاً من مزوديها قادرين على تأمين دفعات إضافية إذا ما حققت هذه القنوات أهدافاً متعلقةً بالأداء.
لا يوجد حالياً ما يشير إلى ما إذا كان الاتفاق الجديد يشمل محتوىً إضافياً كالفيديو عند الطلب والدقة العالية ، كذلك الذي سبق أن قدمته بي سكاي بي.
وقد وافقت كل من بي سكاي بي و فيرجين ميديا على إنهاء الدعاوى القضائية بالمحكمة العليا ضد بعضهما بشأن معاليم الحمل التي تخص قنواتهما الأساسية.
السؤال: ماسم الشركة التي وافقت على إنهاء دعواها القضائية ضد بي سكاي بي بالمحكمة العليا؟
الإجابة:
"""},
]
outputs = pipe(messages, max_new_tokens=600)
assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
print(assistant_response)
```
- Response:
```text
فيرجين ميديا
"وقد وافقت كل من بي سكاي بي و فيرجين ميديا على إنهاء الدعاوى القضائية بالمحكمة العليا ضد بعضهما بشأن معاليم الحمل التي تخص قنواتهما الأساسية."
```
Note: for advanced usage examples such as multi-gpu, quantization or chat templates, please refer to [SILMA v1.0](https://huggingface.co/silma-ai/SILMA-9B-Instruct-v1.0#running-the-model-on-a-single--multi-gpu) examples
### Running with Ollama
```sh
ollama run hf.co/silma-ai/SILMA-Kashif-2B-Instruct-v1.0-GGUF
```
### Prompt Format
Here is a recommended way to prompt the model. You can modify the prompt to fit your specific requirements, but if you encounter any challenges, following the format below, which we used to train the model, may help.
- Arabic
```text
أجب على السؤال بناءً على السياق أدناه
السياق:
.....
.....
السؤال: ...
الإجابة: ...
```
- English
```text
Answer the following question using the provided context below
Context:
.....
.....
Question: ...
Answer: ...
```
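For convenience, a small helper can assemble the user message in this format before passing it to the pipeline shown earlier. The helper below is ours for illustration, not part of any SILMA library:
```python
# Hypothetical helper that builds a user message in the English prompt format above.
def build_rag_prompt(context: str, question: str) -> str:
    return (
        "Answer the following question using the provided context below\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
        "Answer:"
    )

messages = [{"role": "user", "content": build_rag_prompt(
    "SILMA Kashif 2B Instruct v1.0 was released in January 2025.",
    "When was SILMA Kashif released?",
)}]
```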
### GPU Requirements
The following are the minimum/recommended GPU requirements for running inference:
* Recommended
* At least one GPU with a minimum of 24 GB of GPU memory
* Examples: Nvidia RTX 4090
* Minimum
* At least one GPU with 8 GB of GPU memory
* Examples: Nvidia RTX 3070, RTX 3080 or T4
## Effect of Quantization
We have seen a 2.6% drop in score (to 0.338) for the same model quantized to 4-bit.
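If you want to experiment with a similar memory/quality trade-off yourself, a minimal sketch of loading the model in 4-bit with `bitsandbytes` is shown below; the exact quantization settings behind the 0.338 score were not published, so these values are assumptions:
```python
# Illustrative 4-bit loading via bitsandbytes; settings are assumptions, not the
# exact configuration used for the reported 0.338 score.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "silma-ai/SILMA-Kashif-2B-Instruct-v1.0"
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```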
### Citation
```none
@article{silma_01_2025,
title={SILMA Kashif 2B Instruct v1.0},
url={https://huggingface.co/silma-ai/SILMA-Kashif-2B-Instruct-v1.0},
publisher={SILMA AI},
author={Silma Team},
year={2025}
}
```
### Intended Usage
* The model should only be used in question answering use-cases such as RAG
* The model can also be used to extract entities from text
### Limitations
* Because it has relatively few parameters, we've noticed that the model isn't very effective at complex numerical and financial reasoning, such as solving tricky calculations
* The model has been trained specifically for text-based question answering, which may limit its ability to perform tasks beyond this scope, even simple ones | [
"QUESTION_ANSWERING"
] | [
"SCIQ"
] |
infly/inf-retriever-v1 | infly | sentence-similarity | [
"sentence-transformers",
"safetensors",
"qwen2",
"feature-extraction",
"transformers",
"sentence-similarity",
"mteb",
"custom_code",
"en",
"zh",
"base_model:Alibaba-NLP/gte-Qwen2-7B-instruct",
"base_model:finetune:Alibaba-NLP/gte-Qwen2-7B-instruct",
"doi:10.57967/hf/4262",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | 2024-12-23T09:48:31 | 2025-02-19T10:04:50 | 559 | 20 | ---
base_model:
- Alibaba-NLP/gte-Qwen2-7B-instruct
language:
- en
- zh
license: apache-2.0
tags:
- sentence-transformers
- transformers
- sentence-similarity
- mteb
model-index:
- name: infly/inf-retriever-v1
results:
- task:
type: Retrieval
dataset:
name: MTEB CmedqaRetrieval (default)
type: C-MTEB/CmedqaRetrieval
config: default
split: dev
revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
metrics:
- type: ndcg_at_1
value: 38.185
- type: ndcg_at_3
value: 38.438
- type: ndcg_at_5
value: 40.445
- type: ndcg_at_10
value: 43.308
- type: ndcg_at_20
value: 46.177
- type: ndcg_at_100
value: 50.644999999999996
- type: ndcg_at_1000
value: 52.819
- type: recall_at_1
value: 25.14
- type: recall_at_3
value: 38.253
- type: recall_at_5
value: 44.507999999999996
- type: recall_at_10
value: 53.025
- type: recall_at_20
value: 62.89
- type: recall_at_100
value: 83.487
- type: recall_at_1000
value: 98.059
- type: main_score
value: 43.308
- task:
type: Retrieval
dataset:
name: MTEB CovidRetrieval (default)
type: C-MTEB/CovidRetrieval
config: default
split: dev
revision: 1271c7809071a13532e05f25fb53511ffce77117
metrics:
- type: ndcg_at_1
value: 77.97699999999999
- type: ndcg_at_3
value: 85.24199999999999
- type: ndcg_at_5
value: 86.901
- type: ndcg_at_10
value: 87.77000000000001
- type: ndcg_at_20
value: 88.295
- type: ndcg_at_100
value: 88.479
- type: ndcg_at_1000
value: 88.527
- type: recall_at_1
value: 77.819
- type: recall_at_3
value: 89.96300000000001
- type: recall_at_5
value: 93.941
- type: recall_at_10
value: 96.575
- type: recall_at_20
value: 98.63
- type: recall_at_100
value: 99.579
- type: recall_at_1000
value: 100.0
- type: main_score
value: 87.77000000000001
- task:
type: Retrieval
dataset:
name: MTEB DuRetrieval (default)
type: C-MTEB/DuRetrieval
config: default
split: dev
revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
metrics:
- type: ndcg_at_1
value: 91.45
- type: ndcg_at_3
value: 89.249
- type: ndcg_at_5
value: 88.506
- type: ndcg_at_10
value: 90.66
- type: ndcg_at_20
value: 91.886
- type: ndcg_at_100
value: 92.78699999999999
- type: ndcg_at_1000
value: 92.944
- type: recall_at_1
value: 27.332
- type: recall_at_3
value: 61.07599999999999
- type: recall_at_5
value: 78.49199999999999
- type: recall_at_10
value: 92.002
- type: recall_at_20
value: 96.116
- type: recall_at_100
value: 99.009
- type: recall_at_1000
value: 99.844
- type: main_score
value: 90.66
- task:
type: Retrieval
dataset:
name: MTEB EcomRetrieval (default)
type: C-MTEB/EcomRetrieval
config: default
split: dev
revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
metrics:
- type: ndcg_at_1
value: 55.900000000000006
- type: ndcg_at_3
value: 66.019
- type: ndcg_at_5
value: 68.47999999999999
- type: ndcg_at_10
value: 70.678
- type: ndcg_at_20
value: 72.024
- type: ndcg_at_100
value: 72.933
- type: ndcg_at_1000
value: 73.20400000000001
- type: recall_at_1
value: 55.900000000000006
- type: recall_at_3
value: 73.1
- type: recall_at_5
value: 79.10000000000001
- type: recall_at_10
value: 85.9
- type: recall_at_20
value: 91.2
- type: recall_at_100
value: 96.1
- type: recall_at_1000
value: 98.3
- type: main_score
value: 70.678
- task:
type: Retrieval
dataset:
name: MTEB MMarcoRetrieval (default)
type: C-MTEB/MMarcoRetrieval
config: default
split: dev
revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
metrics:
- type: ndcg_at_1
value: 75.74499999999999
- type: ndcg_at_3
value: 82.188
- type: ndcg_at_5
value: 83.869
- type: ndcg_at_10
value: 85.119
- type: ndcg_at_20
value: 85.624
- type: ndcg_at_100
value: 86.051
- type: ndcg_at_1000
value: 86.177
- type: recall_at_1
value: 73.33
- type: recall_at_3
value: 86.823
- type: recall_at_5
value: 90.814
- type: recall_at_10
value: 94.509
- type: recall_at_20
value: 96.422
- type: recall_at_100
value: 98.6
- type: recall_at_1000
value: 99.599
- type: main_score
value: 85.119
- task:
type: Retrieval
dataset:
name: MTEB MedicalRetrieval (default)
type: C-MTEB/MedicalRetrieval
config: default
split: dev
revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
metrics:
- type: ndcg_at_1
value: 55.00000000000001
- type: ndcg_at_3
value: 61.334
- type: ndcg_at_5
value: 62.590999999999994
- type: ndcg_at_10
value: 63.913
- type: ndcg_at_20
value: 64.748
- type: ndcg_at_100
value: 66.675
- type: ndcg_at_1000
value: 67.894
- type: recall_at_1
value: 55.00000000000001
- type: recall_at_3
value: 65.60000000000001
- type: recall_at_5
value: 68.60000000000001
- type: recall_at_10
value: 72.7
- type: recall_at_20
value: 76.0
- type: recall_at_100
value: 86.6
- type: recall_at_1000
value: 96.3
- type: main_score
value: 63.913
- task:
type: Retrieval
dataset:
name: MTEB T2Retrieval (default)
type: C-MTEB/T2Retrieval
config: default
split: dev
revision: 8731a845f1bf500a4f111cf1070785c793d10e64
metrics:
- type: ndcg_at_1
value: 91.526
- type: ndcg_at_3
value: 88.35499999999999
- type: ndcg_at_5
value: 87.408
- type: ndcg_at_10
value: 87.641
- type: ndcg_at_20
value: 89.265
- type: ndcg_at_100
value: 90.693
- type: ndcg_at_1000
value: 91.105
- type: recall_at_1
value: 28.359
- type: recall_at_3
value: 58.101
- type: recall_at_5
value: 72.99
- type: recall_at_10
value: 86.921
- type: recall_at_20
value: 92.497
- type: recall_at_100
value: 96.978
- type: recall_at_1000
value: 99.075
- type: main_score
value: 87.641
- task:
type: Retrieval
dataset:
name: MTEB VideoRetrieval (default)
type: C-MTEB/VideoRetrieval
config: default
split: dev
revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
metrics:
- type: ndcg_at_1
value: 66.0
- type: ndcg_at_3
value: 75.495
- type: ndcg_at_5
value: 77.027
- type: ndcg_at_10
value: 78.606
- type: ndcg_at_20
value: 79.54599999999999
- type: ndcg_at_100
value: 80.326
- type: ndcg_at_1000
value: 80.516
- type: recall_at_1
value: 66.0
- type: recall_at_3
value: 81.89999999999999
- type: recall_at_5
value: 85.6
- type: recall_at_10
value: 90.4
- type: recall_at_20
value: 94.1
- type: recall_at_100
value: 98.2
- type: recall_at_1000
value: 99.7
- type: main_score
value: 78.606
- task:
type: Retrieval
dataset:
name: MTEB AILACasedocs (default)
type: mteb/AILA_casedocs
config: default
split: test
revision: 4106e6bcc72e0698d714ea8b101355e3e238431a
metrics:
- type: ndcg_at_1
value: 40.0
- type: ndcg_at_3
value: 37.37
- type: ndcg_at_5
value: 37.913999999999994
- type: ndcg_at_10
value: 41.162
- type: ndcg_at_20
value: 45.72
- type: ndcg_at_100
value: 54.126
- type: ndcg_at_1000
value: 55.907
- type: recall_at_1
value: 15.406
- type: recall_at_3
value: 26.56
- type: recall_at_5
value: 33.084
- type: recall_at_10
value: 45.972
- type: recall_at_20
value: 60.775
- type: recall_at_100
value: 91.105
- type: recall_at_1000
value: 100.0
- type: main_score
value: 41.162
- task:
type: Retrieval
dataset:
name: MTEB AILAStatutes (default)
type: mteb/AILA_statutes
config: default
split: test
revision: ebfcd844eadd3d667efa3c57fc5c8c87f5c2867e
metrics:
- type: ndcg_at_1
value: 36.0
- type: ndcg_at_3
value: 32.427
- type: ndcg_at_5
value: 31.512
- type: ndcg_at_10
value: 37.727
- type: ndcg_at_20
value: 43.808
- type: ndcg_at_100
value: 56.445
- type: ndcg_at_1000
value: 56.445
- type: recall_at_1
value: 8.1
- type: recall_at_3
value: 20.599999999999998
- type: recall_at_5
value: 30.733
- type: recall_at_10
value: 42.733
- type: recall_at_20
value: 57.733000000000004
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 37.727
- task:
type: Retrieval
dataset:
name: MTEB AlloprofRetrieval (default)
type: lyon-nlp/alloprof
config: default
split: test
revision: fcf295ea64c750f41fadbaa37b9b861558e1bfbd
metrics:
- type: ndcg_at_1
value: 45.509
- type: ndcg_at_3
value: 57.912
- type: ndcg_at_5
value: 60.885
- type: ndcg_at_10
value: 63.611
- type: ndcg_at_20
value: 64.976
- type: ndcg_at_100
value: 66.507
- type: ndcg_at_1000
value: 66.998
- type: recall_at_1
value: 45.509
- type: recall_at_3
value: 66.537
- type: recall_at_5
value: 73.748
- type: recall_at_10
value: 82.16799999999999
- type: recall_at_20
value: 87.522
- type: recall_at_100
value: 95.72500000000001
- type: recall_at_1000
value: 99.655
- type: main_score
value: 63.611
- task:
type: Retrieval
dataset:
name: MTEB AppsRetrieval (default)
type: CoIR-Retrieval/apps
config: default
split: test
revision: f22508f96b7a36c2415181ed8bb76f76e04ae2d5
metrics:
- type: ndcg_at_1
value: 35.405
- type: ndcg_at_3
value: 42.945
- type: ndcg_at_5
value: 44.984
- type: ndcg_at_10
value: 47.369
- type: ndcg_at_20
value: 49.095
- type: ndcg_at_100
value: 51.821
- type: ndcg_at_1000
value: 53.581
- type: recall_at_1
value: 35.405
- type: recall_at_3
value: 48.287
- type: recall_at_5
value: 53.227000000000004
- type: recall_at_10
value: 60.611000000000004
- type: recall_at_20
value: 67.437
- type: recall_at_100
value: 82.231
- type: recall_at_1000
value: 96.38799999999999
- type: main_score
value: 47.369
- task:
type: Retrieval
dataset:
name: MTEB ArguAna (default)
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: ndcg_at_1
value: 69.132
- type: ndcg_at_3
value: 81.661
- type: ndcg_at_5
value: 83.773
- type: ndcg_at_10
value: 84.855
- type: ndcg_at_20
value: 85.073
- type: ndcg_at_100
value: 85.134
- type: ndcg_at_1000
value: 85.134
- type: recall_at_1
value: 69.132
- type: recall_at_3
value: 90.185
- type: recall_at_5
value: 95.235
- type: recall_at_10
value: 98.506
- type: recall_at_20
value: 99.36
- type: recall_at_100
value: 99.644
- type: recall_at_1000
value: 99.644
- type: main_score
value: 84.855
- task:
type: Retrieval
dataset:
name: MTEB ArguAna-PL (default)
type: clarin-knext/arguana-pl
config: default
split: test
revision: 63fc86750af76253e8c760fc9e534bbf24d260a2
metrics:
- type: ndcg_at_1
value: 46.657
- type: ndcg_at_3
value: 63.388999999999996
- type: ndcg_at_5
value: 67.931
- type: ndcg_at_10
value: 70.745
- type: ndcg_at_20
value: 71.60300000000001
- type: ndcg_at_100
value: 71.941
- type: ndcg_at_1000
value: 71.961
- type: recall_at_1
value: 46.657
- type: recall_at_3
value: 75.036
- type: recall_at_5
value: 85.989
- type: recall_at_10
value: 94.523
- type: recall_at_20
value: 97.795
- type: recall_at_100
value: 99.502
- type: recall_at_1000
value: 99.644
- type: main_score
value: 70.745
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval (default)
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: ndcg_at_1
value: 45.494
- type: ndcg_at_3
value: 51.53
- type: ndcg_at_5
value: 54.062
- type: ndcg_at_10
value: 56.599
- type: ndcg_at_20
value: 58.663
- type: ndcg_at_100
value: 61.36200000000001
- type: ndcg_at_1000
value: 62.824000000000005
- type: recall_at_1
value: 37.078
- type: recall_at_3
value: 53.529
- type: recall_at_5
value: 60.772999999999996
- type: recall_at_10
value: 68.65299999999999
- type: recall_at_20
value: 75.92999999999999
- type: recall_at_100
value: 88.127
- type: recall_at_1000
value: 97.059
- type: main_score
value: 56.599
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval (default)
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: ndcg_at_1
value: 47.134
- type: ndcg_at_3
value: 52.186
- type: ndcg_at_5
value: 53.94
- type: ndcg_at_10
value: 55.96
- type: ndcg_at_20
value: 57.521
- type: ndcg_at_100
value: 59.865
- type: ndcg_at_1000
value: 61.611000000000004
- type: recall_at_1
value: 37.405
- type: recall_at_3
value: 53.869
- type: recall_at_5
value: 59.18600000000001
- type: recall_at_10
value: 65.786
- type: recall_at_20
value: 71.56099999999999
- type: recall_at_100
value: 82.062
- type: recall_at_1000
value: 92.863
- type: main_score
value: 55.96
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval (default)
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: ndcg_at_1
value: 52.22599999999999
- type: ndcg_at_3
value: 59.797999999999995
- type: ndcg_at_5
value: 62.260000000000005
- type: ndcg_at_10
value: 64.85300000000001
- type: ndcg_at_20
value: 66.398
- type: ndcg_at_100
value: 68.298
- type: ndcg_at_1000
value: 69.003
- type: recall_at_1
value: 45.789
- type: recall_at_3
value: 64.9
- type: recall_at_5
value: 70.902
- type: recall_at_10
value: 78.388
- type: recall_at_20
value: 84.086
- type: recall_at_100
value: 93.006
- type: recall_at_1000
value: 97.928
- type: main_score
value: 64.85300000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval (default)
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: ndcg_at_1
value: 32.09
- type: ndcg_at_3
value: 38.339
- type: ndcg_at_5
value: 41.427
- type: ndcg_at_10
value: 43.606
- type: ndcg_at_20
value: 45.784000000000006
- type: ndcg_at_100
value: 48.908
- type: ndcg_at_1000
value: 50.585
- type: recall_at_1
value: 29.146
- type: recall_at_3
value: 43.168
- type: recall_at_5
value: 50.717
- type: recall_at_10
value: 57.120000000000005
- type: recall_at_20
value: 65.254
- type: recall_at_100
value: 81.04599999999999
- type: recall_at_1000
value: 93.487
- type: main_score
value: 43.606
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval (default)
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: ndcg_at_1
value: 24.876
- type: ndcg_at_3
value: 29.663
- type: ndcg_at_5
value: 32.193
- type: ndcg_at_10
value: 34.694
- type: ndcg_at_20
value: 37.075
- type: ndcg_at_100
value: 40.615
- type: ndcg_at_1000
value: 43.317
- type: recall_at_1
value: 20.395
- type: recall_at_3
value: 32.521
- type: recall_at_5
value: 38.887
- type: recall_at_10
value: 46.388
- type: recall_at_20
value: 54.885
- type: recall_at_100
value: 71.597
- type: recall_at_1000
value: 90.75
- type: main_score
value: 34.694
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval (default)
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: ndcg_at_1
value: 42.733
- type: ndcg_at_3
value: 47.236
- type: ndcg_at_5
value: 49.327
- type: ndcg_at_10
value: 52.346000000000004
- type: ndcg_at_20
value: 54.446000000000005
- type: ndcg_at_100
value: 57.736
- type: ndcg_at_1000
value: 59.245000000000005
- type: recall_at_1
value: 34.414
- type: recall_at_3
value: 50.233000000000004
- type: recall_at_5
value: 55.967
- type: recall_at_10
value: 65.173
- type: recall_at_20
value: 72.27799999999999
- type: recall_at_100
value: 87.163
- type: recall_at_1000
value: 96.64
- type: main_score
value: 52.346000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval (default)
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: ndcg_at_1
value: 37.329
- type: ndcg_at_3
value: 41.319
- type: ndcg_at_5
value: 43.444
- type: ndcg_at_10
value: 46.643
- type: ndcg_at_20
value: 49.257
- type: ndcg_at_100
value: 52.524
- type: ndcg_at_1000
value: 54.478
- type: recall_at_1
value: 30.278
- type: recall_at_3
value: 43.464999999999996
- type: recall_at_5
value: 49.419999999999995
- type: recall_at_10
value: 58.650999999999996
- type: recall_at_20
value: 67.90899999999999
- type: recall_at_100
value: 83.276
- type: recall_at_1000
value: 96.114
- type: main_score
value: 46.643
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval (default)
type: CQADupstackRetrieval_is_a_combined_dataset
config: default
split: test
revision: CQADupstackRetrieval_is_a_combined_dataset
metrics:
- type: main_score
value: 46.644083333333334
- type: ndcg_at_10
value: 46.644083333333334
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval (default)
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: ndcg_at_1
value: 30.368000000000002
- type: ndcg_at_3
value: 35.004000000000005
- type: ndcg_at_5
value: 37.125
- type: ndcg_at_10
value: 39.831
- type: ndcg_at_20
value: 42.099
- type: ndcg_at_100
value: 45.032
- type: ndcg_at_1000
value: 47.016999999999996
- type: recall_at_1
value: 27.151999999999997
- type: recall_at_3
value: 38.2
- type: recall_at_5
value: 43.349
- type: recall_at_10
value: 51.50599999999999
- type: recall_at_20
value: 60.035000000000004
- type: recall_at_100
value: 74.869
- type: recall_at_1000
value: 89.159
- type: main_score
value: 39.831
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval (default)
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: ndcg_at_1
value: 26.222
- type: ndcg_at_3
value: 30.085
- type: ndcg_at_5
value: 31.977
- type: ndcg_at_10
value: 34.107
- type: ndcg_at_20
value: 35.939
- type: ndcg_at_100
value: 39.054
- type: ndcg_at_1000
value: 41.899
- type: recall_at_1
value: 21.552
- type: recall_at_3
value: 32.66
- type: recall_at_5
value: 37.785000000000004
- type: recall_at_10
value: 44.143
- type: recall_at_20
value: 50.968999999999994
- type: recall_at_100
value: 66.392
- type: recall_at_1000
value: 86.601
- type: main_score
value: 34.107
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval (default)
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: ndcg_at_1
value: 36.287000000000006
- type: ndcg_at_3
value: 41.15
- type: ndcg_at_5
value: 43.283
- type: ndcg_at_10
value: 45.698
- type: ndcg_at_20
value: 47.754000000000005
- type: ndcg_at_100
value: 50.800999999999995
- type: ndcg_at_1000
value: 53.024
- type: recall_at_1
value: 30.791
- type: recall_at_3
value: 44.802
- type: recall_at_5
value: 50.434999999999995
- type: recall_at_10
value: 57.424
- type: recall_at_20
value: 64.702
- type: recall_at_100
value: 79.216
- type: recall_at_1000
value: 94.602
- type: main_score
value: 45.698
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval (default)
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: ndcg_at_1
value: 37.352000000000004
- type: ndcg_at_3
value: 43.029
- type: ndcg_at_5
value: 44.811
- type: ndcg_at_10
value: 47.493
- type: ndcg_at_20
value: 49.76
- type: ndcg_at_100
value: 52.925
- type: ndcg_at_1000
value: 55.117000000000004
- type: recall_at_1
value: 31.719
- type: recall_at_3
value: 45.466
- type: recall_at_5
value: 50.087
- type: recall_at_10
value: 57.86
- type: recall_at_20
value: 66.27
- type: recall_at_100
value: 81.437
- type: recall_at_1000
value: 95.162
- type: main_score
value: 47.493
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval (default)
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: ndcg_at_1
value: 29.020000000000003
- type: ndcg_at_3
value: 33.715
- type: ndcg_at_5
value: 35.266
- type: ndcg_at_10
value: 37.899
- type: ndcg_at_20
value: 39.812999999999995
- type: ndcg_at_100
value: 42.998999999999995
- type: ndcg_at_1000
value: 45.257
- type: recall_at_1
value: 26.784000000000002
- type: recall_at_3
value: 37.049
- type: recall_at_5
value: 40.638000000000005
- type: recall_at_10
value: 48.204
- type: recall_at_20
value: 55.496
- type: recall_at_100
value: 71.749
- type: recall_at_1000
value: 88.22
- type: main_score
value: 37.899
- task:
type: Retrieval
dataset:
name: MTEB CodeFeedbackMT (default)
type: CoIR-Retrieval/codefeedback-mt
config: default
split: test
revision: b0f12fa0c0dd67f59c95a5c33d02aeeb4c398c5f
metrics:
- type: ndcg_at_1
value: 67.214
- type: ndcg_at_3
value: 74.774
- type: ndcg_at_5
value: 76.297
- type: ndcg_at_10
value: 77.644
- type: ndcg_at_20
value: 78.41
- type: ndcg_at_100
value: 79.374
- type: ndcg_at_1000
value: 79.77
- type: recall_at_1
value: 67.214
- type: recall_at_3
value: 79.95
- type: recall_at_5
value: 83.65599999999999
- type: recall_at_10
value: 87.776
- type: recall_at_20
value: 90.781
- type: recall_at_100
value: 95.993
- type: recall_at_1000
value: 99.104
- type: main_score
value: 77.644
- task:
type: Retrieval
dataset:
name: MTEB CodeFeedbackST (default)
type: CoIR-Retrieval/codefeedback-st
config: default
split: test
revision: d213819e87aab9010628da8b73ab4eb337c89340
metrics:
- type: ndcg_at_1
value: 74.05000000000001
- type: ndcg_at_3
value: 84.59
- type: ndcg_at_5
value: 85.949
- type: ndcg_at_10
value: 86.627
- type: ndcg_at_20
value: 86.907
- type: ndcg_at_100
value: 87.149
- type: ndcg_at_1000
value: 87.21799999999999
- type: recall_at_1
value: 74.05000000000001
- type: recall_at_3
value: 91.685
- type: recall_at_5
value: 94.959
- type: recall_at_10
value: 97.017
- type: recall_at_20
value: 98.10900000000001
- type: recall_at_100
value: 99.396
- type: recall_at_1000
value: 99.92699999999999
- type: main_score
value: 86.627
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (python)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: python
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 69.875
- type: ndcg_at_3
value: 79.45100000000001
- type: ndcg_at_5
value: 80.95400000000001
- type: ndcg_at_10
value: 82.025
- type: ndcg_at_20
value: 82.526
- type: ndcg_at_100
value: 83.07
- type: ndcg_at_1000
value: 83.28999999999999
- type: recall_at_1
value: 69.875
- type: recall_at_3
value: 85.957
- type: recall_at_5
value: 89.59
- type: recall_at_10
value: 92.874
- type: recall_at_20
value: 94.838
- type: recall_at_100
value: 97.748
- type: recall_at_1000
value: 99.47
- type: main_score
value: 82.025
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (javascript)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: javascript
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 66.18
- type: ndcg_at_3
value: 76.294
- type: ndcg_at_5
value: 77.849
- type: ndcg_at_10
value: 78.95400000000001
- type: ndcg_at_20
value: 79.71000000000001
- type: ndcg_at_100
value: 80.402
- type: ndcg_at_1000
value: 80.694
- type: recall_at_1
value: 66.18
- type: recall_at_3
value: 83.10499999999999
- type: recall_at_5
value: 86.873
- type: recall_at_10
value: 90.277
- type: recall_at_20
value: 93.22399999999999
- type: recall_at_100
value: 96.87
- type: recall_at_1000
value: 99.21
- type: main_score
value: 78.95400000000001
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (go)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: go
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 57.24
- type: ndcg_at_3
value: 67.84700000000001
- type: ndcg_at_5
value: 70.126
- type: ndcg_at_10
value: 71.839
- type: ndcg_at_20
value: 72.89
- type: ndcg_at_100
value: 73.904
- type: ndcg_at_1000
value: 74.343
- type: recall_at_1
value: 57.24
- type: recall_at_3
value: 75.179
- type: recall_at_5
value: 80.67
- type: recall_at_10
value: 85.939
- type: recall_at_20
value: 90.076
- type: recall_at_100
value: 95.48100000000001
- type: recall_at_1000
value: 98.929
- type: main_score
value: 71.839
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (ruby)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: ruby
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 64.235
- type: ndcg_at_3
value: 73.451
- type: ndcg_at_5
value: 75.233
- type: ndcg_at_10
value: 76.53
- type: ndcg_at_20
value: 77.35
- type: ndcg_at_100
value: 78.13799999999999
- type: ndcg_at_1000
value: 78.57
- type: recall_at_1
value: 64.235
- type: recall_at_3
value: 79.699
- type: recall_at_5
value: 83.981
- type: recall_at_10
value: 88.02499999999999
- type: recall_at_20
value: 91.277
- type: recall_at_100
value: 95.638
- type: recall_at_1000
value: 99.048
- type: main_score
value: 76.53
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (java)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: java
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 65.468
- type: ndcg_at_3
value: 75.064
- type: ndcg_at_5
value: 76.786
- type: ndcg_at_10
value: 77.929
- type: ndcg_at_20
value: 78.596
- type: ndcg_at_100
value: 79.28699999999999
- type: ndcg_at_1000
value: 79.625
- type: recall_at_1
value: 65.468
- type: recall_at_3
value: 81.56099999999999
- type: recall_at_5
value: 85.714
- type: recall_at_10
value: 89.229
- type: recall_at_20
value: 91.83
- type: recall_at_100
value: 95.509
- type: recall_at_1000
value: 98.17399999999999
- type: main_score
value: 77.929
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetCCRetrieval (php)
type: CoIR-Retrieval/CodeSearchNet-ccr
config: php
split: test
revision: 6e1effa2c03723c5fde48ee912b5ee08d4f211e8
metrics:
- type: ndcg_at_1
value: 52.71900000000001
- type: ndcg_at_3
value: 63.025
- type: ndcg_at_5
value: 65.17399999999999
- type: ndcg_at_10
value: 66.982
- type: ndcg_at_20
value: 68.113
- type: ndcg_at_100
value: 69.443
- type: ndcg_at_1000
value: 70.111
- type: recall_at_1
value: 52.71900000000001
- type: recall_at_3
value: 70.158
- type: recall_at_5
value: 75.35300000000001
- type: recall_at_10
value: 80.919
- type: recall_at_20
value: 85.36500000000001
- type: recall_at_100
value: 92.486
- type: recall_at_1000
value: 97.788
- type: main_score
value: 66.982
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (python)
type: code-search-net/code_search_net
config: python
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 86.9
- type: ndcg_at_3
value: 92.012
- type: ndcg_at_5
value: 93.002
- type: ndcg_at_10
value: 93.304
- type: ndcg_at_20
value: 93.432
- type: ndcg_at_100
value: 93.50500000000001
- type: ndcg_at_1000
value: 93.54
- type: recall_at_1
value: 86.9
- type: recall_at_3
value: 95.5
- type: recall_at_5
value: 97.89999999999999
- type: recall_at_10
value: 98.8
- type: recall_at_20
value: 99.3
- type: recall_at_100
value: 99.7
- type: recall_at_1000
value: 100.0
- type: main_score
value: 93.304
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (javascript)
type: code-search-net/code_search_net
config: javascript
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 73.9
- type: ndcg_at_3
value: 80.297
- type: ndcg_at_5
value: 81.162
- type: ndcg_at_10
value: 82.075
- type: ndcg_at_20
value: 82.432
- type: ndcg_at_100
value: 82.948
- type: ndcg_at_1000
value: 83.722
- type: recall_at_1
value: 73.9
- type: recall_at_3
value: 84.6
- type: recall_at_5
value: 86.7
- type: recall_at_10
value: 89.5
- type: recall_at_20
value: 90.9
- type: recall_at_100
value: 93.7
- type: recall_at_1000
value: 100.0
- type: main_score
value: 82.075
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (go)
type: code-search-net/code_search_net
config: go
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 86.9
- type: ndcg_at_3
value: 92.961
- type: ndcg_at_5
value: 93.632
- type: ndcg_at_10
value: 93.865
- type: ndcg_at_20
value: 93.917
- type: ndcg_at_100
value: 93.994
- type: ndcg_at_1000
value: 94.02199999999999
- type: recall_at_1
value: 86.9
- type: recall_at_3
value: 96.89999999999999
- type: recall_at_5
value: 98.5
- type: recall_at_10
value: 99.2
- type: recall_at_20
value: 99.4
- type: recall_at_100
value: 99.8
- type: recall_at_1000
value: 100.0
- type: main_score
value: 93.865
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (ruby)
type: code-search-net/code_search_net
config: ruby
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 79.10000000000001
- type: ndcg_at_3
value: 85.626
- type: ndcg_at_5
value: 86.629
- type: ndcg_at_10
value: 87.16000000000001
- type: ndcg_at_20
value: 87.414
- type: ndcg_at_100
value: 87.7
- type: ndcg_at_1000
value: 88.115
- type: recall_at_1
value: 79.10000000000001
- type: recall_at_3
value: 89.9
- type: recall_at_5
value: 92.30000000000001
- type: recall_at_10
value: 93.89999999999999
- type: recall_at_20
value: 94.89999999999999
- type: recall_at_100
value: 96.39999999999999
- type: recall_at_1000
value: 100.0
- type: main_score
value: 87.16000000000001
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (java)
type: code-search-net/code_search_net
config: java
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 82.0
- type: ndcg_at_3
value: 89.205
- type: ndcg_at_5
value: 89.86699999999999
- type: ndcg_at_10
value: 90.269
- type: ndcg_at_20
value: 90.32
- type: ndcg_at_100
value: 90.36999999999999
- type: ndcg_at_1000
value: 90.691
- type: recall_at_1
value: 82.0
- type: recall_at_3
value: 94.0
- type: recall_at_5
value: 95.6
- type: recall_at_10
value: 96.8
- type: recall_at_20
value: 97.0
- type: recall_at_100
value: 97.3
- type: recall_at_1000
value: 100.0
- type: main_score
value: 90.269
- task:
type: Retrieval
dataset:
name: MTEB CodeSearchNetRetrieval (php)
type: code-search-net/code_search_net
config: php
split: test
revision: fdc6a9e39575768c27eb8a2a5f702bf846eb4759
metrics:
- type: ndcg_at_1
value: 76.1
- type: ndcg_at_3
value: 83.97
- type: ndcg_at_5
value: 85.128
- type: ndcg_at_10
value: 85.922
- type: ndcg_at_20
value: 86.279
- type: ndcg_at_100
value: 86.53
- type: ndcg_at_1000
value: 86.846
- type: recall_at_1
value: 76.1
- type: recall_at_3
value: 89.3
- type: recall_at_5
value: 92.10000000000001
- type: recall_at_10
value: 94.5
- type: recall_at_20
value: 95.89999999999999
- type: recall_at_100
value: 97.3
- type: recall_at_1000
value: 100.0
- type: main_score
value: 85.922
- task:
type: Retrieval
dataset:
name: MTEB CodeTransOceanContest (default)
type: CoIR-Retrieval/codetrans-contest
config: default
split: test
revision: 20da4eb20a4b17300c0986ee148c90867a7f2a4d
metrics:
- type: ndcg_at_1
value: 82.353
- type: ndcg_at_3
value: 86.792
- type: ndcg_at_5
value: 88.116
- type: ndcg_at_10
value: 89.164
- type: ndcg_at_20
value: 89.627
- type: ndcg_at_100
value: 89.816
- type: ndcg_at_1000
value: 89.929
- type: recall_at_1
value: 82.353
- type: recall_at_3
value: 90.045
- type: recall_at_5
value: 93.21300000000001
- type: recall_at_10
value: 96.38
- type: recall_at_20
value: 98.19
- type: recall_at_100
value: 99.095
- type: recall_at_1000
value: 100.0
- type: main_score
value: 89.164
- task:
type: Retrieval
dataset:
name: MTEB CodeTransOceanDL (default)
type: CoIR-Retrieval/codetrans-dl
config: default
split: test
revision: 281562cb8a1265ab5c0824bfa6ddcd9b0a15618f
metrics:
- type: ndcg_at_1
value: 9.443999999999999
- type: ndcg_at_3
value: 13.141
- type: ndcg_at_5
value: 20.149
- type: ndcg_at_10
value: 35.181000000000004
- type: ndcg_at_20
value: 39.898
- type: ndcg_at_100
value: 40.337
- type: ndcg_at_1000
value: 40.337
- type: recall_at_1
value: 9.443999999999999
- type: recall_at_3
value: 16.111
- type: recall_at_5
value: 32.778
- type: recall_at_10
value: 80.55600000000001
- type: recall_at_20
value: 97.77799999999999
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 35.181000000000004
- task:
type: Retrieval
dataset:
name: MTEB CosQA (default)
type: CoIR-Retrieval/cosqa
config: default
split: test
revision: bc5efb7e9d437246ce393ed19d772e08e4a79535
metrics:
- type: ndcg_at_1
value: 14.2
- type: ndcg_at_3
value: 23.647000000000002
- type: ndcg_at_5
value: 28.655
- type: ndcg_at_10
value: 34.175
- type: ndcg_at_20
value: 37.04
- type: ndcg_at_100
value: 41.074
- type: ndcg_at_1000
value: 41.917
- type: recall_at_1
value: 14.2
- type: recall_at_3
value: 31.0
- type: recall_at_5
value: 43.4
- type: recall_at_10
value: 60.4
- type: recall_at_20
value: 71.8
- type: recall_at_100
value: 93.0
- type: recall_at_1000
value: 99.2
- type: main_score
value: 34.175
- task:
type: Retrieval
dataset:
name: MTEB DBPedia (default)
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: ndcg_at_1
value: 61.5
- type: ndcg_at_3
value: 53.476
- type: ndcg_at_5
value: 51.601
- type: ndcg_at_10
value: 50.391
- type: ndcg_at_20
value: 49.342000000000006
- type: ndcg_at_100
value: 55.37800000000001
- type: ndcg_at_1000
value: 62.470000000000006
- type: recall_at_1
value: 9.757
- type: recall_at_3
value: 17.203
- type: recall_at_5
value: 21.878
- type: recall_at_10
value: 30.425
- type: recall_at_20
value: 39.137
- type: recall_at_100
value: 62.885000000000005
- type: recall_at_1000
value: 85.795
- type: main_score
value: 50.391
- task:
type: Retrieval
dataset:
name: MTEB FiQA-PL (default)
type: clarin-knext/fiqa-pl
config: default
split: test
revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e
metrics:
- type: ndcg_at_1
value: 46.296
- type: ndcg_at_3
value: 43.682
- type: ndcg_at_5
value: 44.818999999999996
- type: ndcg_at_10
value: 47.137
- type: ndcg_at_20
value: 49.957
- type: ndcg_at_100
value: 53.998999999999995
- type: ndcg_at_1000
value: 56.547000000000004
- type: recall_at_1
value: 23.116999999999997
- type: recall_at_3
value: 39.967000000000006
- type: recall_at_5
value: 46.745
- type: recall_at_10
value: 54.202
- type: recall_at_20
value: 62.61600000000001
- type: recall_at_100
value: 79.322
- type: recall_at_1000
value: 94.114
- type: main_score
value: 47.137
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018 (default)
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: ndcg_at_1
value: 63.117000000000004
- type: ndcg_at_3
value: 58.538999999999994
- type: ndcg_at_5
value: 59.147000000000006
- type: ndcg_at_10
value: 62.35000000000001
- type: ndcg_at_20
value: 65.36800000000001
- type: ndcg_at_100
value: 68.801
- type: ndcg_at_1000
value: 70.06599999999999
- type: recall_at_1
value: 33.377
- type: recall_at_3
value: 52.817
- type: recall_at_5
value: 59.03699999999999
- type: recall_at_10
value: 69.116
- type: recall_at_20
value: 78.30799999999999
- type: recall_at_100
value: 91.715
- type: recall_at_1000
value: 98.783
- type: main_score
value: 62.35000000000001
- task:
type: Retrieval
dataset:
name: MTEB GerDaLIRSmall (default)
type: mteb/GerDaLIRSmall
config: default
split: test
revision: 48327de6ee192e9610f3069789719788957c7abd
metrics:
- type: ndcg_at_1
value: 30.047
- type: ndcg_at_3
value: 36.635
- type: ndcg_at_5
value: 39.237
- type: ndcg_at_10
value: 41.752
- type: ndcg_at_20
value: 43.467
- type: ndcg_at_100
value: 45.793
- type: ndcg_at_1000
value: 47.404
- type: recall_at_1
value: 27.272999999999996
- type: recall_at_3
value: 41.534
- type: recall_at_5
value: 47.678
- type: recall_at_10
value: 55.131
- type: recall_at_20
value: 61.592
- type: recall_at_100
value: 73.604
- type: recall_at_1000
value: 86.146
- type: main_score
value: 41.752
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA (default)
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: ndcg_at_1
value: 88.062
- type: ndcg_at_3
value: 77.443
- type: ndcg_at_5
value: 80.05600000000001
- type: ndcg_at_10
value: 81.979
- type: ndcg_at_20
value: 83.033
- type: ndcg_at_100
value: 84.232
- type: ndcg_at_1000
value: 84.827
- type: recall_at_1
value: 44.031
- type: recall_at_3
value: 75.71900000000001
- type: recall_at_5
value: 80.851
- type: recall_at_10
value: 85.652
- type: recall_at_20
value: 89.021
- type: recall_at_100
value: 94.267
- type: recall_at_1000
value: 98.136
- type: main_score
value: 81.979
- task:
type: Retrieval
dataset:
name: MTEB LEMBNarrativeQARetrieval (default)
type: dwzhu/LongEmbed
config: default
split: test
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: ndcg_at_1
value: 49.009
- type: ndcg_at_3
value: 56.69
- type: ndcg_at_5
value: 58.572
- type: ndcg_at_10
value: 60.702
- type: ndcg_at_20
value: 62.160000000000004
- type: ndcg_at_100
value: 64.461
- type: ndcg_at_1000
value: 65.604
- type: recall_at_1
value: 49.009
- type: recall_at_3
value: 62.073
- type: recall_at_5
value: 66.648
- type: recall_at_10
value: 73.222
- type: recall_at_20
value: 78.974
- type: recall_at_100
value: 91.444
- type: recall_at_1000
value: 100.0
- type: main_score
value: 60.702
- type: ndcg_at_1
value: 37.263000000000005
- type: ndcg_at_3
value: 48.207
- type: ndcg_at_5
value: 51.464
- type: ndcg_at_10
value: 55.071999999999996
- type: ndcg_at_20
value: 57.364000000000004
- type: ndcg_at_100
value: 60.236999999999995
- type: ndcg_at_1000
value: 60.352
- type: recall_at_1
value: 37.263000000000005
- type: recall_at_3
value: 55.92700000000001
- type: recall_at_5
value: 63.851
- type: recall_at_10
value: 74.91799999999999
- type: recall_at_20
value: 83.955
- type: recall_at_100
value: 99.214
- type: recall_at_1000
value: 100.0
- type: main_score
value: 55.071999999999996
- type: ndcg_at_1
value: 80.0
- type: ndcg_at_3
value: 84.024
- type: ndcg_at_5
value: 84.985
- type: ndcg_at_10
value: 85.751
- type: ndcg_at_20
value: 86.634
- type: ndcg_at_100
value: 87.348
- type: ndcg_at_1000
value: 87.48599999999999
- type: recall_at_1
value: 80.0
- type: recall_at_3
value: 87.0
- type: recall_at_5
value: 89.333
- type: recall_at_10
value: 91.667
- type: recall_at_20
value: 95.0
- type: recall_at_100
value: 99.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 85.751
- task:
type: Retrieval
dataset:
name: MTEB LEMBNeedleRetrieval (default)
type: dwzhu/LongEmbed
config: default
split: test_256
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: ndcg_at_1
value: 8.0
- type: ndcg_at_3
value: 12.786
- type: ndcg_at_5
value: 15.282000000000002
- type: ndcg_at_10
value: 20.096
- type: ndcg_at_20
value: 22.631
- type: ndcg_at_100
value: 32.174
- type: ndcg_at_1000
value: 32.174
- type: recall_at_1
value: 8.0
- type: recall_at_3
value: 16.0
- type: recall_at_5
value: 22.0
- type: recall_at_10
value: 36.0
- type: recall_at_20
value: 46.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 8.0
- type: ndcg_at_1
value: 10.0
- type: ndcg_at_3
value: 12.0
- type: ndcg_at_5
value: 12.0
- type: ndcg_at_10
value: 12.631
- type: ndcg_at_20
value: 14.982000000000001
- type: ndcg_at_100
value: 28.534
- type: ndcg_at_1000
value: 28.534
- type: recall_at_1
value: 10.0
- type: recall_at_3
value: 14.000000000000002
- type: recall_at_5
value: 14.000000000000002
- type: recall_at_10
value: 16.0
- type: recall_at_20
value: 26.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 10.0
- task:
type: Retrieval
dataset:
name: MTEB LEMBSummScreenFDRetrieval (default)
type: dwzhu/LongEmbed
config: default
split: validation
revision: 6e346642246bfb4928c560ee08640dc84d074e8c
metrics:
- type: ndcg_at_1
value: 94.345
- type: ndcg_at_3
value: 96.66900000000001
- type: ndcg_at_5
value: 97.297
- type: ndcg_at_10
value: 97.387
- type: ndcg_at_20
value: 97.387
- type: ndcg_at_100
value: 97.387
- type: ndcg_at_1000
value: 97.387
- type: recall_at_1
value: 94.345
- type: recall_at_3
value: 98.214
- type: recall_at_5
value: 99.702
- type: recall_at_10
value: 100.0
- type: recall_at_20
value: 100.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 97.387
- task:
type: Retrieval
dataset:
name: MTEB LeCaRDv2 (default)
type: mteb/LeCaRDv2
config: default
split: test
revision: b78e18688c3d012a33dc3676597c1d1b2243ce1c
metrics:
- type: ndcg_at_1
value: 87.421
- type: ndcg_at_3
value: 83.159
- type: ndcg_at_5
value: 79.818
- type: ndcg_at_10
value: 74.168
- type: ndcg_at_20
value: 67.81
- type: ndcg_at_100
value: 80.432
- type: ndcg_at_1000
value: 84.423
- type: recall_at_1
value: 4.1450000000000005
- type: recall_at_3
value: 10.988000000000001
- type: recall_at_5
value: 16.808999999999997
- type: recall_at_10
value: 29.329
- type: recall_at_20
value: 48.425000000000004
- type: recall_at_100
value: 89.63600000000001
- type: recall_at_1000
value: 99.823
- type: main_score
value: 74.168
- task:
type: Retrieval
dataset:
name: MTEB LegalBenchConsumerContractsQA (default)
type: mteb/legalbench_consumer_contracts_qa
config: default
split: test
revision: b23590301ec94e8087e2850b21d43d4956b1cca9
metrics:
- type: ndcg_at_1
value: 73.485
- type: ndcg_at_3
value: 81.977
- type: ndcg_at_5
value: 84.63000000000001
- type: ndcg_at_10
value: 85.444
- type: ndcg_at_20
value: 86.008
- type: ndcg_at_100
value: 86.262
- type: ndcg_at_1000
value: 86.262
- type: recall_at_1
value: 73.485
- type: recall_at_3
value: 87.626
- type: recall_at_5
value: 93.939
- type: recall_at_10
value: 96.465
- type: recall_at_20
value: 98.737
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 85.444
- task:
type: Retrieval
dataset:
name: MTEB LegalBenchCorporateLobbying (default)
type: mteb/legalbench_corporate_lobbying
config: default
split: test
revision: f69691c650464e62546d7f2a4536f8f87c891e38
metrics:
- type: ndcg_at_1
value: 91.471
- type: ndcg_at_3
value: 95.84700000000001
- type: ndcg_at_5
value: 96.088
- type: ndcg_at_10
value: 96.17999999999999
- type: ndcg_at_20
value: 96.17999999999999
- type: ndcg_at_100
value: 96.17999999999999
- type: ndcg_at_1000
value: 96.259
- type: recall_at_1
value: 91.471
- type: recall_at_3
value: 98.529
- type: recall_at_5
value: 99.118
- type: recall_at_10
value: 99.412
- type: recall_at_20
value: 99.412
- type: recall_at_100
value: 99.412
- type: recall_at_1000
value: 100.0
- type: main_score
value: 96.17999999999999
- task:
type: Retrieval
dataset:
name: MTEB LegalQuAD (default)
type: mteb/LegalQuAD
config: default
split: test
revision: 37aa6cfb01d48960b0f8e3f17d6e3d99bf1ebc3e
metrics:
- type: ndcg_at_1
value: 48.0
- type: ndcg_at_3
value: 59.397999999999996
- type: ndcg_at_5
value: 61.05500000000001
- type: ndcg_at_10
value: 63.219
- type: ndcg_at_20
value: 65.102
- type: ndcg_at_100
value: 67.254
- type: ndcg_at_1000
value: 67.746
- type: recall_at_1
value: 48.0
- type: recall_at_3
value: 67.0
- type: recall_at_5
value: 71.0
- type: recall_at_10
value: 77.5
- type: recall_at_20
value: 85.0
- type: recall_at_100
value: 96.5
- type: recall_at_1000
value: 100.0
- type: main_score
value: 63.219
- task:
type: Retrieval
dataset:
name: MTEB LegalSummarization (default)
type: mteb/legal_summarization
config: default
split: test
revision: 3bb1a05c66872889662af04c5691c14489cebd72
metrics:
- type: ndcg_at_1
value: 58.451
- type: ndcg_at_3
value: 63.70099999999999
- type: ndcg_at_5
value: 66.792
- type: ndcg_at_10
value: 69.76
- type: ndcg_at_20
value: 71.487
- type: ndcg_at_100
value: 73.6
- type: ndcg_at_1000
value: 74.05000000000001
- type: recall_at_1
value: 52.028
- type: recall_at_3
value: 66.7
- type: recall_at_5
value: 74.119
- type: recall_at_10
value: 82.595
- type: recall_at_20
value: 88.209
- type: recall_at_100
value: 97.24000000000001
- type: recall_at_1000
value: 100.0
- type: main_score
value: 69.76
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (ar)
type: jinaai/mintakaqa
config: ar
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 19.791
- type: ndcg_at_3
value: 29.751
- type: ndcg_at_5
value: 32.83
- type: ndcg_at_10
value: 35.553000000000004
- type: ndcg_at_20
value: 37.528
- type: ndcg_at_100
value: 40.025
- type: ndcg_at_1000
value: 42.693
- type: recall_at_1
value: 19.791
- type: recall_at_3
value: 36.632
- type: recall_at_5
value: 44.076
- type: recall_at_10
value: 52.474
- type: recall_at_20
value: 60.281
- type: recall_at_100
value: 73.94500000000001
- type: recall_at_1000
value: 96.096
- type: main_score
value: 35.553000000000004
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (de)
type: jinaai/mintakaqa
config: de
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 34.288000000000004
- type: ndcg_at_3
value: 47.29
- type: ndcg_at_5
value: 50.622
- type: ndcg_at_10
value: 53.291999999999994
- type: ndcg_at_20
value: 55.062999999999995
- type: ndcg_at_100
value: 56.987
- type: ndcg_at_1000
value: 58.084
- type: recall_at_1
value: 34.288000000000004
- type: recall_at_3
value: 56.486999999999995
- type: recall_at_5
value: 64.532
- type: recall_at_10
value: 72.746
- type: recall_at_20
value: 79.697
- type: recall_at_100
value: 90.185
- type: recall_at_1000
value: 98.989
- type: main_score
value: 53.291999999999994
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (es)
type: jinaai/mintakaqa
config: es
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 31.889
- type: ndcg_at_3
value: 45.182
- type: ndcg_at_5
value: 48.475
- type: ndcg_at_10
value: 51.402
- type: ndcg_at_20
value: 53.089
- type: ndcg_at_100
value: 55.116
- type: ndcg_at_1000
value: 56.333999999999996
- type: recall_at_1
value: 31.889
- type: recall_at_3
value: 54.455
- type: recall_at_5
value: 62.417
- type: recall_at_10
value: 71.328
- type: recall_at_20
value: 77.97
- type: recall_at_100
value: 88.944
- type: recall_at_1000
value: 98.639
- type: main_score
value: 51.402
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (fr)
type: jinaai/mintakaqa
config: fr
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 32.555
- type: ndcg_at_3
value: 45.278
- type: ndcg_at_5
value: 48.559000000000005
- type: ndcg_at_10
value: 51.485
- type: ndcg_at_20
value: 53.263000000000005
- type: ndcg_at_100
value: 55.221
- type: ndcg_at_1000
value: 56.501999999999995
- type: recall_at_1
value: 32.555
- type: recall_at_3
value: 54.054
- type: recall_at_5
value: 62.039
- type: recall_at_10
value: 70.966
- type: recall_at_20
value: 77.969
- type: recall_at_100
value: 88.411
- type: recall_at_1000
value: 98.69
- type: main_score
value: 51.485
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (hi)
type: jinaai/mintakaqa
config: hi
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 24.757
- type: ndcg_at_3
value: 35.427
- type: ndcg_at_5
value: 38.431
- type: ndcg_at_10
value: 41.459
- type: ndcg_at_20
value: 44.137
- type: ndcg_at_100
value: 47.174
- type: ndcg_at_1000
value: 48.907000000000004
- type: recall_at_1
value: 24.757
- type: recall_at_3
value: 43.082
- type: recall_at_5
value: 50.336999999999996
- type: recall_at_10
value: 59.611000000000004
- type: recall_at_20
value: 70.157
- type: recall_at_100
value: 86.387
- type: recall_at_1000
value: 100.0
- type: main_score
value: 41.459
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (it)
type: jinaai/mintakaqa
config: it
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 32.818000000000005
- type: ndcg_at_3
value: 46.503
- type: ndcg_at_5
value: 49.68
- type: ndcg_at_10
value: 52.510999999999996
- type: ndcg_at_20
value: 54.269999999999996
- type: ndcg_at_100
value: 56.17100000000001
- type: ndcg_at_1000
value: 57.38100000000001
- type: recall_at_1
value: 32.818000000000005
- type: recall_at_3
value: 56.033
- type: recall_at_5
value: 63.715999999999994
- type: recall_at_10
value: 72.48400000000001
- type: recall_at_20
value: 79.374
- type: recall_at_100
value: 89.436
- type: recall_at_1000
value: 98.914
- type: main_score
value: 52.510999999999996
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (ja)
type: jinaai/mintakaqa
config: ja
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 25.216
- type: ndcg_at_3
value: 35.982
- type: ndcg_at_5
value: 38.694
- type: ndcg_at_10
value: 41.585
- type: ndcg_at_20
value: 43.334
- type: ndcg_at_100
value: 45.831
- type: ndcg_at_1000
value: 48.06
- type: recall_at_1
value: 25.216
- type: recall_at_3
value: 43.599
- type: recall_at_5
value: 50.173
- type: recall_at_10
value: 59.083
- type: recall_at_20
value: 65.96
- type: recall_at_100
value: 79.542
- type: recall_at_1000
value: 97.794
- type: main_score
value: 41.585
- task:
type: Retrieval
dataset:
name: MTEB MintakaRetrieval (pt)
type: jinaai/mintakaqa
config: pt
split: test
revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e
metrics:
- type: ndcg_at_1
value: 33.517
- type: ndcg_at_3
value: 46.955999999999996
- type: ndcg_at_5
value: 50.441
- type: ndcg_at_10
value: 53.256
- type: ndcg_at_20
value: 55.086
- type: ndcg_at_100
value: 57.104
- type: ndcg_at_1000
value: 58.07600000000001
- type: recall_at_1
value: 33.517
- type: recall_at_3
value: 56.245
- type: recall_at_5
value: 64.63499999999999
- type: recall_at_10
value: 73.258
- type: recall_at_20
value: 80.47999999999999
- type: recall_at_100
value: 91.27
- type: recall_at_1000
value: 99.10799999999999
- type: main_score
value: 53.256
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus (default)
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: ndcg_at_1
value: 51.702999999999996
- type: ndcg_at_3
value: 48.064
- type: ndcg_at_5
value: 46.379
- type: ndcg_at_10
value: 43.663999999999994
- type: ndcg_at_20
value: 41.407
- type: ndcg_at_100
value: 42.083
- type: ndcg_at_1000
value: 52.335
- type: recall_at_1
value: 6.241
- type: recall_at_3
value: 12.214
- type: recall_at_5
value: 16.473
- type: recall_at_10
value: 21.84
- type: recall_at_20
value: 27.474999999999998
- type: recall_at_100
value: 45.01
- type: recall_at_1000
value: 80.71300000000001
- type: main_score
value: 43.663999999999994
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus-PL (default)
type: clarin-knext/nfcorpus-pl
config: default
split: test
revision: 9a6f9567fda928260afed2de480d79c98bf0bec0
metrics:
- type: ndcg_at_1
value: 41.641
- type: ndcg_at_3
value: 37.617
- type: ndcg_at_5
value: 36.024
- type: ndcg_at_10
value: 33.51
- type: ndcg_at_20
value: 31.575999999999997
- type: ndcg_at_100
value: 31.601000000000003
- type: ndcg_at_1000
value: 41.099000000000004
- type: recall_at_1
value: 4.61
- type: recall_at_3
value: 9.366
- type: recall_at_5
value: 11.793
- type: recall_at_10
value: 16.255
- type: recall_at_20
value: 20.713
- type: recall_at_100
value: 33.396
- type: recall_at_1000
value: 65.532
- type: main_score
value: 33.51
- task:
type: Retrieval
dataset:
name: MTEB NQ (default)
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: ndcg_at_1
value: 50.753
- type: ndcg_at_3
value: 62.541000000000004
- type: ndcg_at_5
value: 66.46600000000001
- type: ndcg_at_10
value: 69.65400000000001
- type: ndcg_at_20
value: 70.91499999999999
- type: ndcg_at_100
value: 71.908
- type: ndcg_at_1000
value: 72.08200000000001
- type: recall_at_1
value: 45.293
- type: recall_at_3
value: 71.089
- type: recall_at_5
value: 79.93
- type: recall_at_10
value: 89.01599999999999
- type: recall_at_20
value: 93.60300000000001
- type: recall_at_100
value: 98.501
- type: recall_at_1000
value: 99.768
- type: main_score
value: 69.65400000000001
- task:
type: Retrieval
dataset:
name: MTEB NQ-PL (default)
type: clarin-knext/nq-pl
config: default
split: test
revision: f171245712cf85dd4700b06bef18001578d0ca8d
metrics:
- type: ndcg_at_1
value: 34.791
- type: ndcg_at_3
value: 45.418
- type: ndcg_at_5
value: 49.486000000000004
- type: ndcg_at_10
value: 53.141000000000005
- type: ndcg_at_20
value: 55.230999999999995
- type: ndcg_at_100
value: 57.358
- type: ndcg_at_1000
value: 58.166
- type: recall_at_1
value: 31.04
- type: recall_at_3
value: 53.179
- type: recall_at_5
value: 62.539
- type: recall_at_10
value: 73.08099999999999
- type: recall_at_20
value: 80.83500000000001
- type: recall_at_100
value: 91.503
- type: recall_at_1000
value: 97.429
- type: main_score
value: 53.141000000000005
- task:
type: Retrieval
dataset:
name: MTEB Quora-PL (default)
type: clarin-knext/quora-pl
config: default
split: validation
revision: 0be27e93455051e531182b85e85e425aba12e9d4
metrics:
- type: ndcg_at_1
value: 76.99000000000001
- type: ndcg_at_3
value: 81.781
- type: ndcg_at_5
value: 83.627
- type: ndcg_at_10
value: 85.146
- type: ndcg_at_20
value: 86.015
- type: ndcg_at_100
value: 86.745
- type: ndcg_at_1000
value: 86.882
- type: recall_at_1
value: 66.806
- type: recall_at_3
value: 84.09400000000001
- type: recall_at_5
value: 89.09899999999999
- type: recall_at_10
value: 93.512
- type: recall_at_20
value: 96.365
- type: recall_at_100
value: 99.22
- type: recall_at_1000
value: 99.937
- type: main_score
value: 85.146
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval (default)
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: ndcg_at_1
value: 83.66
- type: ndcg_at_3
value: 87.863
- type: ndcg_at_5
value: 89.279
- type: ndcg_at_10
value: 90.372
- type: ndcg_at_20
value: 90.955
- type: ndcg_at_100
value: 91.352
- type: ndcg_at_1000
value: 91.39500000000001
- type: recall_at_1
value: 72.75399999999999
- type: recall_at_3
value: 89.41799999999999
- type: recall_at_5
value: 93.509
- type: recall_at_10
value: 96.679
- type: recall_at_20
value: 98.519
- type: recall_at_100
value: 99.845
- type: recall_at_1000
value: 99.998
- type: main_score
value: 90.372
- task:
type: Retrieval
dataset:
name: MTEB RiaNewsRetrieval (default)
type: ai-forever/ria-news-retrieval
config: default
split: test
revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7
metrics:
- type: ndcg_at_1
value: 75.41
- type: ndcg_at_3
value: 83.13000000000001
- type: ndcg_at_5
value: 84.313
- type: ndcg_at_10
value: 85.009
- type: ndcg_at_20
value: 85.436
- type: ndcg_at_100
value: 85.875
- type: ndcg_at_1000
value: 86.048
- type: recall_at_1
value: 75.41
- type: recall_at_3
value: 88.38000000000001
- type: recall_at_5
value: 91.23
- type: recall_at_10
value: 93.34
- type: recall_at_20
value: 95.02000000000001
- type: recall_at_100
value: 97.37
- type: recall_at_1000
value: 98.78
- type: main_score
value: 85.009
- task:
type: Retrieval
dataset:
name: MTEB RuBQRetrieval (default)
type: ai-forever/rubq-retrieval
config: default
split: test
revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b
metrics:
- type: ndcg_at_1
value: 63.652
- type: ndcg_at_3
value: 67.829
- type: ndcg_at_5
value: 72.141
- type: ndcg_at_10
value: 75.551
- type: ndcg_at_20
value: 76.925
- type: ndcg_at_100
value: 77.813
- type: ndcg_at_1000
value: 77.994
- type: recall_at_1
value: 45.09
- type: recall_at_3
value: 71.562
- type: recall_at_5
value: 81.474
- type: recall_at_10
value: 90.237
- type: recall_at_20
value: 94.679
- type: recall_at_100
value: 98.752
- type: recall_at_1000
value: 99.83999999999999
- type: main_score
value: 75.551
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS (default)
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: ndcg_at_1
value: 33.7
- type: ndcg_at_3
value: 28.360999999999997
- type: ndcg_at_5
value: 25.259999999999998
- type: ndcg_at_10
value: 30.775999999999996
- type: ndcg_at_20
value: 34.782000000000004
- type: ndcg_at_100
value: 41.753
- type: ndcg_at_1000
value: 46.887
- type: recall_at_1
value: 6.843000000000001
- type: recall_at_3
value: 16.228
- type: recall_at_5
value: 22.828
- type: recall_at_10
value: 33.007
- type: recall_at_20
value: 42.433
- type: recall_at_100
value: 64.967
- type: recall_at_1000
value: 89.587
- type: main_score
value: 30.775999999999996
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS-PL (default)
type: clarin-knext/scidocs-pl
config: default
split: test
revision: 45452b03f05560207ef19149545f168e596c9337
metrics:
- type: ndcg_at_1
value: 26.5
- type: ndcg_at_3
value: 21.079
- type: ndcg_at_5
value: 18.63
- type: ndcg_at_10
value: 22.483
- type: ndcg_at_20
value: 25.552999999999997
- type: ndcg_at_100
value: 31.572
- type: ndcg_at_1000
value: 37.147000000000006
- type: recall_at_1
value: 5.367999999999999
- type: recall_at_3
value: 11.907
- type: recall_at_5
value: 16.631999999999998
- type: recall_at_10
value: 23.647000000000002
- type: recall_at_20
value: 30.857
- type: recall_at_100
value: 50.236999999999995
- type: recall_at_1000
value: 77.445
- type: main_score
value: 22.483
- task:
type: Retrieval
dataset:
name: MTEB SciFact (default)
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: ndcg_at_1
value: 74.333
- type: ndcg_at_3
value: 82.071
- type: ndcg_at_5
value: 83.83800000000001
- type: ndcg_at_10
value: 85.399
- type: ndcg_at_20
value: 85.57900000000001
- type: ndcg_at_100
value: 86.075
- type: ndcg_at_1000
value: 86.164
- type: recall_at_1
value: 70.994
- type: recall_at_3
value: 87.417
- type: recall_at_5
value: 91.89399999999999
- type: recall_at_10
value: 96.167
- type: recall_at_20
value: 96.833
- type: recall_at_100
value: 99.333
- type: recall_at_1000
value: 100.0
- type: main_score
value: 85.399
- task:
type: Retrieval
dataset:
name: MTEB SciFact-PL (default)
type: clarin-knext/scifact-pl
config: default
split: test
revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e
metrics:
- type: ndcg_at_1
value: 65.333
- type: ndcg_at_3
value: 73.291
- type: ndcg_at_5
value: 75.149
- type: ndcg_at_10
value: 77.633
- type: ndcg_at_20
value: 78.236
- type: ndcg_at_100
value: 79.182
- type: ndcg_at_1000
value: 79.431
- type: recall_at_1
value: 61.99400000000001
- type: recall_at_3
value: 79.01700000000001
- type: recall_at_5
value: 83.72800000000001
- type: recall_at_10
value: 90.72200000000001
- type: recall_at_20
value: 93.0
- type: recall_at_100
value: 98.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 77.633
- task:
type: Retrieval
dataset:
name: MTEB StackOverflowQA (default)
type: CoIR-Retrieval/stackoverflow-qa
config: default
split: test
revision: db8f169f3894c14a00251061f957b2063eef2bd5
metrics:
- type: ndcg_at_1
value: 90.07
- type: ndcg_at_3
value: 93.30199999999999
- type: ndcg_at_5
value: 93.812
- type: ndcg_at_10
value: 94.219
- type: ndcg_at_20
value: 94.46799999999999
- type: ndcg_at_100
value: 94.581
- type: ndcg_at_1000
value: 94.626
- type: recall_at_1
value: 90.07
- type: recall_at_3
value: 95.537
- type: recall_at_5
value: 96.78999999999999
- type: recall_at_10
value: 98.044
- type: recall_at_20
value: 99.047
- type: recall_at_100
value: 99.649
- type: recall_at_1000
value: 100.0
- type: main_score
value: 94.219
- task:
type: Retrieval
dataset:
name: MTEB SyntecRetrieval (default)
type: lyon-nlp/mteb-fr-retrieval-syntec-s2p
config: default
split: test
revision: 19661ccdca4dfc2d15122d776b61685f48c68ca9
metrics:
- type: ndcg_at_1
value: 83.0
- type: ndcg_at_3
value: 90.809
- type: ndcg_at_5
value: 91.583
- type: ndcg_at_10
value: 92.199
- type: ndcg_at_20
value: 92.199
- type: ndcg_at_100
value: 92.199
- type: ndcg_at_1000
value: 92.199
- type: recall_at_1
value: 83.0
- type: recall_at_3
value: 96.0
- type: recall_at_5
value: 98.0
- type: recall_at_10
value: 100.0
- type: recall_at_20
value: 100.0
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 92.199
- task:
type: Retrieval
dataset:
name: MTEB SyntheticText2SQL (default)
type: CoIR-Retrieval/synthetic-text2sql
config: default
split: test
revision: 686b87296c3a0191b5d9415a00526c62db9fce09
metrics:
- type: ndcg_at_1
value: 20.526
- type: ndcg_at_3
value: 60.12
- type: ndcg_at_5
value: 62.134
- type: ndcg_at_10
value: 63.50599999999999
- type: ndcg_at_20
value: 64.167
- type: ndcg_at_100
value: 64.687
- type: ndcg_at_1000
value: 64.801
- type: recall_at_1
value: 20.526
- type: recall_at_3
value: 84.721
- type: recall_at_5
value: 89.574
- type: recall_at_10
value: 93.762
- type: recall_at_20
value: 96.36
- type: recall_at_100
value: 99.09400000000001
- type: recall_at_1000
value: 99.966
- type: main_score
value: 63.50599999999999
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID (default)
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: ndcg_at_1
value: 76.0
- type: ndcg_at_3
value: 78.899
- type: ndcg_at_5
value: 78.212
- type: ndcg_at_10
value: 75.09700000000001
- type: ndcg_at_20
value: 72.158
- type: ndcg_at_100
value: 58.465999999999994
- type: ndcg_at_1000
value: 53.702000000000005
- type: recall_at_1
value: 0.231
- type: recall_at_3
value: 0.7000000000000001
- type: recall_at_5
value: 1.146
- type: recall_at_10
value: 2.174
- type: recall_at_20
value: 4.031
- type: recall_at_100
value: 14.713999999999999
- type: recall_at_1000
value: 50.8
- type: main_score
value: 75.09700000000001
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID-PL (default)
type: clarin-knext/trec-covid-pl
config: default
split: test
revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd
metrics:
- type: ndcg_at_1
value: 75.0
- type: ndcg_at_3
value: 75.531
- type: ndcg_at_5
value: 75.327
- type: ndcg_at_10
value: 74.28
- type: ndcg_at_20
value: 71.5
- type: ndcg_at_100
value: 58.412
- type: ndcg_at_1000
value: 52.580000000000005
- type: recall_at_1
value: 0.214
- type: recall_at_3
value: 0.647
- type: recall_at_5
value: 1.083
- type: recall_at_10
value: 2.141
- type: recall_at_20
value: 3.9309999999999996
- type: recall_at_100
value: 14.738999999999999
- type: recall_at_1000
value: 49.494
- type: main_score
value: 74.28
- task:
type: Retrieval
dataset:
name: MTEB Touche2020 (default)
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: ndcg_at_1
value: 20.408
- type: ndcg_at_3
value: 23.368
- type: ndcg_at_5
value: 24.795
- type: ndcg_at_10
value: 24.442
- type: ndcg_at_20
value: 26.712000000000003
- type: ndcg_at_100
value: 38.218999999999994
- type: ndcg_at_1000
value: 50.395
- type: recall_at_1
value: 2.414
- type: recall_at_3
value: 6.3549999999999995
- type: recall_at_5
value: 9.888
- type: recall_at_10
value: 16.31
- type: recall_at_20
value: 25.369000000000003
- type: recall_at_100
value: 51.449999999999996
- type: recall_at_1000
value: 88.532
- type: main_score
value: 24.442
- task:
type: Retrieval
dataset:
name: MTEB ARCChallenge (default)
type: RAR-b/ARC-Challenge
config: default
split: test
revision: c481e0da3dcbbad8bce7721dea9085b74320a0a3
metrics:
- type: ndcg_at_1
value: 8.959
- type: ndcg_at_3
value: 16.238
- type: ndcg_at_5
value: 18.841
- type: ndcg_at_10
value: 21.606
- type: ndcg_at_20
value: 24.326
- type: ndcg_at_100
value: 28.410999999999998
- type: ndcg_at_1000
value: 31.279
- type: recall_at_1
value: 8.959
- type: recall_at_3
value: 21.416
- type: recall_at_5
value: 27.73
- type: recall_at_10
value: 36.348
- type: recall_at_20
value: 47.184
- type: recall_at_100
value: 69.539
- type: recall_at_1000
value: 92.747
- type: main_score
value: 21.606
- task:
type: Retrieval
dataset:
name: MTEB AlphaNLI (default)
type: RAR-b/alphanli
config: default
split: test
revision: 303f40ef3d50918d3dc43577d33f2f7344ad72c1
metrics:
- type: ndcg_at_1
value: 29.047
- type: ndcg_at_3
value: 37.782
- type: ndcg_at_5
value: 39.989999999999995
- type: ndcg_at_10
value: 41.926
- type: ndcg_at_20
value: 43.573
- type: ndcg_at_100
value: 45.957
- type: ndcg_at_1000
value: 47.799
- type: recall_at_1
value: 29.047
- type: recall_at_3
value: 43.799
- type: recall_at_5
value: 49.151
- type: recall_at_10
value: 55.222
- type: recall_at_20
value: 61.748999999999995
- type: recall_at_100
value: 74.543
- type: recall_at_1000
value: 89.491
- type: main_score
value: 41.926
- task:
type: Retrieval
dataset:
name: MTEB BSARDRetrieval (default)
type: maastrichtlawtech/bsard
config: default
split: test
revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59
metrics:
- type: ndcg_at_1
value: 15.315000000000001
- type: ndcg_at_3
value: 22.742
- type: ndcg_at_5
value: 25.146
- type: ndcg_at_10
value: 28.993000000000002
- type: ndcg_at_20
value: 30.797
- type: ndcg_at_100
value: 34.189
- type: ndcg_at_1000
value: 36.507
- type: recall_at_1
value: 15.315000000000001
- type: recall_at_3
value: 27.927999999999997
- type: recall_at_5
value: 33.784
- type: recall_at_10
value: 45.495000000000005
- type: recall_at_20
value: 52.703
- type: recall_at_100
value: 71.622
- type: recall_at_1000
value: 90.54100000000001
- type: main_score
value: 71.622
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER (default)
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: ndcg_at_1
value: 38.111
- type: ndcg_at_3
value: 34.489999999999995
- type: ndcg_at_5
value: 36.986999999999995
- type: ndcg_at_10
value: 41.825
- type: ndcg_at_20
value: 45.326
- type: ndcg_at_100
value: 50.207
- type: ndcg_at_1000
value: 52.686
- type: recall_at_1
value: 16.898
- type: recall_at_3
value: 31.636999999999997
- type: recall_at_5
value: 39.147
- type: recall_at_10
value: 49.787
- type: recall_at_20
value: 59.41499999999999
- type: recall_at_100
value: 77.506
- type: recall_at_1000
value: 90.803
- type: main_score
value: 41.825
- task:
type: Retrieval
dataset:
name: MTEB DBPedia-PL (default)
type: clarin-knext/dbpedia-pl
config: default
split: test
revision: 76afe41d9af165cc40999fcaa92312b8b012064a
metrics:
- type: ndcg_at_1
value: 50.875
- type: ndcg_at_3
value: 43.745
- type: ndcg_at_5
value: 42.186
- type: ndcg_at_10
value: 40.506
- type: ndcg_at_20
value: 40.372
- type: ndcg_at_100
value: 45.967
- type: ndcg_at_1000
value: 53.247
- type: recall_at_1
value: 8.14
- type: recall_at_3
value: 14.038
- type: recall_at_5
value: 18.394
- type: recall_at_10
value: 24.476
- type: recall_at_20
value: 32.141999999999996
- type: recall_at_100
value: 53.027
- type: recall_at_1000
value: 76.108
- type: main_score
value: 40.506
- task:
type: Retrieval
dataset:
name: MTEB FEVER (default)
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: ndcg_at_1
value: 91.899
- type: ndcg_at_3
value: 93.267
- type: ndcg_at_5
value: 93.757
- type: ndcg_at_10
value: 94.146
- type: ndcg_at_20
value: 94.42399999999999
- type: ndcg_at_100
value: 94.647
- type: ndcg_at_1000
value: 94.765
- type: recall_at_1
value: 85.329
- type: recall_at_3
value: 94.89
- type: recall_at_5
value: 96.185
- type: recall_at_10
value: 97.234
- type: recall_at_20
value: 98.059
- type: recall_at_100
value: 98.946
- type: recall_at_1000
value: 99.605
- type: main_score
value: 94.146
- task:
type: Retrieval
dataset:
name: MTEB GermanDPR (default)
type: deepset/germandpr
config: default
split: test
revision: 5129d02422a66be600ac89cd3e8531b4f97d347d
metrics:
- type: ndcg_at_1
value: 67.415
- type: ndcg_at_3
value: 81.684
- type: ndcg_at_5
value: 83.829
- type: ndcg_at_10
value: 84.624
- type: ndcg_at_20
value: 84.77900000000001
- type: ndcg_at_100
value: 84.832
- type: ndcg_at_1000
value: 84.832
- type: recall_at_1
value: 67.415
- type: recall_at_3
value: 91.61
- type: recall_at_5
value: 96.78
- type: recall_at_10
value: 99.122
- type: recall_at_20
value: 99.70700000000001
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 84.624
- task:
type: Retrieval
dataset:
name: MTEB GermanQuAD-Retrieval (default)
type: mteb/germanquad-retrieval
config: default
split: test
revision: f5c87ae5a2e7a5106606314eef45255f03151bb3
metrics:
- type: ndcg_at_1
value: 92.967
- type: ndcg_at_3
value: 96.289
- type: ndcg_at_5
value: 96.626
- type: ndcg_at_10
value: 96.68900000000001
- type: ndcg_at_20
value: 96.767
- type: ndcg_at_100
value: 96.812
- type: ndcg_at_1000
value: 96.812
- type: recall_at_1
value: 92.967
- type: recall_at_3
value: 98.457
- type: recall_at_5
value: 99.274
- type: recall_at_10
value: 99.456
- type: recall_at_20
value: 99.773
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 95.7191
- task:
type: Retrieval
dataset:
name: MTEB HellaSwag (default)
type: RAR-b/hellaswag
config: default
split: test
revision: a5c990205e017d10761197ccab3000936689c3ae
metrics:
- type: ndcg_at_1
value: 24.139
- type: ndcg_at_3
value: 34.455999999999996
- type: ndcg_at_5
value: 37.217
- type: ndcg_at_10
value: 39.655
- type: ndcg_at_20
value: 41.177
- type: ndcg_at_100
value: 43.695
- type: ndcg_at_1000
value: 45.528
- type: recall_at_1
value: 24.139
- type: recall_at_3
value: 41.894
- type: recall_at_5
value: 48.565999999999995
- type: recall_at_10
value: 56.065
- type: recall_at_20
value: 62.07899999999999
- type: recall_at_100
value: 75.812
- type: recall_at_1000
value: 90.5
- type: main_score
value: 39.655
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA-PL (default)
type: clarin-knext/hotpotqa-pl
config: default
split: test
revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907
metrics:
- type: ndcg_at_1
value: 81.796
- type: ndcg_at_3
value: 68.66499999999999
- type: ndcg_at_5
value: 71.364
- type: ndcg_at_10
value: 73.414
- type: ndcg_at_20
value: 74.634
- type: ndcg_at_100
value: 76.276
- type: ndcg_at_1000
value: 77.34299999999999
- type: recall_at_1
value: 40.898
- type: recall_at_3
value: 66.009
- type: recall_at_5
value: 71.317
- type: recall_at_10
value: 76.435
- type: recall_at_20
value: 80.35799999999999
- type: recall_at_100
value: 87.54899999999999
- type: recall_at_1000
value: 94.537
- type: main_score
value: 73.414
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO (default)
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: ndcg_at_1
value: 23.854
- type: ndcg_at_3
value: 35.573
- type: ndcg_at_5
value: 39.96
- type: ndcg_at_10
value: 44.064
- type: ndcg_at_20
value: 46.572
- type: ndcg_at_100
value: 49.492000000000004
- type: ndcg_at_1000
value: 50.43
- type: recall_at_1
value: 23.202
- type: recall_at_3
value: 44.092999999999996
- type: recall_at_5
value: 54.6
- type: recall_at_10
value: 67.11399999999999
- type: recall_at_20
value: 76.79899999999999
- type: recall_at_100
value: 92.085
- type: recall_at_1000
value: 99.122
- type: main_score
value: 44.064
- task:
type: Retrieval
dataset:
name: MTEB PIQA (default)
type: RAR-b/piqa
config: default
split: test
revision: bb30be7e9184e6b6b1d99bbfe1bb90a3a81842e6
metrics:
- type: ndcg_at_1
value: 26.387
- type: ndcg_at_3
value: 36.972
- type: ndcg_at_5
value: 39.534000000000006
- type: ndcg_at_10
value: 42.443
- type: ndcg_at_20
value: 44.36
- type: ndcg_at_100
value: 46.575
- type: ndcg_at_1000
value: 48.024
- type: recall_at_1
value: 26.387
- type: recall_at_3
value: 44.45
- type: recall_at_5
value: 50.598
- type: recall_at_10
value: 59.57599999999999
- type: recall_at_20
value: 67.13799999999999
- type: recall_at_100
value: 79.217
- type: recall_at_1000
value: 91.023
- type: main_score
value: 42.443
- task:
type: Retrieval
dataset:
name: MTEB Quail (default)
type: RAR-b/quail
config: default
split: test
revision: 1851bc536f8bdab29e03e29191c4586b1d8d7c5a
metrics:
- type: ndcg_at_1
value: 7.242999999999999
- type: ndcg_at_3
value: 11.727
- type: ndcg_at_5
value: 13.69
- type: ndcg_at_10
value: 16.186
- type: ndcg_at_20
value: 17.988
- type: ndcg_at_100
value: 20.926000000000002
- type: ndcg_at_1000
value: 23.980999999999998
- type: recall_at_1
value: 7.242999999999999
- type: recall_at_3
value: 15.037
- type: recall_at_5
value: 19.853
- type: recall_at_10
value: 27.573999999999998
- type: recall_at_20
value: 34.669
- type: recall_at_100
value: 50.662
- type: recall_at_1000
value: 75.735
- type: main_score
value: 16.186
- task:
type: Retrieval
dataset:
name: MTEB RARbCode (default)
type: RAR-b/humanevalpack-mbpp-pooled
config: default
split: test
revision: 25f7d11a7ac12dcbb8d3836eb2de682b98c825e4
metrics:
- type: ndcg_at_1
value: 75.40400000000001
- type: ndcg_at_3
value: 84.796
- type: ndcg_at_5
value: 86.68599999999999
- type: ndcg_at_10
value: 87.63499999999999
- type: ndcg_at_20
value: 87.813
- type: ndcg_at_100
value: 87.912
- type: ndcg_at_1000
value: 87.938
- type: recall_at_1
value: 75.40400000000001
- type: recall_at_3
value: 91.24
- type: recall_at_5
value: 95.822
- type: recall_at_10
value: 98.585
- type: recall_at_20
value: 99.259
- type: recall_at_100
value: 99.798
- type: recall_at_1000
value: 100.0
- type: main_score
value: 87.63499999999999
- task:
type: Retrieval
dataset:
name: MTEB RARbMath (default)
type: RAR-b/math-pooled
config: default
split: test
revision: 2393603c0221ff52f448d12dd75f0856103c6cca
metrics:
- type: ndcg_at_1
value: 90.869
- type: ndcg_at_3
value: 92.971
- type: ndcg_at_5
value: 93.365
- type: ndcg_at_10
value: 93.75099999999999
- type: ndcg_at_20
value: 94.05799999999999
- type: ndcg_at_100
value: 94.426
- type: ndcg_at_1000
value: 94.46600000000001
- type: recall_at_1
value: 90.869
- type: recall_at_3
value: 94.414
- type: recall_at_5
value: 95.363
- type: recall_at_10
value: 96.55
- type: recall_at_20
value: 97.753
- type: recall_at_100
value: 99.699
- type: recall_at_1000
value: 100.0
- type: main_score
value: 93.75099999999999
- task:
type: Retrieval
dataset:
name: MTEB SIQA (default)
type: RAR-b/siqa
config: default
split: test
revision: 4ed8415e9dc24060deefc84be59e2db0aacbadcc
metrics:
- type: ndcg_at_1
value: 2.661
- type: ndcg_at_3
value: 4.207000000000001
- type: ndcg_at_5
value: 4.577
- type: ndcg_at_10
value: 5.219
- type: ndcg_at_20
value: 5.917
- type: ndcg_at_100
value: 7.9670000000000005
- type: ndcg_at_1000
value: 11.527999999999999
- type: recall_at_1
value: 2.661
- type: recall_at_3
value: 5.271
- type: recall_at_5
value: 6.192
- type: recall_at_10
value: 8.187999999999999
- type: recall_at_20
value: 10.952
- type: recall_at_100
value: 22.262
- type: recall_at_1000
value: 52.098
- type: main_score
value: 5.219
- task:
type: Retrieval
dataset:
name: MTEB SpartQA (default)
type: RAR-b/spartqa
config: default
split: test
revision: 9ab3ca3ccdd0d43f9cd6d346a363935d127f4f45
metrics:
- type: ndcg_at_1
value: 1.252
- type: ndcg_at_3
value: 3.644
- type: ndcg_at_5
value: 5.27
- type: ndcg_at_10
value: 7.768
- type: ndcg_at_20
value: 10.181
- type: ndcg_at_100
value: 14.29
- type: ndcg_at_1000
value: 18.417
- type: recall_at_1
value: 0.788
- type: recall_at_3
value: 5.157
- type: recall_at_5
value: 8.728
- type: recall_at_10
value: 15.786
- type: recall_at_20
value: 24.365000000000002
- type: recall_at_100
value: 43.553999999999995
- type: recall_at_1000
value: 73.66
- type: main_score
value: 7.768
- task:
type: Retrieval
dataset:
name: MTEB TempReasonL1 (default)
type: RAR-b/TempReason-l1
config: default
split: test
revision: 9097e99aa8c9d827189c65f2e11bfe756af439f6
metrics:
- type: ndcg_at_1
value: 0.1
- type: ndcg_at_3
value: 0.716
- type: ndcg_at_5
value: 1.095
- type: ndcg_at_10
value: 1.6889999999999998
- type: ndcg_at_20
value: 2.374
- type: ndcg_at_100
value: 4.125
- type: ndcg_at_1000
value: 9.126
- type: recall_at_1
value: 0.1
- type: recall_at_3
value: 1.175
- type: recall_at_5
value: 2.1
- type: recall_at_10
value: 3.975
- type: recall_at_20
value: 6.675000000000001
- type: recall_at_100
value: 16.575
- type: recall_at_1000
value: 59.275
- type: main_score
value: 1.6889999999999998
- task:
type: Retrieval
dataset:
name: MTEB TempReasonL2Fact (default)
type: RAR-b/TempReason-l2-fact
config: default
split: test
revision: 13758bcf978613b249d0de4d0840f57815122bdf
metrics:
- type: ndcg_at_1
value: 28.942
- type: ndcg_at_3
value: 45.412
- type: ndcg_at_5
value: 50.43299999999999
- type: ndcg_at_10
value: 53.976
- type: ndcg_at_20
value: 55.703
- type: ndcg_at_100
value: 57.445
- type: ndcg_at_1000
value: 57.838
- type: recall_at_1
value: 28.942
- type: recall_at_3
value: 57.495
- type: recall_at_5
value: 69.631
- type: recall_at_10
value: 80.452
- type: recall_at_20
value: 87.252
- type: recall_at_100
value: 96.44200000000001
- type: recall_at_1000
value: 99.518
- type: main_score
value: 53.976
- task:
type: Retrieval
dataset:
name: MTEB TempReasonL2Pure (default)
type: RAR-b/TempReason-l2-pure
config: default
split: test
revision: 27668949b97bfb178901e0cf047cbee805305dc1
metrics:
- type: ndcg_at_1
value: 2.001
- type: ndcg_at_3
value: 3.746
- type: ndcg_at_5
value: 4.665
- type: ndcg_at_10
value: 5.972
- type: ndcg_at_20
value: 7.321999999999999
- type: ndcg_at_100
value: 11.068
- type: ndcg_at_1000
value: 15.675
- type: recall_at_1
value: 2.001
- type: recall_at_3
value: 5.04
- type: recall_at_5
value: 7.3
- type: recall_at_10
value: 11.34
- type: recall_at_20
value: 16.713
- type: recall_at_100
value: 37.576
- type: recall_at_1000
value: 75.394
- type: main_score
value: 5.972
- task:
type: Retrieval
dataset:
name: MTEB TempReasonL3Fact (default)
type: RAR-b/TempReason-l3-fact
config: default
split: test
revision: 4b70e90197901da24f3cfcd51d27111292878680
metrics:
- type: ndcg_at_1
value: 19.114
- type: ndcg_at_3
value: 34.72
- type: ndcg_at_5
value: 40.509
- type: ndcg_at_10
value: 44.894
- type: ndcg_at_20
value: 47.021
- type: ndcg_at_100
value: 49.162
- type: ndcg_at_1000
value: 49.833
- type: recall_at_1
value: 19.114
- type: recall_at_3
value: 46.385
- type: recall_at_5
value: 60.438
- type: recall_at_10
value: 73.882
- type: recall_at_20
value: 82.219
- type: recall_at_100
value: 93.47
- type: recall_at_1000
value: 98.735
- type: main_score
value: 44.894
- task:
type: Retrieval
dataset:
name: MTEB TempReasonL3Pure (default)
type: RAR-b/TempReason-l3-pure
config: default
split: test
revision: 68fba138e7e63daccecfbdad0a9d2714e56e34ff
metrics:
- type: ndcg_at_1
value: 0.836
- type: ndcg_at_3
value: 5.319
- type: ndcg_at_5
value: 7.468
- type: ndcg_at_10
value: 10.282
- type: ndcg_at_20
value: 12.457
- type: ndcg_at_100
value: 16.384
- type: ndcg_at_1000
value: 20.081
- type: recall_at_1
value: 0.836
- type: recall_at_3
value: 8.744
- type: recall_at_5
value: 13.963000000000001
- type: recall_at_10
value: 22.729
- type: recall_at_20
value: 31.338
- type: recall_at_100
value: 52.824000000000005
- type: recall_at_1000
value: 82.784
- type: main_score
value: 10.282
- task:
type: Retrieval
dataset:
name: MTEB WinoGrande (default)
type: RAR-b/winogrande
config: default
split: test
revision: f74c094f321077cf909ddfb8bccc1b5912a4ac28
metrics:
- type: ndcg_at_1
value: 47.908
- type: ndcg_at_3
value: 71.58200000000001
- type: ndcg_at_5
value: 74.265
- type: ndcg_at_10
value: 75.61099999999999
- type: ndcg_at_20
value: 76.07300000000001
- type: ndcg_at_100
value: 76.249
- type: ndcg_at_1000
value: 76.249
- type: recall_at_1
value: 47.908
- type: recall_at_3
value: 86.74
- type: recall_at_5
value: 93.21199999999999
- type: recall_at_10
value: 97.316
- type: recall_at_20
value: 99.132
- type: recall_at_100
value: 100.0
- type: recall_at_1000
value: 100.0
- type: main_score
value: 75.61099999999999
- task:
type: Retrieval
dataset:
name: MTEB XMarket (de)
type: jinaai/xmarket_ml
config: de
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: ndcg_at_1
value: 30.394
- type: ndcg_at_3
value: 30.701
- type: ndcg_at_5
value: 31.574
- type: ndcg_at_10
value: 32.961
- type: ndcg_at_20
value: 34.765
- type: ndcg_at_100
value: 38.772
- type: ndcg_at_1000
value: 43.317
- type: recall_at_1
value: 10.193000000000001
- type: recall_at_3
value: 19.141
- type: recall_at_5
value: 24.362000000000002
- type: recall_at_10
value: 31.995
- type: recall_at_20
value: 40.047
- type: recall_at_100
value: 56.769000000000005
- type: recall_at_1000
value: 76.318
- type: main_score
value: 32.961
- task:
type: Retrieval
dataset:
name: MTEB XMarket (en)
type: jinaai/xmarket_ml
config: en
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: ndcg_at_1
value: 37.652
- type: ndcg_at_3
value: 38.444
- type: ndcg_at_5
value: 39.163
- type: ndcg_at_10
value: 40.557
- type: ndcg_at_20
value: 42.224000000000004
- type: ndcg_at_100
value: 46.817
- type: ndcg_at_1000
value: 51.939
- type: recall_at_1
value: 8.909
- type: recall_at_3
value: 18.673000000000002
- type: recall_at_5
value: 24.364
- type: recall_at_10
value: 32.919
- type: recall_at_20
value: 41.908
- type: recall_at_100
value: 61.663999999999994
- type: recall_at_1000
value: 80.619
- type: main_score
value: 40.557
- task:
type: Retrieval
dataset:
name: MTEB XMarket (es)
type: jinaai/xmarket_ml
config: es
split: test
revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b
metrics:
- type: ndcg_at_1
value: 32.168
- type: ndcg_at_3
value: 32.389
- type: ndcg_at_5
value: 33.054
- type: ndcg_at_10
value: 34.549
- type: ndcg_at_20
value: 36.34
- type: ndcg_at_100
value: 40.324
- type: ndcg_at_1000
value: 44.784
- type: recall_at_1
value: 10.845
- type: recall_at_3
value: 21.058
- type: recall_at_5
value: 26.327
- type: recall_at_10
value: 34.306
- type: recall_at_20
value: 42.46
- type: recall_at_100
value: 59.156
- type: recall_at_1000
value: 78.249
- type: main_score
value: 34.549
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (ara-ara)
type: jinaai/xpqa
config: ara-ara
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 42.0
- type: ndcg_at_3
value: 43.802
- type: ndcg_at_5
value: 46.1
- type: ndcg_at_10
value: 50.858000000000004
- type: ndcg_at_20
value: 54.303999999999995
- type: ndcg_at_100
value: 57.692
- type: ndcg_at_1000
value: 58.97599999999999
- type: recall_at_1
value: 23.989
- type: recall_at_3
value: 42.753
- type: recall_at_5
value: 51.56699999999999
- type: recall_at_10
value: 63.92400000000001
- type: recall_at_20
value: 75.249
- type: recall_at_100
value: 90.851
- type: recall_at_1000
value: 99.733
- type: main_score
value: 50.858000000000004
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-ara)
type: jinaai/xpqa
config: eng-ara
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 31.2
- type: ndcg_at_3
value: 33.296
- type: ndcg_at_5
value: 35.727
- type: ndcg_at_10
value: 39.837
- type: ndcg_at_20
value: 43.354
- type: ndcg_at_100
value: 47.908
- type: ndcg_at_1000
value: 50.187000000000005
- type: recall_at_1
value: 18.007
- type: recall_at_3
value: 32.5
- type: recall_at_5
value: 41.422
- type: recall_at_10
value: 51.673
- type: recall_at_20
value: 63.144
- type: recall_at_100
value: 83.733
- type: recall_at_1000
value: 99.10900000000001
- type: main_score
value: 39.837
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (ara-eng)
type: jinaai/xpqa
config: ara-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 40.431
- type: ndcg_at_3
value: 41.419
- type: ndcg_at_5
value: 44.051
- type: ndcg_at_10
value: 48.94
- type: ndcg_at_20
value: 52.532999999999994
- type: ndcg_at_100
value: 56.203
- type: ndcg_at_1000
value: 57.467999999999996
- type: recall_at_1
value: 22.534000000000002
- type: recall_at_3
value: 40.119
- type: recall_at_5
value: 49.569
- type: recall_at_10
value: 62.156
- type: recall_at_20
value: 74.191
- type: recall_at_100
value: 90.973
- type: recall_at_1000
value: 99.72999999999999
- type: main_score
value: 48.94
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (deu-deu)
type: jinaai/xpqa
config: deu-deu
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 76.50099999999999
- type: ndcg_at_3
value: 79.38199999999999
- type: ndcg_at_5
value: 81.00500000000001
- type: ndcg_at_10
value: 82.786
- type: ndcg_at_20
value: 83.844
- type: ndcg_at_100
value: 84.708
- type: ndcg_at_1000
value: 84.956
- type: recall_at_1
value: 58.464000000000006
- type: recall_at_3
value: 79.963
- type: recall_at_5
value: 85.757
- type: recall_at_10
value: 90.372
- type: recall_at_20
value: 94.13
- type: recall_at_100
value: 98.24000000000001
- type: recall_at_1000
value: 100.0
- type: main_score
value: 82.786
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-deu)
type: jinaai/xpqa
config: eng-deu
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 52.611
- type: ndcg_at_3
value: 55.35099999999999
- type: ndcg_at_5
value: 57.452999999999996
- type: ndcg_at_10
value: 61.553999999999995
- type: ndcg_at_20
value: 63.919000000000004
- type: ndcg_at_100
value: 66.90700000000001
- type: ndcg_at_1000
value: 67.685
- type: recall_at_1
value: 33.47
- type: recall_at_3
value: 55.174
- type: recall_at_5
value: 63.512
- type: recall_at_10
value: 73.934
- type: recall_at_20
value: 81.26400000000001
- type: recall_at_100
value: 94.606
- type: recall_at_1000
value: 100.0
- type: main_score
value: 61.553999999999995
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (deu-eng)
type: jinaai/xpqa
config: deu-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 70.235
- type: ndcg_at_3
value: 74.824
- type: ndcg_at_5
value: 76.47699999999999
- type: ndcg_at_10
value: 78.803
- type: ndcg_at_20
value: 80.19
- type: ndcg_at_100
value: 81.07799999999999
- type: ndcg_at_1000
value: 81.40899999999999
- type: recall_at_1
value: 52.818
- type: recall_at_3
value: 76.754
- type: recall_at_5
value: 82.637
- type: recall_at_10
value: 88.655
- type: recall_at_20
value: 93.61
- type: recall_at_100
value: 97.731
- type: recall_at_1000
value: 100.0
- type: main_score
value: 78.803
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (spa-spa)
type: jinaai/xpqa
config: spa-spa
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 64.18700000000001
- type: ndcg_at_3
value: 62.714999999999996
- type: ndcg_at_5
value: 64.134
- type: ndcg_at_10
value: 68.143
- type: ndcg_at_20
value: 70.625
- type: ndcg_at_100
value: 73.333
- type: ndcg_at_1000
value: 74.02300000000001
- type: recall_at_1
value: 34.400999999999996
- type: recall_at_3
value: 57.654
- type: recall_at_5
value: 67.167
- type: recall_at_10
value: 76.31599999999999
- type: recall_at_20
value: 83.731
- type: recall_at_100
value: 95.502
- type: recall_at_1000
value: 99.58
- type: main_score
value: 68.143
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-spa)
type: jinaai/xpqa
config: eng-spa
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 47.667
- type: ndcg_at_3
value: 46.35
- type: ndcg_at_5
value: 47.879
- type: ndcg_at_10
value: 52.733
- type: ndcg_at_20
value: 55.620000000000005
- type: ndcg_at_100
value: 59.70100000000001
- type: ndcg_at_1000
value: 61.417
- type: recall_at_1
value: 23.394000000000002
- type: recall_at_3
value: 42.264
- type: recall_at_5
value: 51.144999999999996
- type: recall_at_10
value: 62.556
- type: recall_at_20
value: 71.269
- type: recall_at_100
value: 88.668
- type: recall_at_1000
value: 99.466
- type: main_score
value: 52.733
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (spa-eng)
type: jinaai/xpqa
config: spa-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 61.285999999999994
- type: ndcg_at_3
value: 60.303
- type: ndcg_at_5
value: 62.062
- type: ndcg_at_10
value: 66.042
- type: ndcg_at_20
value: 68.509
- type: ndcg_at_100
value: 71.539
- type: ndcg_at_1000
value: 72.258
- type: recall_at_1
value: 32.224000000000004
- type: recall_at_3
value: 55.443
- type: recall_at_5
value: 65.67699999999999
- type: recall_at_10
value: 74.607
- type: recall_at_20
value: 82.234
- type: recall_at_100
value: 95.275
- type: recall_at_1000
value: 99.723
- type: main_score
value: 66.042
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fr)
type: jinaai/xpqa
config: fra-fra
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 71.429
- type: ndcg_at_3
value: 71.13000000000001
- type: ndcg_at_5
value: 72.709
- type: ndcg_at_10
value: 76.236
- type: ndcg_at_20
value: 77.78500000000001
- type: ndcg_at_100
value: 79.634
- type: ndcg_at_1000
value: 79.953
- type: recall_at_1
value: 45.943
- type: recall_at_3
value: 68.293
- type: recall_at_5
value: 76.5
- type: recall_at_10
value: 85.11999999999999
- type: recall_at_20
value: 90.069
- type: recall_at_100
value: 97.82600000000001
- type: recall_at_1000
value: 99.866
- type: main_score
value: 76.236
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-fra)
type: jinaai/xpqa
config: eng-fra
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 47.797
- type: ndcg_at_3
value: 49.514
- type: ndcg_at_5
value: 51.782
- type: ndcg_at_10
value: 55.891000000000005
- type: ndcg_at_20
value: 59.226
- type: ndcg_at_100
value: 62.612
- type: ndcg_at_1000
value: 63.749
- type: recall_at_1
value: 26.689
- type: recall_at_3
value: 47.408
- type: recall_at_5
value: 57.399
- type: recall_at_10
value: 67.147
- type: recall_at_20
value: 77.837
- type: recall_at_100
value: 92.494
- type: recall_at_1000
value: 99.74
- type: main_score
value: 55.891000000000005
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (fra-eng)
type: jinaai/xpqa
config: fra-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 68.625
- type: ndcg_at_3
value: 68.239
- type: ndcg_at_5
value: 70.175
- type: ndcg_at_10
value: 73.452
- type: ndcg_at_20
value: 75.66000000000001
- type: ndcg_at_100
value: 77.506
- type: ndcg_at_1000
value: 77.936
- type: recall_at_1
value: 44.035999999999994
- type: recall_at_3
value: 65.291
- type: recall_at_5
value: 74.37899999999999
- type: recall_at_10
value: 82.15
- type: recall_at_20
value: 89.457
- type: recall_at_100
value: 97.194
- type: recall_at_1000
value: 99.933
- type: main_score
value: 73.452
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (hin-hin)
type: jinaai/xpqa
config: hin-hin
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 66.703
- type: ndcg_at_3
value: 72.993
- type: ndcg_at_5
value: 75.138
- type: ndcg_at_10
value: 77.371
- type: ndcg_at_20
value: 78.389
- type: ndcg_at_100
value: 79.623
- type: ndcg_at_1000
value: 79.975
- type: recall_at_1
value: 57.094
- type: recall_at_3
value: 77.2
- type: recall_at_5
value: 82.50800000000001
- type: recall_at_10
value: 88.486
- type: recall_at_20
value: 91.863
- type: recall_at_100
value: 97.359
- type: recall_at_1000
value: 99.892
- type: main_score
value: 77.371
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-hin)
type: jinaai/xpqa
config: eng-hin
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 29.837999999999997
- type: ndcg_at_3
value: 34.187
- type: ndcg_at_5
value: 37.132
- type: ndcg_at_10
value: 41.357
- type: ndcg_at_20
value: 44.522
- type: ndcg_at_100
value: 49.486999999999995
- type: ndcg_at_1000
value: 51.458000000000006
- type: recall_at_1
value: 24.959999999999997
- type: recall_at_3
value: 36.472
- type: recall_at_5
value: 44.175
- type: recall_at_10
value: 55.371
- type: recall_at_20
value: 65.506
- type: recall_at_100
value: 87.252
- type: recall_at_1000
value: 99.78399999999999
- type: main_score
value: 41.357
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (hin-eng)
type: jinaai/xpqa
config: hin-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 62.829
- type: ndcg_at_3
value: 68.886
- type: ndcg_at_5
value: 71.812
- type: ndcg_at_10
value: 74.405
- type: ndcg_at_20
value: 75.702
- type: ndcg_at_100
value: 77.08500000000001
- type: ndcg_at_1000
value: 77.377
- type: recall_at_1
value: 53.568000000000005
- type: recall_at_3
value: 73.095
- type: recall_at_5
value: 80.211
- type: recall_at_10
value: 87.229
- type: recall_at_20
value: 91.625
- type: recall_at_100
value: 97.844
- type: recall_at_1000
value: 100.0
- type: main_score
value: 74.405
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (ita-ita)
type: jinaai/xpqa
config: ita-ita
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 73.303
- type: ndcg_at_3
value: 74.51299999999999
- type: ndcg_at_5
value: 76.383
- type: ndcg_at_10
value: 78.968
- type: ndcg_at_20
value: 80.331
- type: ndcg_at_100
value: 81.65599999999999
- type: ndcg_at_1000
value: 82.075
- type: recall_at_1
value: 50.68899999999999
- type: recall_at_3
value: 72.763
- type: recall_at_5
value: 80.85
- type: recall_at_10
value: 87.071
- type: recall_at_20
value: 91.62599999999999
- type: recall_at_100
value: 97.333
- type: recall_at_1000
value: 100.0
- type: main_score
value: 78.968
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-ita)
type: jinaai/xpqa
config: eng-ita
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 42.232
- type: ndcg_at_3
value: 46.231
- type: ndcg_at_5
value: 48.197
- type: ndcg_at_10
value: 52.217
- type: ndcg_at_20
value: 55.472
- type: ndcg_at_100
value: 58.803000000000004
- type: ndcg_at_1000
value: 60.321000000000005
- type: recall_at_1
value: 26.368000000000002
- type: recall_at_3
value: 46.709
- type: recall_at_5
value: 54.721
- type: recall_at_10
value: 64.46
- type: recall_at_20
value: 74.997
- type: recall_at_100
value: 89.527
- type: recall_at_1000
value: 99.698
- type: main_score
value: 52.217
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (ita-eng)
type: jinaai/xpqa
config: ita-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 68.326
- type: ndcg_at_3
value: 70.71499999999999
- type: ndcg_at_5
value: 72.748
- type: ndcg_at_10
value: 75.31
- type: ndcg_at_20
value: 76.958
- type: ndcg_at_100
value: 78.66300000000001
- type: ndcg_at_1000
value: 79.089
- type: recall_at_1
value: 46.583999999999996
- type: recall_at_3
value: 69.887
- type: recall_at_5
value: 78.10000000000001
- type: recall_at_10
value: 84.329
- type: recall_at_20
value: 89.51
- type: recall_at_100
value: 97.235
- type: recall_at_1000
value: 100.0
- type: main_score
value: 75.31
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (jpn-jpn)
type: jinaai/xpqa
config: jpn-jpn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 72.0
- type: ndcg_at_3
value: 74.005
- type: ndcg_at_5
value: 75.411
- type: ndcg_at_10
value: 77.12
- type: ndcg_at_20
value: 78.625
- type: ndcg_at_100
value: 80.281
- type: ndcg_at_1000
value: 80.682
- type: recall_at_1
value: 46.988
- type: recall_at_3
value: 72.36200000000001
- type: recall_at_5
value: 79.501
- type: recall_at_10
value: 83.83
- type: recall_at_20
value: 88.907
- type: recall_at_100
value: 96.739
- type: recall_at_1000
value: 99.636
- type: main_score
value: 77.12
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-jpn)
type: jinaai/xpqa
config: eng-jpn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 43.758
- type: ndcg_at_3
value: 45.513999999999996
- type: ndcg_at_5
value: 47.93
- type: ndcg_at_10
value: 51.983999999999995
- type: ndcg_at_20
value: 54.544000000000004
- type: ndcg_at_100
value: 58.022
- type: ndcg_at_1000
value: 59.843
- type: recall_at_1
value: 25.543
- type: recall_at_3
value: 44.374
- type: recall_at_5
value: 53.86300000000001
- type: recall_at_10
value: 63.756
- type: recall_at_20
value: 72.14699999999999
- type: recall_at_100
value: 87.58200000000001
- type: recall_at_1000
value: 99.295
- type: main_score
value: 51.983999999999995
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (jpn-eng)
type: jinaai/xpqa
config: jpn-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 68.978
- type: ndcg_at_3
value: 71.019
- type: ndcg_at_5
value: 72.697
- type: ndcg_at_10
value: 75.267
- type: ndcg_at_20
value: 76.655
- type: ndcg_at_100
value: 78.388
- type: ndcg_at_1000
value: 78.899
- type: recall_at_1
value: 44.958999999999996
- type: recall_at_3
value: 69.56400000000001
- type: recall_at_5
value: 77.082
- type: recall_at_10
value: 83.646
- type: recall_at_20
value: 88.238
- type: recall_at_100
value: 96.194
- type: recall_at_1000
value: 99.818
- type: main_score
value: 75.267
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (kor-kor)
type: jinaai/xpqa
config: kor-kor
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 33.18
- type: ndcg_at_3
value: 35.311
- type: ndcg_at_5
value: 38.366
- type: ndcg_at_10
value: 41.654
- type: ndcg_at_20
value: 44.244
- type: ndcg_at_100
value: 49.001
- type: ndcg_at_1000
value: 51.01
- type: recall_at_1
value: 23.201
- type: recall_at_3
value: 37.011
- type: recall_at_5
value: 44.493
- type: recall_at_10
value: 53.489
- type: recall_at_20
value: 62.548
- type: recall_at_100
value: 85.55
- type: recall_at_1000
value: 100.0
- type: main_score
value: 41.654
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-kor)
type: jinaai/xpqa
config: eng-kor
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 34.404
- type: ndcg_at_3
value: 35.821
- type: ndcg_at_5
value: 37.268
- type: ndcg_at_10
value: 40.967
- type: ndcg_at_20
value: 43.509
- type: ndcg_at_100
value: 49.326
- type: ndcg_at_1000
value: 51.410999999999994
- type: recall_at_1
value: 20.363999999999997
- type: recall_at_3
value: 35.293
- type: recall_at_5
value: 41.251
- type: recall_at_10
value: 50.766999999999996
- type: recall_at_20
value: 59.274
- type: recall_at_100
value: 86.669
- type: recall_at_1000
value: 100.0
- type: main_score
value: 40.967
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (kor-eng)
type: jinaai/xpqa
config: kor-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 33.062000000000005
- type: ndcg_at_3
value: 35.619
- type: ndcg_at_5
value: 37.684
- type: ndcg_at_10
value: 40.986
- type: ndcg_at_20
value: 43.736999999999995
- type: ndcg_at_100
value: 48.632999999999996
- type: ndcg_at_1000
value: 50.78
- type: recall_at_1
value: 23.18
- type: recall_at_3
value: 37.235
- type: recall_at_5
value: 42.448
- type: recall_at_10
value: 51.395
- type: recall_at_20
value: 61.01
- type: recall_at_100
value: 84.382
- type: recall_at_1000
value: 100.0
- type: main_score
value: 40.986
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (pol-pol)
type: jinaai/xpqa
config: pol-pol
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 46.115
- type: ndcg_at_3
value: 45.966
- type: ndcg_at_5
value: 48.119
- type: ndcg_at_10
value: 51.53
- type: ndcg_at_20
value: 54.447
- type: ndcg_at_100
value: 58.939
- type: ndcg_at_1000
value: 60.428000000000004
- type: recall_at_1
value: 27.641
- type: recall_at_3
value: 45.021
- type: recall_at_5
value: 52.580000000000005
- type: recall_at_10
value: 61.141999999999996
- type: recall_at_20
value: 70.588
- type: recall_at_100
value: 90.29700000000001
- type: recall_at_1000
value: 99.851
- type: main_score
value: 51.53
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-pol)
type: jinaai/xpqa
config: eng-pol
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 32.357
- type: ndcg_at_3
value: 31.573
- type: ndcg_at_5
value: 33.046
- type: ndcg_at_10
value: 37.364999999999995
- type: ndcg_at_20
value: 40.407
- type: ndcg_at_100
value: 45.965
- type: ndcg_at_1000
value: 48.982
- type: recall_at_1
value: 14.865999999999998
- type: recall_at_3
value: 28.51
- type: recall_at_5
value: 35.827999999999996
- type: recall_at_10
value: 46.11
- type: recall_at_20
value: 55.498999999999995
- type: recall_at_100
value: 79.73
- type: recall_at_1000
value: 99.236
- type: main_score
value: 37.364999999999995
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (pol-eng)
type: jinaai/xpqa
config: pol-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 43.114999999999995
- type: ndcg_at_3
value: 42.306
- type: ndcg_at_5
value: 44.484
- type: ndcg_at_10
value: 48.374
- type: ndcg_at_20
value: 51.347
- type: ndcg_at_100
value: 56.223
- type: ndcg_at_1000
value: 57.93899999999999
- type: recall_at_1
value: 25.746000000000002
- type: recall_at_3
value: 41.160000000000004
- type: recall_at_5
value: 48.256
- type: recall_at_10
value: 58.038999999999994
- type: recall_at_20
value: 67.499
- type: recall_at_100
value: 88.912
- type: recall_at_1000
value: 99.85000000000001
- type: main_score
value: 48.374
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (por-por)
type: jinaai/xpqa
config: por-por
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 47.25
- type: ndcg_at_3
value: 46.225
- type: ndcg_at_5
value: 47.813
- type: ndcg_at_10
value: 51.383
- type: ndcg_at_20
value: 54.291
- type: ndcg_at_100
value: 58.434
- type: ndcg_at_1000
value: 60.07
- type: recall_at_1
value: 25.394
- type: recall_at_3
value: 43.446
- type: recall_at_5
value: 51.037
- type: recall_at_10
value: 59.61
- type: recall_at_20
value: 68.925
- type: recall_at_100
value: 88.277
- type: recall_at_1000
value: 99.44800000000001
- type: main_score
value: 51.383
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-por)
type: jinaai/xpqa
config: eng-por
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 29.5
- type: ndcg_at_3
value: 29.971999999999998
- type: ndcg_at_5
value: 31.513999999999996
- type: ndcg_at_10
value: 35.449999999999996
- type: ndcg_at_20
value: 38.912
- type: ndcg_at_100
value: 44.695
- type: ndcg_at_1000
value: 47.309
- type: recall_at_1
value: 14.335
- type: recall_at_3
value: 27.839999999999996
- type: recall_at_5
value: 34.737
- type: recall_at_10
value: 44.358
- type: recall_at_20
value: 55.65
- type: recall_at_100
value: 82.077
- type: recall_at_1000
value: 99.44800000000001
- type: main_score
value: 35.449999999999996
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (por-eng)
type: jinaai/xpqa
config: por-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 46.048
- type: ndcg_at_3
value: 45.519
- type: ndcg_at_5
value: 47.693999999999996
- type: ndcg_at_10
value: 51.535
- type: ndcg_at_20
value: 54.179
- type: ndcg_at_100
value: 58.416999999999994
- type: ndcg_at_1000
value: 59.955000000000005
- type: recall_at_1
value: 25.325999999999997
- type: recall_at_3
value: 42.779
- type: recall_at_5
value: 51.453
- type: recall_at_10
value: 60.876
- type: recall_at_20
value: 69.184
- type: recall_at_100
value: 88.97699999999999
- type: recall_at_1000
value: 99.58200000000001
- type: main_score
value: 51.535
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (tam-tam)
type: jinaai/xpqa
config: tam-tam
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 31.968999999999998
- type: ndcg_at_3
value: 34.555
- type: ndcg_at_5
value: 36.504999999999995
- type: ndcg_at_10
value: 38.958
- type: ndcg_at_20
value: 40.77
- type: ndcg_at_100
value: 43.779
- type: ndcg_at_1000
value: 47.388999999999996
- type: recall_at_1
value: 21.13
- type: recall_at_3
value: 35.838
- type: recall_at_5
value: 41.535
- type: recall_at_10
value: 48.075
- type: recall_at_20
value: 54.290000000000006
- type: recall_at_100
value: 68.325
- type: recall_at_1000
value: 95.62
- type: main_score
value: 38.958
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-tam)
type: jinaai/xpqa
config: eng-tam
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 12.531999999999998
- type: ndcg_at_3
value: 12.849
- type: ndcg_at_5
value: 13.979
- type: ndcg_at_10
value: 16.573
- type: ndcg_at_20
value: 18.861
- type: ndcg_at_100
value: 23.779
- type: ndcg_at_1000
value: 29.859
- type: recall_at_1
value: 7.388999999999999
- type: recall_at_3
value: 12.531999999999998
- type: recall_at_5
value: 16.279
- type: recall_at_10
value: 23.099
- type: recall_at_20
value: 30.697000000000003
- type: recall_at_100
value: 53.608
- type: recall_at_1000
value: 94.719
- type: main_score
value: 16.573
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (tam-eng)
type: jinaai/xpqa
config: tam-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 21.066
- type: ndcg_at_3
value: 23.677999999999997
- type: ndcg_at_5
value: 25.851000000000003
- type: ndcg_at_10
value: 28.615000000000002
- type: ndcg_at_20
value: 30.817
- type: ndcg_at_100
value: 34.874
- type: ndcg_at_1000
value: 39.24
- type: recall_at_1
value: 15.037
- type: recall_at_3
value: 25.285999999999998
- type: recall_at_5
value: 30.717
- type: recall_at_10
value: 37.722
- type: recall_at_20
value: 44.927
- type: recall_at_100
value: 63.917
- type: recall_at_1000
value: 96.145
- type: main_score
value: 28.615000000000002
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (cmn-cmn)
type: jinaai/xpqa
config: cmn-cmn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 65.049
- type: ndcg_at_3
value: 65.534
- type: ndcg_at_5
value: 67.498
- type: ndcg_at_10
value: 70.812
- type: ndcg_at_20
value: 73.026
- type: ndcg_at_100
value: 75.316
- type: ndcg_at_1000
value: 75.882
- type: recall_at_1
value: 41.357
- type: recall_at_3
value: 63.176
- type: recall_at_5
value: 71.381
- type: recall_at_10
value: 79.47
- type: recall_at_20
value: 86.616
- type: recall_at_100
value: 96.36099999999999
- type: recall_at_1000
value: 100.0
- type: main_score
value: 70.812
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (eng-cmn)
type: jinaai/xpqa
config: eng-cmn
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 35.073
- type: ndcg_at_3
value: 35.782000000000004
- type: ndcg_at_5
value: 36.99
- type: ndcg_at_10
value: 40.974
- type: ndcg_at_20
value: 43.971
- type: ndcg_at_100
value: 49.165
- type: ndcg_at_1000
value: 51.93
- type: recall_at_1
value: 20.057
- type: recall_at_3
value: 34.064
- type: recall_at_5
value: 40.831
- type: recall_at_10
value: 50.33
- type: recall_at_20
value: 59.306000000000004
- type: recall_at_100
value: 82.231
- type: recall_at_1000
value: 99.759
- type: main_score
value: 40.974
- task:
type: Retrieval
dataset:
name: MTEB XPQARetrieval (cmn-eng)
type: jinaai/xpqa
config: cmn-eng
split: test
revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f
metrics:
- type: ndcg_at_1
value: 57.68299999999999
- type: ndcg_at_3
value: 60.089000000000006
- type: ndcg_at_5
value: 62.217999999999996
- type: ndcg_at_10
value: 65.81700000000001
- type: ndcg_at_20
value: 67.886
- type: ndcg_at_100
value: 70.804
- type: ndcg_at_1000
value: 71.54
- type: recall_at_1
value: 36.146
- type: recall_at_3
value: 59.035000000000004
- type: recall_at_5
value: 67.376
- type: recall_at_10
value: 76.213
- type: recall_at_20
value: 82.756
- type: recall_at_100
value: 95.341
- type: recall_at_1000
value: 100.0
- type: main_score
value: 65.81700000000001
---
# INF-Retriever-v1
## Model Overview
- **INF-Retriever-v1** is an LLM-based dense retrieval model developed by [INF TECH](https://www.infly.cn/en).
It is built upon the [gte-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) model and specifically fine-tuned to excel in retrieval tasks, particularly for Chinese and English data.
- As of January 23, 2025, **INF-Retriever-v1** ranks **No.1** on both the 24.04 and 24.05 versions of the Automated Heterogeneous Information Retrieval Benchmark ([AIR-Bench](https://huggingface.co/spaces/AIR-Bench/leaderboard)), showcasing its cutting-edge performance in heterogeneous information retrieval tasks.
## Key Features
- **Optimized for Chinese and English retrieval**: The model has been specifically fine-tuned with retrieval-focused datasets in both languages, significantly improving its accuracy and efficiency for a variety of retrieval scenarios.
- **Top-tier performance**: **INF-Retriever-v1** has achieved outstanding results on the AIR-Bench leaderboard, making it a top choice for heterogeneous information retrieval tasks across various domains.
## Model Details
- Model Size: 7B
- Embedding Dimension: 3584
- Max Input Tokens: 32768
- Language Support: Chinese & English (also effective in other languages)
## Usage
### Sentence Transformers
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("infly/inf-retriever-v1", trust_remote_code=True)
# In case you want to reduce the maximum length:
model.max_seq_length = 8192
queries = [
"how much protein should a female eat",
"summit define",
]
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
]
query_embeddings = model.encode(queries, prompt_name="query")
document_embeddings = model.encode(documents)
scores = (query_embeddings @ document_embeddings.T) * 100
print(scores.tolist())
# [[86.8702392578125, 67.82364654541016], [59.51014709472656, 82.33668518066406]]
```
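Continuing from the snippet above, recent sentence-transformers releases (assumed 3.x or newer) also expose a built-in `similarity` helper that applies the model's configured scoring function to precomputed embeddings; a minimal sketch:
```python
# Continues the Sentence Transformers example above; assumes sentence-transformers >= 3.x.
# similarity() applies the model's configured scoring function (typically cosine similarity).
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
```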
### Transformers
```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def last_token_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
if left_padding:
return last_hidden_states[:, -1]
else:
sequence_lengths = attention_mask.sum(dim=1) - 1
batch_size = last_hidden_states.shape[0]
return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]
def get_detailed_instruct(task_description: str, query: str) -> str:
return f'Instruct: {task_description}\nQuery: {query}'
# Each query must come with a one-sentence instruction that describes the task
task = 'Given a web search query, retrieve relevant passages that answer the query'
queries = [
get_detailed_instruct(task, 'how much protein should a female eat'),
get_detailed_instruct(task, 'summit define')
]
# No need to add instruction for retrieval documents
documents = [
"As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
input_texts = queries + documents
tokenizer = AutoTokenizer.from_pretrained('infly/inf-retriever-v1', trust_remote_code=True)
model = AutoModel.from_pretrained('infly/inf-retriever-v1', trust_remote_code=True)
max_length = 8192
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
# [[86.87025451660156, 67.82366180419922], [59.510135650634766, 82.33667755126953]]
```
## Evaluation
### AIR-Bench
**INF-Retriever-v1** has demonstrated superior retrieval capabilities across multiple domains and languages. The results from the Automated Heterogeneous Information Retrieval Benchmark ([AIR-Bench](https://huggingface.co/spaces/AIR-Bench/leaderboard)) as of January 23, 2025, are as follows:
#### AIR-Bench_24.04 (Bilingual, EN & ZH)
| Model Name | Average⬆️ | wiki_en | wiki_zh | web_en | web_zh | healthcare_en | healthcare_zh | law_en | arxiv_en | news_en | news_zh | finance_en | finance_zh | msmarco_en |
|-----------------------------------------------------------------------------------|-----------|-----------|-----------|-----------|----------|---------------|---------------|-----------|-----------|-----------|-----------|------------|------------|------------|
| [E5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 45.26 | 61.67 | 55.97 | 44.41 | 45.96 | 56.32 | 35.79 | 19.32 | 44.78 | 48.18 | 35.99 | 54.79 | 26.11 | 59.03 |
| [BGE-M3](https://huggingface.co/BAAI/bge-m3) | 46.65 | 60.49 | 62.36 | 47.35 | 50.38 | 49.1 | **42.38** | 26.68 | 40.76 | 48.04 | 40.75 | 51.52 | 32.18 | 54.4 |
| [BGE-Multilingual-Gemma2](https://huggingface.co/BAAI/bge-multilingual-gemma2) | 46.83 | 63.71 | 67.3 | 50.38 | 53.24 | 47.24 | 42.13 | 22.58 | 23.28 | 50.91 | 44.02 | 49.3 | 31.6 | **63.14** |
| [GTE-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | 48.38 | 63.46 | 66.44 | 51.2 | 51.98 | 54.2 | 38.82 | 22.31 | 40.27 | **54.07** | 43.03 | 58.2 | 26.63 | 58.39 |
| **INF-Retriever-v1** | **52.56** | **65.25** | **68.44** | **52.13** | **56.6** | **56.96** | 42.03 | **34.51** | **50.62** | 53.32 | **50.02** | **58.34** | **35.42** | 59.64 |
#### AIR-Bench_24.05 (Multilingual, 13 languages)
Although INF-Retriever-v1 has been fine-tuned exclusively on English and Chinese, it continues to perform exceptionally well across other languages, securing the No. 1 position on this multilingual benchmark.
| Model Name | Average⬆️ | wiki_en | wiki_zh | wiki_ar | wiki_bn | wiki_de | wiki_es | wiki_fa | wiki_fr | wiki_hi | wiki_id | wiki_ja | wiki_ko | wiki_ru | web_en | web_zh | web_ar | web_bn | web_de | web_es | web_fa | web_fr | web_hi | web_id | web_ja | web_ko | web_ru | healthcare_en | healthcare_zh | healthcare_de | healthcare_es | healthcare_fr | law_en | law_de | law_fr | arxiv_en | science_ru | news_en | news_zh | news_ar | news_bn | news_de | news_es | news_fa | news_fr | news_hi | news_id | news_ja | news_ko | news_ru | finance_en | finance_zh | finance_ar | finance_fr |
|--------------------------------------------------------------------------------------------------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|----------|-----------|-----------|----------|--------|-----------|-----------|-----------|---------------|---------------|---------------|---------------|---------------|-----------|-----------|-----------|-----------|------------|-----------|-----------|-----------|-----------|-----------|----------|-----------|----------|-----------|-----------|-----------|-----------|-----------|------------|------------|------------|------------|
| [GTE-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | 50.05 | **73.59** | 67.5 | 59.44 | 58.17 | 63.96 | 67.62 | 57.05 | 70.32 | 60.54 | 61.81 | 62.88 | 59.17 | 62.95 | **58.99** | 51.66 | 55.56 | 51.45 | 48.62 | 54.11 | 49.54 | 55.16 | 53.06 | 55.51 | 57.27 | 57.54 | 55.88 | 54.46 | 38.66 | 53.92 | 53.78 | 30.29 | 22.75 | 13.18 | 13.15 | 41.32 | 45.21 | **52.74** | 43.17 | 37.63 | **61.31** | 44.89 | 45.21 | 30.1 | 49.76 | 30.28 | 46.44 | 44.13 | 47.19 | 46.55 | 59.23 | 34.61 | 43.56 | 39.57 |
| [Multilingual-E5-large-instruct](https://huggingface.co/intfloat/multilingual-e5-large-instruct) | 51.11 | 68.62 | 62.82 | 63.21 | 64.45 | 65.81 | 68.1 | 64.2 | 69.72 | 71.81 | 66.36 | 64.12 | 64.79 | 62.57 | 41.58 | 47.06 | 56.4 | 56.17 | 50.87 | 52.24 | 58.68 | 50.2 | 56.32 | 54.49 | 54.89 | 55.81 | 54.97 | 54.02 | 39.76 | 52.06 | 51.74 | 36.64 | 16.9 | 15.59 | 15.12 | 39.52 | 56.86 | 44.28 | 35.46 | 48.2 | 49.31 | 47.84 | 45.99 | **45.59** | 50.58 | 39.66 | 48.59 | 47.6 | 50.52 | 48.81 | 52.79 | 37.72 | 48.95 | 42.74 |
| [BGE-M3](https://huggingface.co/BAAI/bge-m3) | 51.31 | 69.7 | 63.52 | 59.65 | 64.33 | 64.68 | 65.4 | 61.14 | 66.04 | 69.02 | 66.3 | 60.86 | 62.36 | 60.18 | 53.88 | 50.2 | 52.53 | 55.53 | 51.89 | 51.78 | 55.81 | 51.46 | 57.06 | 53.14 | 54.75 | 55.28 | 54.53 | 49.05 | 42.31 | 49 | 53.05 | 39.29 | 26.95 | 20.11 | 20.2 | 41.64 | 55.18 | 47.34 | 41 | 44.93 | 59.03 | 47.87 | 44.7 | 43.81 | 49.52 | 42.12 | 47.45 | 47.09 | 48.14 | 48.31 | 52.92 | 40.23 | 45.76 | 41.44 |
| [BGE-Multilingual-Gemma2](https://huggingface.co/BAAI/bge-multilingual-gemma2) | 54.46 | 72.8 | 68.64 | **63.42** | **69.48** | **67.91** | **71.79** | **67.57** | **71.28** | **75.39** | **68.91** | **68.29** | **66.78** | **64.15** | 56.48 | 53.04 | **59.97** | **59.68** | **57.72** | **58.2** | **62.43** | **59.54** | **64.5** | **60** | **60.26** | 59.64 | **60.12** | 47.48 | **42.35** | 55.4 | **63.13** | **45.13** | 22.6 | 15.75 | 14.29 | 24 | 44.13 | 50.29 | 43.42 | 48.41 | 58.77 | **52.05** | **49.9** | 43.4 | **56.8** | **44.89** | 50.65 | **51.51** | 51.64 | 51.48 | 50.08 | 39.23 | 50.25 | **51.1** |
| **INF-Retriever-v1** | **54.47** | 73.52 | **69.45** | 63.13 | 61.58 | 66.8 | 69.29 | 63.03 | 69.74 | 69.02 | 68.63 | 63.45 | 64.44 | 62.74 | 57.6 | **56.46** | 58.48 | 53.7 | 55.2 | 57.08 | 53.27 | 57.35 | 55.64 | 58.85 | 59.52 | **60.01** | 58.79 | **57.03** | 41.82 | **55.46** | 57.6 | 43.25 | **34.76** | **21.75** | **21.87** | **51.38** | **59.72** | 52.7 | **49.78** | **49.11** | 43.62 | 51.47 | 49.52 | 40.43 | 54.54 | 38.57 | **51.06** | 51.12 | **53.15** | **51.88** | **59.44** | **44.13** | **50.71** | 44.2 |
## Contributors
### Supervisors
Wei Chu • Yinghui Xu • Yuan Qi
### INF memory team
Junhan Yang ([email protected]) • Jiahe Wan • Yichen Yao ([email protected])
## Citation
If you find our model useful, please consider citing:
```
@misc {infly-ai_2025,
author = { Junhan Yang and Jiahe Wan and Yichen Yao and Wei Chu and Yinghui Xu and Yuan Qi },
title = { inf-retriever-v1 (Revision 5f469d7) },
year = 2025,
url = { https://huggingface.co/infly/inf-retriever-v1 },
doi = { 10.57967/hf/4262 },
publisher = { Hugging Face }
}
``` | [
"SUMMARIZATION"
] | [
"SCIFACT"
] |
EleutherAI/pythia-12b-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-16T19:03:14 | 2023-03-29T18:46:38 | 558 | 21 | ---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-12B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-12B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-12B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-12B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-12B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-12B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-12B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-12B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-12B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
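To make the checkpoint renaming described above concrete, here is a small sketch mapping a Hugging Face branch name back to the original training step (the halving applies only to the models listed with a 4M-token batch size):
```python
# Map a Hugging Face checkpoint branch (e.g. "step1000") to the original training step.
# Models trained with a 4M-token batch had their checkpoints renamed to match the
# 2M-token numbering, so their actual training step is half the number in the branch name.
def actual_training_step(branch: str, batch_size_tokens: int) -> int:
    hf_step = int(branch.removeprefix("step"))
    return hf_step // 2 if batch_size_tokens == 4 * 1024 * 1024 else hf_step

print(actual_training_step("step1000", 4 * 1024 * 1024))  # 500  (e.g. pythia-1.4b)
print(actual_training_step("step1000", 2 * 1024 * 1024))  # 1000 (e.g. pythia-6.9b)
```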
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
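As an illustrative sketch (the harness CLI and Python API have changed across releases, so treat the arguments below as assumptions for a recent version), a single evaluation can be reproduced roughly as follows:
```python
# Minimal sketch using a recent lm-evaluation-harness release; the exact API may
# differ between versions. Evaluates a small Pythia checkpoint on LAMBADA and SciQ.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-70m-deduped",
    tasks=["lambada_openai", "sciq"],
)
print(results["results"])
```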
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-6.9b-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-18T03:04:37 | 2023-07-10T01:30:05 | 558 | 20 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-6.9B-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-6.9B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-6.9B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-6.9B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-6.9B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-6.9B-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-6.9B-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-6.9B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-6.9B-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B                   | 13B        | 11,846,072,320 | 11,327,027,200       |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-410m-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-11-01T00:48:44 | 2023-07-10T01:31:39 | 552 | 6 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-410M-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-410M-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-410M-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-410M-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-410M-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-410M-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-410M-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-410M-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-410M-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens listed were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
aisingapore/gemma2-9b-cpt-sea-lionv3-base | aisingapore | text-generation | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"en",
"zh",
"vi",
"id",
"th",
"fil",
"ta",
"ms",
"km",
"lo",
"my",
"arxiv:2309.06085",
"arxiv:2101.09635",
"base_model:google/gemma-2-9b",
"base_model:finetune:google/gemma-2-9b",
"license:gemma",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2024-10-30T03:21:35 | 2024-12-19T12:56:00 | 550 | 2 | ---
base_model: google/gemma-2-9b
language:
- en
- zh
- vi
- id
- th
- fil
- ta
- ms
- km
- lo
- my
library_name: transformers
license: gemma
pipeline_tag: text-generation
---
<div>
<img src="gemma_2_9b_sea-lion_v3_base_banner.png"/>
</div>
# Gemma2 9B CPT SEA-LIONv3
SEA-LION is a collection of Large Language Models (LLMs) which has been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
Gemma2 9B CPT SEA-LIONv3 Base is a multilingual model which has undergone continued pre-training on approximately **200B** tokens across the 11 official Southeast Asian languages: English, Chinese, Vietnamese, Indonesian, Thai, Tamil, Filipino, Malay, Khmer, Lao, Burmese.
SEA-LION stands for <i>Southeast Asian Languages In One Network</i>.
- **Developed by:** Products Pillar, AI Singapore
- **Funded by:** Singapore NRF
- **Model type:** Decoder
- **Languages supported:** Burmese, Chinese, English, Filipino, Indonesian, Khmer, Lao, Malay, Tamil, Thai, Vietnamese
- **License:** [Gemma Community License](https://ai.google.dev/gemma/terms)
## Model Details
### Model Description
We performed continued pre-training in English and ASEAN languages on [Gemma-2-9B](https://huggingface.co/google/gemma-2-9b), a decoder model using the Gemma 2 architecture, to create Gemma2 9B CPT SEA-LIONv3 Base.
For tokenisation, the model employs the default tokenizer used in Gemma 2 9B.
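A minimal usage sketch (assumed, not taken from the official SEA-LION documentation) for loading the base model with the Hugging Face Transformers library; adjust the device and dtype settings to your hardware:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aisingapore/gemma2-9b-cpt-sea-lionv3-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# This is a base (non-instruct) model, so prompt it with text to be continued.
inputs = tokenizer("Singapura ialah sebuah negara yang", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```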
### Benchmark Performance
We evaluated Gemma2 9B CPT SEA-LIONv3 base model on general language capabilities.
#### General Language Capabilities
For the evaluation of general language capabilities, we employed the [SEA HELM (also known as BHASA) evaluation benchmark](https://arxiv.org/abs/2309.06085v2) across a variety of tasks.
These tasks include Question Answering (QA), Sentiment Analysis (Sentiment), Toxicity Detection (Toxicity), Translation in both directions (Eng>Lang & Lang>Eng), Abstractive Summarization (Summ), Causal Reasoning (Causal) and Natural Language Inference (NLI).
Note: SEA HELM is implemented using prompts to elicit answers in a strict format. For all tasks, the model is expected to provide an answer tag from which the answer is automatically extracted. For tasks where options are provided, the answer should be one of the pre-defined options. The scores for each task are normalised to account for baseline performance due to random chance.
The evaluation was done **five-shot** with native prompts on a sample of 100-1000 instances for each dataset.
For more details on Gemma2 9B CPT SEA-LIONv3 base benchmark performance, please refer to the SEA HELM leaderboard, https://leaderboard.sea-lion.ai/
## Technical Specifications
### Infrastructure
Gemma2 9B CPT SEA-LIONv3 was trained using [MosaicML Composer](https://github.com/mosaicml/composer) on the following hardware:
| Training Details | Gemma2 9B CPT SEA-LIONv3 |
|----------------------|:------------------------:|
| SingTel HGX-100 | 8 instances |
| Nvidia H100 80GB GPU | 64 |
| Training Duration | 10 days |
### Configuration
| HyperParameter | Gemma2 9B CPT SEA-LIONv3 |
|-------------------|:------------------------:|
| Precision | bfloat16 |
| Optimizer | decoupled_adamw |
| Scheduler | weight_stable_decay |
| Learning Rate | 1.0e-5 |
| Global Batch Size | 512 |
| Micro Batch Size | 1 |
## Data
Gemma2 9B CPT SEA-LIONv3 base model was continued pre-trained on 200B tokens of the following data:
| Language                 | Source           | Total Tokens (B) | Percentage (%) | Total percentage (%) |
| ------------------------ | ---------------- | ---------------- | -------------- | -------------------- |
| Code                     | StackV2          | 40               | 20             | 20                   |
| English                  | Dolma            | 37.5             | 18.75          | 25                   |
|                          | Fineweb-Edu      | 7.5              | 3.75           |                      |
|                          | Others           | 5                | 2.5            |                      |
| Chinese                  | SEA-LION Pile v1 | 12               | 6              | 13                   |
|                          | Others           | 14               | 7              |                      |
| Vietnamese               | SEA-LION Pile v1 | 8.4              | 4.2            | 13                   |
|                          | VinBigData       | 16               | 8              |                      |
|                          | Others           | 1.6              | 0.8            |                      |
| Indonesian               | SEA-LION Pile v1 | 7                | 3.5            | 13                   |
|                          | SEA-LION Pile v2 | 7                | 3.5            |                      |
|                          | Others           | 12               | 6              |                      |
| Thai                     | SEA-LION Pile v1 | 10.7             | 5.35           | 10                   |
|                          | WangChanBERTa    | 8.5              | 4.25           |                      |
|                          | Others           | 0.8              | 0.4            |                      |
| Filipino - Malay - Tamil | SEA-LION Pile v1 | 4.28             | 2.14           | 3                    |
|                          | Others           | 1.72             | 0.86           |                      |
| Khmer - Lao - Burmese    | SEA-LION Pile v1 | 5.2              | 2.6            | 3                    |
|                          | Others           | 0.8              | 0.4            |                      |
Note:
- All token counts are counted using Gemma 2 9B tokenizer
- SEA-LION Pile v1 is processed from Common Crawl WET, which is published [here](https://huggingface.co/datasets/aisingapore/sea-lion-pile). The cutoff date of this version is September 2020.
- SEA-LION Pile v2 is processed from Common Crawl WARC from October 2020 to April 2024.
- Tamil news is sourced with permission from [Seithi](https://seithi.mediacorp.sg/)
## Call for Contributions
We encourage researchers, developers, and language enthusiasts to actively contribute to the enhancement and expansion of SEA-LION. Contributions can involve identifying and reporting bugs, sharing pre-training, instruction, and preference data, improving documentation usability, proposing and implementing new model evaluation tasks and metrics, or training versions of the model in additional Southeast Asian languages. Join us in shaping the future of SEA-LION by sharing your expertise and insights to make these models more accessible, accurate, and versatile. Please check out our GitHub for further information on the call for contributions.
## The Team
Chan Adwin, Cheng Nicholas, Choa Esther, Huang Yuli, Hulagadri Adithya Venkatadri, Lau Wayne, Lee Chwan Ren, Leong Wai Yi, Leong Wei Qi, Limkonchotiwat Peerat, Liu Bing Jie Darius, Montalan Jann Railey, Ng Boon Cheong Raymond, Ngui Jian Gang, Nguyen Thanh Ngan, Ong Brandon, Ong Tat-Wee David, Ong Zhi Hao, Rengarajan Hamsawardhini, Siow Bryan, Susanto Yosephine, Tai Ngee Chia, Tan Choon Meng, Teng Walter, Teo Eng Sipp Leslie, Teo Wei Yi, Tjhi William, Yeo Yeow Tong, Yong Xianbin
## Acknowledgements
[AI Singapore](https://aisingapore.org/) is a national programme supported by the National Research Foundation, Singapore and hosted by the National University of Singapore. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation or the National University of Singapore.
## Contact
For more info, please contact us using this [SEA-LION Inquiry Form.](https://forms.gle/sLCUVb95wmGf43hi6)
[Link to SEA-LION's GitHub repository.](https://github.com/aisingapore/sealion)
## Disclaimer
This is the repository for the commercial continued pre-trained (base) model.
The model has _not_ been aligned for safety.
Developers and users should perform their own safety fine-tuning and related security measures.
In no event shall the authors be held liable for any claims, damages, or other liabilities arising from the use of the released weights and codes.
## References
### Thai Pre-Training Data Reference
```bibtex
@misc{lowphansirikul2021wangchanberta,
title={WangchanBERTa: Pretraining transformer-based Thai Language Models},
author={Lalita Lowphansirikul and Charin Polpanumas and Nawat Jantrakulchai and Sarana Nutanong},
year={2021},
eprint={2101.09635},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | [
"QUESTION_ANSWERING",
"TRANSLATION",
"SUMMARIZATION"
] | [
"CHIA"
] |
EleutherAI/pythia-12b-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-18T03:09:06 | 2023-03-29T18:48:21 | 545 | 25 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-12B-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.
ai](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-12B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-12B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-12B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-12B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-12B-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-12B-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-12B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-12B-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a
batch size of 4M tokens were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
EleutherAI/pythia-1b-deduped-v0 | EleutherAI | text-generation | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | 2022-10-18T03:08:13 | 2023-07-10T01:32:03 | 543 | 10 | ---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-1B-deduped
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.
ai](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change in the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-1B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-1B-deduped to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1B-deduped may produce socially unacceptable or undesirable text,
*even if* the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-1B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
Pythia-1B-deduped was trained on the Pile **after the dataset has been
globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a
batch size of 4M tokens were originally trained for 71500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
moussaKam/barthez-orangesum-abstract | moussaKam | summarization | [
"transformers",
"pytorch",
"mbart",
"text2text-generation",
"summarization",
"bart",
"fr",
"arxiv:2010.12321",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:05 | 2021-11-15T13:03:03 | 540 | 7 | ---
language:
- fr
license: apache-2.0
tags:
- summarization
- bart
widget:
- text: Citant les préoccupations de ses clients dénonçant des cas de censure après
la suppression du compte de Trump, un fournisseur d'accès Internet de l'État de
l'Idaho a décidé de bloquer Facebook et Twitter. La mesure ne concernera cependant
que les clients mécontents de la politique de ces réseaux sociaux.
---
### BARThez model fine-tuned on OrangeSum (abstract generation)
Fine-tuning: `examples/seq2seq` (as of Feb 08, 2021)
paper: https://arxiv.org/abs/2010.12321 \
github: https://github.com/moussaKam/BARThez
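A minimal usage sketch (assumed, not part of the original card), reusing the widget example from the metadata above with the Transformers summarization pipeline:
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="moussaKam/barthez-orangesum-abstract")
texte = (
    "Citant les préoccupations de ses clients dénonçant des cas de censure après la "
    "suppression du compte de Trump, un fournisseur d'accès Internet de l'État de "
    "l'Idaho a décidé de bloquer Facebook et Twitter. La mesure ne concernera cependant "
    "que les clients mécontents de la politique de ces réseaux sociaux."
)
print(summarizer(texte, max_length=60)[0]["summary_text"])
```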
```
@article{eddine2020barthez,
title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
journal={arXiv preprint arXiv:2010.12321},
year={2020}
}
```
| [
"SUMMARIZATION"
] | [
"CAS"
] |
sultan/BioM-ALBERT-xxlarge-PMC | sultan | fill-mask | [
"transformers",
"pytorch",
"albert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:05 | 2023-11-04T23:06:21 | 534 | 4 | ---
{}
---
# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
# Abstract
The impact of design choices on the performance
of biomedical language models recently
has been a subject for investigation. In
this paper, we empirically study biomedical
domain adaptation with large transformer models
using different design choices. We evaluate
the performance of our pretrained models
against other existing biomedical language
models in the literature. Our results show that
we achieve state-of-the-art results on several
biomedical domain tasks despite using similar
or less computational cost compared to other
models in the literature. Our findings highlight
the significant effect of design choices on
improving the performance of biomedical language
models.
# Model Description
This model was pre-trained on PMC full-text articles for a further 64k steps with a batch size of 8192, initializing the weights from our BioM-ALBERT-xxlarge model. Thus, the total number of training steps for this model is 264k + 64k = 328k. The model is very large due to its hidden layer size (4096). To help researchers with limited resources fine-tune larger models, we created an example with PyTorch XLA. PyTorch XLA (https://github.com/pytorch/xla) is a library that allows you to use PyTorch on TPU units, which are provided for free by Google Colab and Kaggle. Follow this example to work with PyTorch/XLA [Link](https://github.com/salrowili/BioM-Transformers/blob/main/examples/Fine_Tuning_Biomedical_Models_on_Text_Classification_Task_With_HuggingFace_Transformers_and_PyTorch_XLA.ipynb). In this example we achieve an 80.74 micro F1 score on the ChemProt task with BioM-ALBERT-xxlarge. Fine-tuning takes 43 minutes for 5 epochs.
Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints. We also updated this repo with a couple of examples on how to fine-tune LMs on text classification and question answering tasks such as ChemProt, SQuAD, and BioASQ.
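A minimal fill-mask sketch (assumed, not part of the original card); the ALBERT tokenizer uses `[MASK]` as its mask token:
```python
from transformers import pipeline

unmasker = pipeline("fill-mask", model="sultan/BioM-ALBERT-xxlarge-PMC")
for prediction in unmasker("Aspirin is commonly used to reduce [MASK] and inflammation."):
    print(prediction["token_str"], round(prediction["score"], 3))
```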
# Colab Notebook Examples
BioM-ELECTRA-LARGE on NER and ChemProt Task [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_NER_and_ChemProt_Task_on_TPU.ipynb)
BioM-ELECTRA-Large on SQuAD2.0 and BioASQ7B Factoid tasks [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_SQuAD2_0_and_BioASQ7B_tasks_with_BioM_ELECTRA_Large_on_TPU.ipynb)
BioM-ALBERT-xxlarge on SQuAD2.0 and BioASQ7B Factoid tasks [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Example_of_SQuAD2_0_and_BioASQ7B_tasks_with_BioM_ALBERT_xxlarge_on_TPU.ipynb)
Text Classification Task With HuggingFace Transformers and PyTorchXLA on Free TPU [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/Fine_Tuning_Biomedical_Models_on_Text_Classification_Task_With_HuggingFace_Transformers_and_PyTorch_XLA.ipynb)
Reproducing our BLURB results with JAX [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/BioM-Transformers/blob/main/examples/BLURB_LeaderBoard_with_TPU_VM.ipynb)
Finetunning BioM-Transformers with Jax/Flax on TPUv3-8 with free Kaggle resource [![Open In Colab][COLAB]](https://www.kaggle.com/code/sultanalrowili/biom-transoformers-with-flax-on-tpu-with-kaggle)
[COLAB]: https://colab.research.google.com/assets/colab-badge.svg
# Acknowledgment
We would like to acknowledge the support we have from Tensorflow Research Cloud (TFRC) team to grant us access to TPUv3 units.
# Citation
```bibtex
@inproceedings{alrowili-shanker-2021-biom,
title = "{B}io{M}-Transformers: Building Large Biomedical Language Models with {BERT}, {ALBERT} and {ELECTRA}",
author = "Alrowili, Sultan and
Shanker, Vijay",
booktitle = "Proceedings of the 20th Workshop on Biomedical Language Processing",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bionlp-1.24",
pages = "221--227",
abstract = "The impact of design choices on the performance of biomedical language models recently has been a subject for investigation. In this paper, we empirically study biomedical domain adaptation with large transformer models using different design choices. We evaluate the performance of our pretrained models against other existing biomedical language models in the literature. Our results show that we achieve state-of-the-art results on several biomedical domain tasks despite using similar or less computational cost compared to other models in the literature. Our findings highlight the significant effect of design choices on improving the performance of biomedical language models.",
}
``` | [
"TEXT_CLASSIFICATION"
] | [
"BLURB",
"CHEMPROT"
] |
QuantFactory/pythia-12b-GGUF | QuantFactory | text-generation | [
"gguf",
"pytorch",
"causal-lm",
"pythia",
"text-generation",
"en",
"dataset:EleutherAI/pile",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"base_model:EleutherAI/pythia-12b",
"base_model:quantized:EleutherAI/pythia-12b",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-07-12T10:15:14 | 2024-07-13T15:25:15 | 513 | 3 | ---
base_model: EleutherAI/pythia-12b
datasets:
- EleutherAI/pile
language:
- en
license: apache-2.0
pipeline_tag: text-generation
tags:
- pytorch
- causal-lm
- pythia
---
# QuantFactory/pythia-12b-GGUF
This is a quantized version of [EleutherAI/pythia-12b](https://huggingface.co/EleutherAI/pythia-12b) created using llama.cpp.
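A minimal sketch (assumed, not part of the original card) of running one of the GGUF files with `llama-cpp-python`; the quantisation filename below is hypothetical, so substitute whichever file you download from this repository:
```python
from llama_cpp import Llama

# Hypothetical filename -- use the GGUF file you actually downloaded.
llm = Llama(model_path="pythia-12b.Q4_K_M.gguf", n_ctx=2048)
output = llm("Hello, I am", max_tokens=32)
print(output["choices"][0]["text"])
```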
# Model Description
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight: 600">Past early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-12B
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.
ai](mailto:[email protected]).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
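For reference, the 154 checkpoint branch names described above can be enumerated with a short sketch like this (illustrative, not part of the official tooling):
```python
# step0, ten log-spaced early checkpoints, then every 1,000 steps up to 143,000.
revisions = (
    ["step0"]
    + [f"step{2 ** i}" for i in range(10)]              # step1 ... step512
    + [f"step{s}" for s in range(1000, 144000, 1000)]   # step1000 ... step143000
)
assert len(revisions) == 154
```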
You may also further fine-tune and adapt Pythia-12B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-12B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-12B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-12B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the model need not produce the
most “accurate” text. Never rely on Pythia-12B to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-12B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-12B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-12B.
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with a uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models are now
trained with the LR decaying to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | [
"QUESTION_ANSWERING",
"TRANSLATION"
] | [
"SCIQ"
] |
Sifal/ClinicalMosaic | Sifal | fill-mask | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"clinical",
"healthcare",
"NLP",
"BERT",
"MIMIC-IV",
"MedNLI",
"transformer",
"fill-mask",
"custom_code",
"en",
"dataset:bigbio/mednli",
"arxiv:2502.18009",
"base_model:mosaicml/mosaic-bert-base",
"base_model:finetune:mosaicml/mosaic-bert-base",
"license:mit",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-07-05T07:17:09 | 2025-02-26T08:57:46 | 502 | 0 | ---
base_model:
- mosaicml/mosaic-bert-base
datasets:
- bigbio/mednli
language:
- en
library_name: transformers
license: mit
metrics:
- accuracy
pipeline_tag: fill-mask
tags:
- clinical
- healthcare
- NLP
- BERT
- MIMIC-IV
- MedNLI
- transformer
model-index:
- name: ClinicalMosaic
results:
- task:
type: classification
dataset:
name: MedNLI
type: MedNLI
metrics:
- type: accuracy
value: 86.5
name: Accuracy
---
# Model Card for Clinical Mosaic
Clinical Mosaic is a transformer-based language model designed for clinical text, built on the Mosaic BERT architecture. The model is pretrained on 331,794 deidentified clinical notes from the MIMIC-IV-NOTES 2.2 database, with a sequence length of 512 tokens, while leveraging Attention with Linear Biases (ALiBi) to improve extrapolation beyond this limit without requiring learned positional embeddings.
## Model Details
- **Developed by:** Sifal Klioui, Sana Sellami, and Youssef Trardi (Aix-Marseille Univ, LIS, CNRS, Marseille, France)
- **Funded by:** PICOMALE project (AMIDEX) Under the direction of the CEDRE
- **Base Model:** Mosaic BERT
- **License:** MIMIC Data Use Agreement (requires compliance with original DUA)
- **Repository:** [PatientTrajectoryForecasting](https://github.com/MostHumble/PatientTrajectoryForecasting)
- **Paper:** *Patient Trajectory Prediction: Integrating Clinical Notes with Transformers* [[EN](https://arxiv.org/abs/2502.18009), [FR](https://editions-rnti.fr/?inprocid=1002990)]
## Uses
### Direct Use
Clinical Mosaic can be used directly as a clinical language model for:
- Natural language inference and clinical reasoning tasks.
- Serving as a backbone for further fine-tuning on clinical NLP applications (e.g., clinical note summarization, diagnosis classification).
### Downstream Use
The model can be integrated into larger systems for tasks such as patient trajectory prediction or decision support, provided that additional fine-tuning and rigorous validation are performed.
### Out-of-Scope Use
- **Clinical Decision-Making:** The model is for research use only and should not be deployed for direct patient care without further validation and regulatory approval.
- **Medical Advice Generation:** It is not intended to replace expert clinical judgment.
## Bias, Risks, and Limitations
Clinical Mosaic was pre-trained on deidentified clinical notes from MIMIC-IV-NOTES 2.2—a dataset from a single U.S. institution. This may introduce biases related to local clinical practices and patient demographics. Although extensive care was taken to deidentify data and prevent PHI leakage, users must ensure that the model is not used to inadvertently reidentify sensitive information. When applying the model to new populations or clinical settings, performance may vary, so further fine-tuning and bias audits are recommended.
### Recommendations
- **Text Normalization Consistency:** Any input text must undergo the same preprocessing that was applied during training.
- **Evaluation:** Users should evaluate the model’s performance on their target population.
- **Fine-Tuning:** Consider additional fine-tuning or domain adaptation if deploying in a different clinical context.
- **Bias Audits:** Regularly perform bias audits to monitor for potential disparities in performance.
## How to Get Started with the Model
Install the Hugging Face Transformers library and load the model as follows:
### For embeddings generation:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Sifal/ClinicalMosaic", trust_remote_code=True)
ClinicalMosaic = AutoModel.from_pretrained("Sifal/ClinicalMosaic", trust_remote_code=True)
# Example usage
clinical_text = "..."
inputs = tokenizer(clinical_text, return_tensors="pt")
last_layer_embeddings = ClinicalMosaic(**inputs, output_all_encoded_layers=False)
```
### For sequence classification:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('Sifal/ClinicalMosaic')
ClassifierClinicalMosaic = AutoModelForSequenceClassification.from_pretrained(
'Sifal/ClinicalMosaic',
torch_dtype='auto',
trust_remote_code=True,
device_map="auto"
)
# Example usage
clinical_text = "..."
inputs = tokenizer(clinical_text, return_tensors="pt")
logits = ClassifierClinicalMosaic(**inputs).logits
```
Further instructions and example scripts are provided in the model’s repository.
## Training Details
### Training Data
- **Data Source:** 331,794 clinical notes from the MIMIC-IV-NOTES 2.2 dataset.
- **Preprocessing:**
  - Following (Alsentzer et al., 2019), clinical notes were preprocessed by unifying medical abbreviations (e.g., “hr”, “hrs” to “hours”), removing accents, converting special characters, and normalizing text to lowercase. These steps help mitigate variations caused by subword tokenizers; a rough sketch of these steps appears after this list.
- For additional details, please refer to the [PatientTrajectoryForecasting GitHub repository](https://github.com/MostHumble/PatientTrajectoryForecasting/).
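A rough sketch of the kind of normalization described above (illustrative only; the abbreviation map is a made-up subset, and the exact rules used for pretraining live in the linked repository):
```python
import re
import unicodedata

# Hypothetical, abbreviated mapping; the repository defines the full list.
ABBREVIATIONS = {r"\bhrs?\b": "hours"}

def normalize_note(text: str) -> str:
    # Strip accents / special characters, lowercase, expand abbreviations.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
    text = text.lower()
    for pattern, replacement in ABBREVIATIONS.items():
        text = re.sub(pattern, replacement, text)
    return re.sub(r"\s+", " ", text).strip()

print(normalize_note("Patient observed for 6 hrs; état stable."))
```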
### Training Procedure
- **Setup:** Distributed training using 7 NVIDIA A40 GPUs.
- **Hyperparameters:**
- **Effective Batch Size:** 224
- **Training Steps:** 80,000
- **Sequence Length:** 512 tokens
- **Optimizer:** ADAMW
- **Initial Learning Rate:** 5e-4
  - **Learning Rate Schedule:** Linear warmup for 33,000 steps, followed by cosine annealing for 46,000 steps (final LR 1e-5); see the sketch after this list
- **Masking Probability:** 30%
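A sketch of how the warmup-then-cosine schedule above could be reproduced in plain PyTorch (the actual run used MosaicML Composer, and the warmup start factor below is an illustrative assumption):
```python
import torch

model = torch.nn.Linear(10, 10)  # placeholder module for illustration
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4)

warmup = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=0.01, end_factor=1.0, total_iters=33_000
)
cosine = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=46_000, eta_min=1e-5
)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, cosine], milestones=[33_000]
)
```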
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
The model was evaluated on the MedNLI dataset, which comprises 14,049 clinical premise-hypothesis pairs derived from MIMIC-III notes.
#### Factors
Evaluation disaggregated performance across clinical language understanding, with a focus on natural language inference.
### Results
Clinical Mosaic outperformed comparable models:
- **BERT:** 77.6%
- **BioBERT:** 80.8%
- **Clinical Discharge BERT:** 84.1%
- **Bio+Clinical BERT:** 82.7%
- **Clinical Mosaic:** **86.5%**
These results indicate improved clinical reasoning and language understanding.
#### Summary
The model demonstrates robust performance on clinical natural language inference tasks and serves as a strong foundation for further clinical NLP applications.
## Environmental Impact
- **Hardware Type:** 7 NVIDIA A40 GPUs
- **Hours used:** 1008 (GPU hours)
- **Provider:** Private Infrastructure
- **Carbon Emitted:** 108.86 kg CO2 eq.
## Acknowledgments
The project leading to this publication has received funding from the Excellence Initiative of Aix Marseille Université - A*Midex, a French “Investissements d’Avenir programme” AMX-21-IET-017.
We would like to thank **LIS** | Laboratoire d'Informatique et Systèmes, Aix-Marseille University for providing the GPU resources necessary for pretraining and conducting extensive experiments. Additionally, we acknowledge **CEDRE** | CEntre de formation et de soutien aux Données de la REcherche, Programme 2 du projet France 2030 IDeAL for supporting early-stage experiments and hosting part of the computational infrastructure.
## Citation
**BibTeX:**
```bibtex
@misc{klioui2025patienttrajectorypredictionintegrating,
title={Patient Trajectory Prediction: Integrating Clinical Notes with Transformers},
author={Sifal Klioui and Sana Sellami and Youssef Trardi},
year={2025},
eprint={2502.18009},
archivePrefix={arXiv},
primaryClass={cs.LG},
url={https://arxiv.org/abs/2502.18009},
}
@article{RNTI/papers/1002990,
author = {Sifal Klioui and Sana Sellami and Youssef Trardi},
title = {Prédiction de la trajectoire du patient : Intégration des notes cliniques aux transformers},
journal = {Revue des Nouvelles Technologies de l'Information},
volume = {Extraction et Gestion des Connaissances, RNTI-E-41},
year = {2025},
pages = {135-146}
}
```
## More Information
For further details, please refer to the model’s repository and supplementary documentation.
## Model Card Contact
For questions or further information, please contact [[email protected]]. | [
"SUMMARIZATION"
] | [
"MEDNLI"
] |